
models.lda_dispatcher – Dispatcher for distributed LDA

USAGE: %(program)s SIZE_OF_JOBS_QUEUE

Dispatcher process which orchestrates distributed LDA computations. Run this script only once, on any node in your cluster.

Example: python -m gensim.models.lda_dispatcher
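For context, here is a hedged end-to-end sketch of a distributed LDA session. The name-server invocation and the toy corpus are illustrative assumptions (gensim's distributed setup uses Pyro4); consult the gensim distributed-computing tutorial for the authoritative steps.

# 1. On any node, start a Pyro4 name server (assumed reachable cluster-wide):
#        python -m Pyro4.naming -n 0.0.0.0
# 2. On each compute node, start one or more workers:
#        python -m gensim.models.lda_worker
# 3. On any single node, start this dispatcher:
#        python -m gensim.models.lda_dispatcher
# 4. Finally, from the client process, train with distributed=True:
from gensim.corpora import Dictionary
from gensim.models import LdaModel

documents = [["human", "interface", "computer"], ["graph", "trees", "minors"]]
dictionary = Dictionary(documents)
corpus = [dictionary.doc2bow(doc) for doc in documents]

# distributed=True makes LdaModel locate the dispatcher through the Pyro
# name server and spread the EM updates over the registered workers.
lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=2, distributed=True)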

class gensim.models.lda_dispatcher.Dispatcher(maxsize=10)

Dispatcher object that communicates and coordinates individual workers.

There should never be more than one dispatcher running at any one time.

Note that the constructor does not fully initialize the dispatcher; call initialize() to populate it with workers, etc.

exit()

Terminate all registered workers and then the dispatcher.

getstate()

Merge states from across all workers and return the result.

getworkers()

Return Pyro URIs of all registered workers.
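A hedged sketch of querying a running dispatcher through a Pyro4 proxy; the registered name gensim.lda_dispatcher is an assumption based on gensim's default distributed setup:

import Pyro4

# Look up the dispatcher in the Pyro4 name server (registered name assumed).
dispatcher = Pyro4.Proxy("PYRONAME:gensim.lda_dispatcher")
for uri in dispatcher.getworkers():
    print(uri)  # one Pyro URI per registered worker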

initialize(**model_params)

model_params are the parameters used to initialize individual workers; they are handed all the way down to worker.initialize().
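Normally LdaModel(distributed=True) calls initialize() on your behalf; the manual call below is only a sketch, and the keyword arguments shown (id2word, num_topics) are assumptions about what eventually reaches worker.initialize():

import Pyro4
from gensim.corpora import Dictionary

dictionary = Dictionary([["human", "interface"], ["graph", "minors"]])

# Forward model parameters to the dispatcher, which passes them down to
# every registered worker; each worker builds its own local model from them.
dispatcher = Pyro4.Proxy("PYRONAME:gensim.lda_dispatcher")
dispatcher.initialize(id2word=dictionary, num_topics=2)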

jobdone(*args, **kwargs)

A worker has finished its job. Log this event and then asynchronously transfer control back to the worker.

In this way, control flow oscillates between dispatcher.jobdone() and worker.requestjob().
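To make that oscillation concrete, here is a toy, in-process imitation of the control flow (no Pyro, all names hypothetical); in the real system the two objects live in separate processes and the hand-off is asynchronous:

import queue

class ToyDispatcher:
    """Queues document chunks and hands them out one at a time."""
    def __init__(self, jobs):
        self.jobs = queue.Queue()
        for job in jobs:
            self.jobs.put(job)

    def jobdone(self, worker):
        # A worker reported completion; hand control straight back to it.
        worker.requestjob()

class ToyWorker:
    def __init__(self, name, dispatcher):
        self.name = name
        self.dispatcher = dispatcher

    def requestjob(self):
        try:
            job = self.dispatcher.jobs.get_nowait()
        except queue.Empty:
            return  # nothing left to do; the worker goes idle
        print(self.name, "processing", job)
        self.dispatcher.jobdone(self)  # ...which calls requestjob() again

dispatcher = ToyDispatcher(["chunk-0", "chunk-1", "chunk-2"])
ToyWorker("worker-1", dispatcher).requestjob()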

jobsdone()

A wrapper around self._jobsdone, needed for remote access through Pyro proxies.

reset(state)

Reinitialize all workers for a new EM iteration.