
models.lsi_dispatcher – Dispatcher for distributed LSI

USAGE: %(program)s SIZE_OF_JOBS_QUEUE

Dispatcher process which orchestrates distributed LSI computations. Run this script only once, on any node in your cluster.

Example: python -m gensim.models.lsi_dispatcher
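
The dispatcher is normally not driven by hand; LsiModel looks it up over Pyro when distributed=True is passed. A minimal end-to-end sketch, assuming Pyro4 is installed and using placeholder corpus/dictionary paths:

    # Cluster setup (shell commands, shown here as comments):
    #   python -m Pyro4.naming -n 0.0.0.0 &         # one Pyro name server per cluster
    #   python -m gensim.models.lsi_worker &         # one or more workers, on any nodes
    #   python -m gensim.models.lsi_dispatcher &     # exactly one dispatcher

    # Client side: the model finds the dispatcher through the Pyro name server.
    from gensim import corpora, models

    dictionary = corpora.Dictionary.load('/tmp/corpus.dict')   # placeholder path
    corpus = corpora.MmCorpus('/tmp/corpus.mm')                 # placeholder path

    # distributed=True delegates the decomposition to the dispatcher and its workers
    lsi = models.LsiModel(corpus, id2word=dictionary, num_topics=200, distributed=True)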

class gensim.models.lsi_dispatcher.Dispatcher(maxsize=0)

Dispatcher object that communicates and coordinates individual workers.

There should never be more than one dispatcher running at any one time.

Note that the constructor does not fully initialize the dispatcher; use the initialize() function to populate it with workers etc.
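
Although the dispatcher is driven by LsiModel internally, a running instance can also be inspected over Pyro. A hedged sketch, assuming the dispatcher script registered itself under the name 'gensim.lsi_dispatcher' on your Pyro name server (adjust the name if your deployment differs):

    import Pyro4

    # Look up the running dispatcher via the Pyro name server.
    dispatcher = Pyro4.Proxy('PYRONAME:gensim.lsi_dispatcher')

    print(dispatcher.getworkers())   # Pyro URIs of all registered workers
    print(dispatcher.jobsdone())     # number of jobs processed so far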

exit()

Terminate all registered workers and then the dispatcher.

getstate()

Merge projections from across all workers and return the final projection.

getworkers()

Return the Pyro URIs of all registered workers.

initialize(**model_params)

model_params are the parameters used to initialize individual workers; they are passed all the way down to worker.initialize().

jobdone(*args, **kwargs)

A worker has finished its job. Log this event and then asynchronously transfer control back to the worker.

In this way, control flow basically oscillates between dispatcher.jobdone() and worker.requestjob().
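
A toy, single-process sketch of that oscillation (these are stand-in classes, not the gensim ones; in gensim the two objects live in separate processes, call each other through Pyro proxies, and jobdone() hands control back asynchronously):

    import queue

    class ToyDispatcher:
        def __init__(self, jobs):
            self.jobs = queue.Queue()
            for job in jobs:
                self.jobs.put(job)
            self._jobsdone = 0

        def getjob(self):
            return None if self.jobs.empty() else self.jobs.get()

        def jobdone(self, worker):
            self._jobsdone += 1
            worker.requestjob()            # hand control straight back to the worker

    class ToyWorker:
        def __init__(self, dispatcher):
            self.dispatcher = dispatcher

        def requestjob(self):
            job = self.dispatcher.getjob()
            if job is None:
                return                     # queue drained, stop
            # ... process the job, e.g. update the local projection ...
            self.dispatcher.jobdone(self)  # -> dispatcher calls requestjob() again

    dispatcher = ToyDispatcher(jobs=['chunk-1', 'chunk-2', 'chunk-3'])
    ToyWorker(dispatcher).requestjob()     # control now oscillates until no jobs remain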

jobsdone()

Wrap self._jobsdone; needed for remote access through Pyro proxies.

reset()

Initialize all workers for a new decomposition.