
Share the Same multiprocessing.Pool Object Between Different Python Instances

In Python 3 I need to have a Pool of processes to which I can asynchronously submit multiple workers. The problem is that I need to 'send' workers to the Pool from a series of separate Python scripts (i.e. different interpreter instances).

Solution 1:

In the end I was able to code a working basic example using Python 3's multiprocessing.managers.BaseManager. See the docs for details.

In a script called server.py:

import multiprocessing
from multiprocessing.managers import BaseManager

if __name__ == '__main__':
    # One shared queue, exposed to every client under the name 'JobsQueue'
    jobs = multiprocessing.Manager().Queue()
    BaseManager.register('JobsQueue', callable=lambda: jobs)
    m = BaseManager(address=('localhost', 55555), authkey=b'myauthkey')
    s = m.get_server()
    s.serve_forever()
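Note that get_server().serve_forever() blocks: server.py does nothing except host the shared queue and hand out proxies to it. The actual job processing has to happen in another process that connects to this manager, as sketched after the client example below.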

Then, in one or more client.py scripts:

from multiprocessing.managers import BaseManager

BaseManager.register('JobsQueue')  # note the difference from the server: no callable here!
m = BaseManager(address=('localhost', 55555), authkey=b'myauthkey')  # use the same authkey! It may work remotely too...
m.connect()

# Then you can put data in the queue
q = m.JobsQueue()
q.put("MY DATA HERE")
# or also read it back
data = q.get()
# etc etc...
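The q returned by m.JobsQueue() is a proxy object: put() and get() calls are forwarded over the connection to the single queue living in the server process, which is what lets completely separate interpreter instances share it.

To tie this back to the Pool from the question, here is a minimal sketch (not part of the original answer) of a consumer script that connects like any other client, drains the shared queue, and hands each job to a multiprocessing.Pool. The consumer.py name, the work function, and the None shutdown sentinel are illustrative assumptions:

# consumer.py -- hypothetical: drains the shared queue into a Pool
import multiprocessing
from multiprocessing.managers import BaseManager

def work(item):
    # Stand-in for the real job logic
    print("processing", item)

if __name__ == '__main__':
    BaseManager.register('JobsQueue')
    m = BaseManager(address=('localhost', 55555), authkey=b'myauthkey')
    m.connect()
    q = m.JobsQueue()

    pool = multiprocessing.Pool()
    while True:
        item = q.get()       # blocks until some client put()s a job
        if item is None:     # convention: a client sends None to stop us
            break
        pool.apply_async(work, (item,))
    pool.close()
    pool.join()

With this setup, any client can stop the consumer by calling q.put(None).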

Obviously this is a basic example, but I think it lets you do a lot of complex work without using external libraries.

A lot of people today reach for a ready-to-use, often heavyweight, library or piece of software without understanding the basics. I'm not one of them...

Cheers
