
Here's a Python module in a Python package:

```python
import multiprocessing as mp

class Test(object):
    def __init__(self):
        self.dict = dict()

    def fill_dict(self):
        self.dict = {'testing': 123}
        print self.dict

if __name__ == "__main__":
    tests = [Test() for i in xrange(3)]
    jobs = [mp.Process(target=test.fill_dict, args=()) for test in tests]
    for job in jobs:
        job.start()
    for job in jobs:
        job.join()
    print "RESULTS:"
    for test in tests:
        print test.dict
```

I run the module and get the results as follows:

```
C:\path\to\package>python -m package.path.to.module
{'testing': 123}
{'testing': 123}
{'testing': 123}
RESULTS:
{}
{}
{}
```

From the printout, it seems that each test.dict was filled in parallel by multiprocessing. However, when I try to recover the results, the test.dicts seem to be empty. Can someone explain why this is happening and what I can do to recover the nonempty test.dicts?

EDIT: @Valentin Lorentz, @Moinuddin Quadri, @zstewart

Even if I change the latter part of the module to

```python
if __name__ == "__main__":
    test0 = Test()
    test1 = Test()
    test2 = Test()
    jobs = list()
    jobs.append(mp.Process(target=test0.fill_dict, args=()))
    jobs.append(mp.Process(target=test1.fill_dict, args=()))
    jobs.append(mp.Process(target=test2.fill_dict, args=()))
    for job in jobs:
        job.start()
    for job in jobs:
        job.join()
    print "RESULTS:"
    print test0.dict
    print test1.dict
    print test2.dict
```

I get the same results. Each process is completely independent of the others (they don't share memory and don't need to). So when answers speak of sharing memory between processes, does this mean sharing between the main process and the worker processes spawned by multiprocessing?

Comments:

  • Multiprocessing launches separate Python processes. You have to use a multiprocessing.Queue to send data between the processes. Commented Dec 18, 2016 at 17:31

2 Answers


You cannot transmit data between processes through ordinary built-in types; each process mutates its own copy. You need to use multiprocessing.Queue or multiprocessing.Pipe to do that. Check: Exchanging objects between processes.
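As a sketch of the Queue approach (in Python 3 syntax, since the original `print`/`xrange` code is Python 2; the `queue` parameter added to `fill_dict` is not in the original class):

```python
import multiprocessing as mp

class Test(object):
    def __init__(self):
        self.dict = dict()

    def fill_dict(self, queue):
        # The child process fills its own copy of self.dict...
        self.dict = {'testing': 123}
        # ...and ships the result back to the parent through the queue.
        queue.put(self.dict)

def run_tests(n=3):
    tests = [Test() for _ in range(n)]
    queue = mp.Queue()
    jobs = [mp.Process(target=t.fill_dict, args=(queue,)) for t in tests]
    for job in jobs:
        job.start()
    # Drain one result per job BEFORE joining: join() can deadlock if a
    # child is still blocked trying to flush a full queue.
    results = [queue.get() for _ in jobs]
    for job in jobs:
        job.join()
    return results

if __name__ == "__main__":
    print(run_tests())
```

The parent's `tests` objects still end up with empty dicts; the filled dicts come back only through the queue.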

You may also refer to: Running multiple asynchronous functions and getting the returned value of each function. It has an example of how to use Queue() instead of a list.




Processes do not share memory by default. That's on purpose, because sharing memory between concurrent threads of execution is a very complicated topic (synchronization, communication, ...).

The most direct solution to your problem is to explicitly share this object: https://docs.python.org/3/library/multiprocessing.html#sharing-state-between-processes
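One shared-state option from that page is a Manager dict, which all processes write to through a proxy. A minimal sketch (the `fill_shared` helper and keys are illustrative, not from the question):

```python
import multiprocessing as mp

def fill_shared(d, key):
    # Writes go through the manager's proxy, so the parent sees them.
    d[key] = 123

def demo():
    with mp.Manager() as manager:
        shared = manager.dict()
        jobs = [mp.Process(target=fill_shared, args=(shared, i)) for i in range(3)]
        for job in jobs:
            job.start()
        for job in jobs:
            job.join()
        # Copy out of the proxy before the manager shuts down.
        return dict(shared)

if __name__ == "__main__":
    print(demo())
```

Note that every access to `shared` is a round trip to the manager process, so this is convenient but not fast.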

However, a cleaner way is to make your function return a result, and use multiprocessing.Pool.map to call the function in multiple processes.
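That cleaner, return-a-result style could look like this (a sketch in Python 3; `fill_dict` here is a plain function rather than the question's method):

```python
import multiprocessing as mp

def fill_dict(_):
    # Each worker builds and RETURNS its result instead of
    # mutating state that the parent never sees.
    return {'testing': 123}

def run_pool(n=3):
    with mp.Pool(processes=n) as pool:
        # map() pickles each return value and sends it back to the parent.
        return pool.map(fill_dict, range(n))

if __name__ == "__main__":
    print(run_pool())
```

`pool.map` handles the queue plumbing for you, which is why it is usually preferable to hand-rolled Process/Queue code.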

