Here's a Python module in a Python package:
```python
import multiprocessing as mp

class Test(object):
    def __init__(self):
        self.dict = dict()

    def fill_dict(self):
        self.dict = {'testing': 123}
        print self.dict

if __name__ == "__main__":
    tests = [Test() for i in xrange(3)]
    jobs = [mp.Process(target=test.fill_dict, args=()) for test in tests]
    for job in jobs:
        job.start()
    for job in jobs:
        job.join()
    print "RESULTS:"
    for test in tests:
        print test.dict
```

I run the module and get the following output:
```
C:\path\to\package>python -m package.path.to.module
{'testing': 123}
{'testing': 123}
{'testing': 123}
RESULTS:
{}
{}
{}
```

From the printout, it seems that each test.dict was filled in parallel by multiprocessing. However, when I try to recover the results afterwards, the test.dicts are empty. Can someone explain why this is happening and what I can do to recover the nonempty test.dicts?
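One workaround I am considering (a minimal sketch only, written in Python 3 syntax for brevity) is to have fill_dict return the dict and collect the return values with multiprocessing.Pool, so each result is pickled back to the parent process and assigned there:

```python
import multiprocessing as mp

class Test(object):
    def __init__(self):
        self.dict = dict()

    def fill_dict(self):
        # Runs in the child process; return the dict so the parent
        # can collect it, instead of relying on the child's self.dict.
        self.dict = {'testing': 123}
        return self.dict

if __name__ == "__main__":
    tests = [Test() for i in range(3)]
    with mp.Pool(3) as pool:
        # apply_async pickles the bound method to the worker and
        # pickles the return value back to the parent.
        results = [pool.apply_async(t.fill_dict) for t in tests]
        for test, res in zip(tests, results):
            test.dict = res.get()  # assign the result in the parent process
    for test in tests:
        print(test.dict)           # each prints {'testing': 123}
```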
EDIT: @Valentin Lorentz, @Moinuddin Quadri, @zstewart
Even if I change the latter part of the module to
```python
if __name__ == "__main__":
    test0 = Test()
    test1 = Test()
    test2 = Test()
    jobs = list()
    jobs.append(mp.Process(target=test0.fill_dict, args=()))
    jobs.append(mp.Process(target=test1.fill_dict, args=()))
    jobs.append(mp.Process(target=test2.fill_dict, args=()))
    for job in jobs:
        job.start()
    for job in jobs:
        job.join()
    print "RESULTS:"
    print test0.dict
    print test1.dict
    print test2.dict
```

I get the same results. Each process is completely independent of the others (they don't share memory and don't need to). So when the answers speak of sharing memory between processes, does this mean between the main module and the multiprocessing processes?
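To illustrate what I mean, here is a minimal sketch (Python 3 syntax) of the difference as I understand it: a plain dict passed to a child is only mutated in the child's own copy, whereas a multiprocessing.Manager dict is a proxy whose writes are visible back in the parent:

```python
import multiprocessing as mp

def fill(d):
    # Mutates whatever mapping it is handed.
    d['testing'] = 123

if __name__ == "__main__":
    plain = {}
    p = mp.Process(target=fill, args=(plain,))
    p.start()
    p.join()
    print(plain)             # {} -- the child filled only its own copy

    manager = mp.Manager()
    shared = manager.dict()  # proxy; writes go through the manager process
    p = mp.Process(target=fill, args=(shared,))
    p.start()
    p.join()
    print(dict(shared))      # {'testing': 123} -- visible in the parent
```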