On Windows, there has to be a check whether the process is the main process before multiprocessing can be used, otherwise there will be an infinite loop (each spawned child re-imports the main module and would spawn children of its own).
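For illustration, this is the kind of guard I mean (a minimal made-up sketch, not my real code):

    import multiprocessing

    def worker(n):
        # this runs in the child process
        print n * 2

    if __name__ == '__main__':
        # without this guard, Windows would re-run the spawning code in every
        # child process and keep creating new processes forever
        p = multiprocessing.Process(target=worker, args=(21,))
        p.start()
        p.join()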
I tried changing the name of the process to the name of the subprocess so that I could use multiprocessing from within a class or function that I call, but no luck. Is this even possible? So far I have failed to use multiprocessing unless it was started from the main process.
If it is possible, could someone provide an example of how to use multiprocessing within a class or function that is being called from a higher-level process? Thanks.
Edit:
Here is an example - the first one works, but everything is done in one file: simplemtexample3.py:
    import random
    import multiprocessing
    import math

    def mp_factorizer(nums, nprocs):
        # protect the process
        #print __name__
        if __name__ == '__main__':
            out_q = multiprocessing.Queue()
            chunksize = int(math.ceil(len(nums) / float(nprocs)))
            procs = []
            for i in range(nprocs):
                p = multiprocessing.Process(
                        target=worker,
                        args=(nums[chunksize * i:chunksize * (i + 1)], out_q))
                procs.append(p)
                p.start()

            # Collect all results into a single result dict. We know how many dicts
            # with results to expect.
            resultlist = []
            for i in range(nprocs):
                temp = out_q.get()
                index = 0
                #print temp
                for i in temp:
                    resultlist.append(temp[index][0][0:])
                    index += 1

            # Wait for all worker processes to finish
            for p in procs:
                p.join()

            resultlist2 = [x for x in resultlist if x != []]
            return resultlist2

    def worker(nums, out_q):
        """ The worker function, invoked in a process. 'nums' is a
            list of numbers to factor. The results are placed in
            a dictionary that's pushed to a queue.
        """
        outlist = []
        for n in nums:
            newnumber = n * 2
            newnumberasstring = str(newnumber)
            if newnumber:
                outlist.append(newnumberasstring)
        out_q.put(outlist)

    l = []
    for i in range(80):
        l.append(random.randint(1, 8))

    print mp_factorizer(l, 4)

However, when I try to call mp_factorizer from another file, it does not work because of the if __name__ == '__main__': check. A small sketch of what I think is happening is at the end of this post.
simplemtexample.py
    import random
    import multiprocessing
    import math

    def mp_factorizer(nums, nprocs):
        # protect the process
        #print __name__
        if __name__ == '__main__':
            out_q = multiprocessing.Queue()
            chunksize = int(math.ceil(len(nums) / float(nprocs)))
            procs = []
            for i in range(nprocs):
                p = multiprocessing.Process(
                        target=worker,
                        args=(nums[chunksize * i:chunksize * (i + 1)], out_q))
                procs.append(p)
                p.start()

            # Collect all results into a single result dict. We know how many dicts
            # with results to expect.
            resultlist = []
            for i in range(nprocs):
                temp = out_q.get()
                index = 0
                #print temp
                for i in temp:
                    resultlist.append(temp[index][0][0:])
                    index += 1

            # Wait for all worker processes to finish
            for p in procs:
                p.join()

            resultlist2 = [x for x in resultlist if x != []]
            return resultlist2

    def worker(nums, out_q):
        """ The worker function, invoked in a process. 'nums' is a
            list of numbers to factor. The results are placed in
            a dictionary that's pushed to a queue.
        """
        outlist = []
        for n in nums:
            newnumber = n * 2
            newnumberasstring = str(newnumber)
            if newnumber:
                outlist.append(newnumberasstring)
        out_q.put(outlist)

startsimplemtexample.py
    import simplemtexample as smt
    import random

    l = []
    for i in range(80):
        l.append(random.randint(1, 8))

    print smt.mp_factorizer(l, 4)
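As far as I can tell, the guarded block is skipped because __name__ inside simplemtexample is the module's own name rather than '__main__' when the file is imported. A tiny made-up sketch of what I mean (the file name is invented):

    # whatsmyname.py
    print __name__
    # run directly ("python whatsmyname.py") this prints:  __main__
    # imported from another file ("import whatsmyname") it prints:  whatsmyname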