I am trying to use the Python multiprocessing library in order to parallelize a task I am working on:
```python
import multiprocessing as MP

def myFunction((x,y,z)):
    # ...create a sqlite3 database specific to x,y,z
    # ...write to the database (one DB per process)

y = 'somestring'
z = <large read-only global dictionary to be shared>

jobs = []
for x in X:
    jobs.append((x,y,z,))

pool = MP.Pool(processes=16)
pool.map(myFunction, jobs)
pool.close()
pool.join()
```

Sixteen processes are started, as seen in htop, yet no errors are returned, no files are written, and no CPU is used.
Could it be that an error in myFunction is not reported to STDOUT and silently blocks execution?
Perhaps it is relevant that the Python script is called from a bash script running in the background.
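On the question of swallowed errors: an exception raised inside a worker is normally pickled and re-raised in the parent when `pool.map` returns, so a plain error in `myFunction` should not fail silently. A minimal sketch to verify this (the `worker` function and its arguments are made up for illustration):

```python
import multiprocessing as MP

def worker(args):
    # Tuple is unpacked inside the function; the worker deliberately fails.
    x, y, z = args
    raise ValueError("boom in worker %d" % x)

if __name__ == '__main__':
    pool = MP.Pool(processes=2)
    try:
        # The worker's exception is transferred back and re-raised here.
        pool.map(worker, [(0, 'y', None), (1, 'y', None)])
    except ValueError as e:
        print("parent saw the worker error:", e)
    finally:
        pool.close()
        pool.join()
```

If even this does not print anything in your setup, the problem is more likely in how the arguments are pickled (e.g. the large shared dictionary) or in how the script is launched, not in error reporting itself.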
In `def myFunction((x,y,z)):`, are the extra parentheses on purpose? Also, try `import multiprocessing.dummy as MP` instead of `import multiprocessing as MP`: `dummy` uses threads instead of actual processes, which helps with debugging.
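The suggested swap could look like the sketch below. Note that the tuple-parameter syntax `def f((x, y, z))` is Python 2 only (a SyntaxError in Python 3), so this version unpacks the tuple inside the function; the job values here are placeholders:

```python
import multiprocessing.dummy as MP  # threads, not processes: errors surface normally

def myFunction(args):
    # Unpack the (x, y, z) tuple explicitly instead of in the signature.
    x, y, z = args
    return x * 2  # stand-in for the real sqlite3 work

# Placeholder jobs mirroring the (x, y, z) tuples from the question.
jobs = [(x, 'somestring', {}) for x in range(4)]

pool = MP.Pool(processes=4)
results = pool.map(myFunction, jobs)
pool.close()
pool.join()
print(results)  # any exception raised in myFunction is re-raised here
```

Because `multiprocessing.dummy` has the same API as `multiprocessing`, switching back after debugging is a one-line change to the import.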