I have a large number of data files that need to be processed through a function A, say 1000 files. Processing one file takes less than 15 minutes and uses about 6 GB of memory. My machine has 32 GB of RAM and 8 CPUs, so to be safe I can run at most 4 processes at a time (24 GB of memory, 4 CPUs).

My question: can I use Python's multiprocessing package to create 4 processes, each of which repeatedly runs function A on a data file independently, like the figure below? Ideally each process would handle roughly 250 files, but since the 1000 files differ in size, the split will not be exactly even. Note that as soon as a process finishes a file, it should be assigned a new one immediately, regardless of whether the other processes are still running; there is no waiting for all four processes to finish together. The return value of function A is not important here.

Please provide code! Thanks for any suggestions.
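A sketch of one common answer to this kind of question, using `multiprocessing.Pool`: the pool keeps exactly 4 worker processes alive, and `imap_unordered` hands each worker a new file the moment it finishes its previous one, which is exactly the "no waiting" behavior described above. Here `process_file` is a hypothetical stand-in for the asker's function A, and the file names are made up for illustration.

```python
import multiprocessing as mp

def process_file(path):
    """Stand-in for function A; replace the body with the real per-file work."""
    # Each worker holds only one file's data at a time, so with 4 workers
    # at ~6 GB each, peak memory stays around 24 GB.
    return path, len(path)  # dummy result; the real return value is ignored

def process_all(paths, workers=4):
    # maxtasksperchild=1 starts a fresh worker process for every file,
    # so memory from one task is fully released before the next begins.
    with mp.Pool(processes=workers, maxtasksperchild=1) as pool:
        # imap_unordered yields results as they complete; a worker that
        # finishes early immediately picks up the next unprocessed file,
        # so uneven file sizes balance out automatically.
        return list(pool.imap_unordered(process_file, paths))

if __name__ == "__main__":
    files = [f"data_{i:04d}.dat" for i in range(20)]  # pretend file list
    results = process_all(files)
    print(f"processed {len(results)} files")
```

If per-task memory release is not a concern, dropping `maxtasksperchild=1` avoids the overhead of spawning a new process for every file; the scheduling behavior is the same either way.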