
I'd like to run multiple processes concurrently, but with Process I can't limit how many run at a time, so my computer becomes unusable for anything else. In my problem I have to run main_function for every item in my_dataset. Here is a short sample of my code; is it possible to limit the number of processes running at a time?

from multiprocessing import Process

def my_function(my_dataset):
    processes = []
    for data in my_dataset:
        transformed_data = transform(data)
        p = Process(target=main_function, args=(data, transformed_data))
        p.start()
        processes.append(p)
    for p in processes:
        p.join()

1 Answer


You can use multiprocessing's Pool, which caps the number of worker processes for you: https://docs.python.org/3/library/multiprocessing.html#multiprocessing.pool.Pool

from multiprocessing import Pool

names = ["Joe", "James", "Jimmy"] * 10

def print_name(name):
    print(f"Got Name: {name}")

def runner():
    # Pool(4) runs at most 4 worker processes at a time.
    p = Pool(4)
    p.map(print_name, names)

if __name__ == "__main__":
    runner()
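Applied to the question's own code, a minimal sketch might look like the following. Since the question's main_function takes two arguments, Pool.starmap can unpack (data, transformed_data) tuples; transform and main_function below are hypothetical stand-ins for the asker's real functions.

```python
from multiprocessing import Pool

def transform(data):
    # Stand-in for the transform() from the question.
    return data * 2

def main_function(data, transformed_data):
    # Stand-in for the question's main_function; returns a value so
    # results can be collected from the pool.
    return data + transformed_data

def my_function(my_dataset):
    # Pool(4) caps the worker count at 4; starmap unpacks each
    # (data, transformed_data) tuple into main_function's two arguments.
    args = [(data, transform(data)) for data in my_dataset]
    with Pool(4) as pool:
        return pool.starmap(main_function, args)

if __name__ == "__main__":
    print(my_function([1, 2, 3]))
```

The with-statement closes the pool and joins its workers automatically, replacing the manual start/join loops from the question.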

3 Comments

Add if __name__ == "__main__": before runner(), or else the asker's PC will crash.
Depends where you're running it; I ran it in a Jupyter notebook, but I'll add it for clarity.
Just for completeness: the check is needed on Windows and macOS.
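For the same effect with the standard library's higher-level API, a sketch using concurrent.futures.ProcessPoolExecutor (an assumption, not from the answer above) where max_workers plays the role of Pool(4):

```python
from concurrent.futures import ProcessPoolExecutor

def square(x):
    # Trivial example task run in the worker processes.
    return x * x

def runner():
    # max_workers caps how many processes run at once, like Pool(4).
    with ProcessPoolExecutor(max_workers=4) as executor:
        return list(executor.map(square, range(10)))

if __name__ == "__main__":
    print(runner())
```

The same __main__ guard applies here on platforms that spawn rather than fork new processes.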
