here is a great article on how to build stuff in parallel,
it uses multiprocessing.dummy to run things in different threads
here is a little example:
    from urllib2 import urlopen
    from multiprocessing.dummy import Pool

    urls = [url_a, url_b, url_c]

    pool = Pool()
    res = pool.map(urlopen, urls)
    pool.close()
    pool.join()

for Python >= 3.2 I suggest concurrent.futures
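A self-contained, runnable Python 3 sketch of the same idea (multiprocessing.dummy.Pool is a thread pool with the regular Pool API; the toy `work` function here is just a stand-in for an I/O-bound call like `urlopen`):

```python
from multiprocessing.dummy import Pool  # thread-backed Pool, same API as multiprocessing.Pool

def work(n):
    # stand-in for I/O-bound work such as urlopen(url).read()
    return n * 2

pool = Pool(4)                      # 4 worker threads
res = pool.map(work, [1, 2, 3])     # results come back in input order
pool.close()
pool.join()
print(res)  # [2, 4, 6]
```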
example:
    import concurrent.futures
    import urllib.request

    URLS = ['http://www.foxnews.com/',
            'http://www.cnn.com/',
            'http://europe.wsj.com/',
            'http://www.bbc.co.uk/',
            'http://some-made-up-domain.com/']

    def load_url(url, timeout):
        return urllib.request.urlopen(url, timeout=timeout).read()

    with concurrent.futures.ThreadPoolExecutor(max_workers=50) as executor:
        future_list = [executor.submit(load_url, url, 30) for url in URLS]

example taken from: here
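If you only need the results back in input order, `ThreadPoolExecutor.map` behaves like `pool.map` above and saves you the futures bookkeeping. A minimal sketch, using a toy `square` function in place of real URL fetching:

```python
from concurrent.futures import ThreadPoolExecutor

def square(n):
    # stand-in for real work such as load_url above
    return n * n

with ThreadPoolExecutor(max_workers=4) as executor:
    # executor.map yields results in the same order as the inputs
    results = list(executor.map(square, [1, 2, 3, 4]))

print(results)  # [1, 4, 9, 16]
```

Use `submit` + `as_completed` instead when you want to handle each result as soon as it finishes, regardless of input order.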