I am writing a web-scraping application in Python. The website I am scraping has URLs of the form www.someurl.com/getPage?id=x, where x is a number identifying the page. Currently I download all the pages with urlretrieve.
Here is the basic form of my script:
    from urllib.request import urlretrieve

    for i in range(1, 1001):
        urlretrieve('http://someurl.com/getPage?id=' + str(i), str(i) + '.html')

Now, my question: is it possible to download the pages simultaneously? As written, the script blocks and waits for each page to finish downloading before starting the next. Can I ask Python to open more than one connection to the server?
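One possible approach (an assumption on my part, not something the script above already does) is the standard-library `concurrent.futures.ThreadPoolExecutor`, which runs the blocking downloads in a pool of threads so the network waits overlap. In this sketch the real `urlretrieve` call is stubbed out with `time.sleep` so the example runs without network access; in the real script, `fetch()` would call `urlretrieve('http://someurl.com/getPage?id=' + str(i), str(i) + '.html')` instead.

    import time
    from concurrent.futures import ThreadPoolExecutor

    def fetch(i):
        # Stand-in for the blocking download: each worker thread sleeps
        # instead of hitting the network, so the sketch runs offline.
        time.sleep(0.1)
        return i

    start = time.time()
    with ThreadPoolExecutor(max_workers=10) as pool:
        # map() submits all tasks and yields results in order.
        results = list(pool.map(fetch, range(1, 21)))
    elapsed = time.time() - start

    # 20 tasks at 0.1 s each would take about 2 s sequentially; with 10
    # worker threads the waits overlap and the batch finishes much sooner.
    print(results)
    print('elapsed: %.2f s' % elapsed)

Since the work here is I/O-bound (waiting on the server), threads are enough and the GIL is not a bottleneck; `max_workers` caps how many simultaneous connections the server sees.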