Timeline for Python Urllib UrlOpen Read

Current License: CC BY-SA 3.0

6 events
Sep 12, 2013 at 21:32 comment added B.Mr.W. I did a small experiment and recorded the total time to scrape 100 URLs with different numbers of threads. The result is pretty interesting; I will try the multiprocessing library sometime and update my post. Thanks a lot for your explanation.
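The experiment described in this comment can be sketched roughly as below. The fetch is simulated with `time.sleep` so the script is self-contained and deterministic; in the real experiment it would call `urllib.request.urlopen(url).read()`, and the URL list, task count, and sleep duration here are placeholder choices, not the commenter's actual setup.

```python
# Sketch: measure total time to "scrape" a batch of URLs with
# different thread-pool sizes. Network I/O is simulated with sleep.
import time
from concurrent.futures import ThreadPoolExecutor

URLS = [f"http://example.com/page/{i}" for i in range(20)]  # placeholder URLs

def fetch(url):
    time.sleep(0.02)  # stand-in for the latency of urlopen(url).read()
    return url

def scrape_all(urls, n_threads):
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        results = list(pool.map(fetch, urls))
    return time.perf_counter() - start, results

if __name__ == "__main__":
    for n in (1, 5, 20):
        elapsed, _ = scrape_all(URLS, n)
        print(f"{n:2d} threads: {elapsed:.2f}s")
```

Because the work is pure waiting, the total time shrinks roughly in proportion to the pool size, which matches the kind of result the comment reports.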
Sep 12, 2013 at 20:51 vote accept B.Mr.W.
Sep 12, 2013 at 20:48 history edited miku CC BY-SA 3.0
added 191 characters in body
Sep 12, 2013 at 20:43 comment added miku In IO-bound cases, you can use both. For CPU-bound tasks, multiprocessing will utilize all available cores, while threading will run on a single core due to the GIL.
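The distinction in this comment can be demonstrated with a small sketch: for a CPU-bound task, Python threads share one GIL and execute bytecode on one core at a time, while processes sidestep it. The pool size and workload below are arbitrary demo values, not taken from the original discussion.

```python
# Sketch: time the same CPU-bound work under a thread pool (serialized
# by the GIL) and a process pool (true parallelism across cores).
import time
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def cpu_task(n):
    # pure-Python arithmetic: holds the GIL for its whole duration
    return sum(i * i for i in range(n))

def timed(pool_cls, workers, n):
    start = time.perf_counter()
    with pool_cls(max_workers=workers) as pool:
        list(pool.map(cpu_task, [n] * workers))
    return time.perf_counter() - start

if __name__ == "__main__":
    N = 2_000_000
    print(f"threads:   {timed(ThreadPoolExecutor, 4, N):.2f}s")
    print(f"processes: {timed(ProcessPoolExecutor, 4, N):.2f}s")
```

On a multi-core machine the process-pool run typically finishes several times faster for this workload, while for IO-bound work (as in the scraping question) threads are usually sufficient since the GIL is released during blocking IO.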
Sep 12, 2013 at 20:37 comment added B.Mr.W. Actually, it all points to the same server; I am not quite sure what the true difference is between the threading and multiprocessing packages in this case. "The multiprocessing package offers both local and remote concurrency, effectively side-stepping the Global Interpreter Lock by using subprocesses instead of threads. Due to this, the multiprocessing module allows the programmer to fully leverage multiple processors on a given machine". Does that mean Python is actually only using one processor, or ...
Sep 12, 2013 at 20:31 history answered miku CC BY-SA 3.0