abarnert
  • 367.8k
  • 54
  • 626
  • 691

Like urllib2, requests is blocking.

It does have some async functionality, but that's not what I'd use here. And I wouldn't suggest using another library, either.

The simplest answer is to run each request in a separate thread. Unless you have hundreds of them, this should be fine. (How many hundreds is too many depends on your platform. On Windows, the limit is probably how much memory you have for thread stacks; on most other platforms the cutoff comes earlier.)
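A minimal sketch of the one-thread-per-request pattern. Since no network is assumed here, `fetch` is a stand-in for `requests.get`; in real code you'd call `requests.get(url)` in its place:

```python
import threading
import time

def fetch(url):
    # Stand-in for requests.get(url) -- sleeps to simulate a
    # blocking network call. Swap in requests.get in real code.
    time.sleep(0.1)
    return f"response for {url}"

urls = [f"http://example.com/{i}" for i in range(10)]
results = {}

def worker(url):
    # Per-key dict writes are safe under CPython's GIL.
    results[url] = fetch(url)

threads = [threading.Thread(target=worker, args=(url,)) for url in urls]
start = time.monotonic()
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.monotonic() - start
# The ten 0.1 s "requests" overlap, so the wall time is roughly
# 0.1 s rather than the 1 s a sequential loop would take.
```

Each thread blocks on its own request, so the waits happen in parallel even though each individual call is still blocking.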

If you do have hundreds, you can put them in a threadpool. The ThreadPoolExecutor example in the concurrent.futures docs is almost exactly what you need; just change the urllib calls to requests calls. (If you're on 2.x, use futures, the backport of the same package on PyPI.) The downside is that you don't actually kick off all 1000 requests at once, just the first, say, 8.
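The docs example, adapted as described above. Again `fetch` is a stand-in for a `requests.get(url).text` call, since no network is assumed:

```python
import concurrent.futures
import time

def fetch(url):
    # Stand-in for requests.get(url).text; sleeps to mimic I/O.
    time.sleep(0.05)
    return f"body of {url}"

urls = [f"http://example.com/{i}" for i in range(16)]
pages = {}

# At most 8 requests are in flight at once; the rest queue up
# and wait for a free worker.
with concurrent.futures.ThreadPoolExecutor(max_workers=8) as executor:
    future_to_url = {executor.submit(fetch, url): url for url in urls}
    for future in concurrent.futures.as_completed(future_to_url):
        url = future_to_url[future]
        pages[url] = future.result()
```

`as_completed` yields each future as it finishes, so you can process responses in completion order rather than submission order.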

If you have hundreds, and they all need to be in parallel, this sounds like a job for gevent. Have it monkeypatch everything, then write the exact same code you'd write with threads, but spawning greenlets instead of Threads.
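A sketch of the gevent version, assuming gevent is installed. After `patch_all`, blocking calls (including the sockets requests uses, and `time.sleep` here, which stands in for `requests.get`) yield to other greenlets instead of blocking the whole process:

```python
import gevent.monkey
gevent.monkey.patch_all()  # must run before anything else touches sockets

import time
import gevent

def fetch(url):
    # Stand-in for requests.get(url). After monkeypatching, the
    # patched time.sleep yields to the gevent hub, just as requests'
    # socket reads would.
    time.sleep(0.1)
    return f"response for {url}"

urls = [f"http://example.com/{i}" for i in range(100)]

start = time.monotonic()
# Same shape as the threaded code, but greenlets are cheap enough
# to spawn one per request, even for hundreds of URLs.
jobs = [gevent.spawn(fetch, url) for url in urls]
gevent.joinall(jobs)
elapsed = time.monotonic() - start

results = [job.value for job in jobs]
# All 100 greenlets wait concurrently, so this takes ~0.1 s total.
```

The code is structurally identical to the threaded version; only `threading.Thread` becomes `gevent.spawn`.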
