
I have extensive experience with PHP cURL, but for the last few months I've been coding primarily in Java, using the HttpClient library.

My new project requires me to use Python, once again putting me at the crossroads of seemingly comparable libraries: pycurl and urllib2.

Putting aside my previous experience with PHP cURL, what is the recommended library in Python? Is there a reason to use one rather than the other? Which is the more popular option?

  • Always go with the standard library if you can get away with it. Less hassle and increased portability!
  • See the short and clear explanation at stackoverflow.com/questions/2385855/…

4 Answers


cURL has a lot more features, as stated on its web page, so if you need, say, fast concurrent connections, safe threading, etc., then it's for you. However, it's not included in the standard distribution. If you foresee that your task is very simple, then use urllib2 and the other HTTP modules that come with the distribution.
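As a rough sketch of what a basic pycurl fetch looks like (Python 2 era code to match urllib2; the URL is just a placeholder):

    import pycurl
    from StringIO import StringIO

    # Collect the response body in an in-memory buffer.
    buf = StringIO()
    c = pycurl.Curl()
    c.setopt(pycurl.URL, "http://example.com/")
    c.setopt(pycurl.WRITEFUNCTION, buf.write)
    c.perform()
    c.close()

    print buf.getvalue()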




urllib2 is part of the standard library, pycurl isn't (so it requires a separate download/install/package step). That alone, quite apart from any difference in intrinsic quality, is guaranteed to make urllib2 more popular (and can be a pretty good pragmatic reason to pick it -- convenience!-).
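For comparison, a minimal standard-library fetch looks something like this (placeholder URL):

    import urllib2

    # No third-party install needed -- urllib2 ships with Python 2.
    response = urllib2.urlopen("http://example.com/")
    html = response.read()
    print html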


  • Fortunately getting pycurl installed was as simple as: sudo apt-get install pycurl :)

Nowadays there are other excellent alternatives: urllib3 and requests.
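A minimal sketch using requests (a third-party package installed separately; the URL is a placeholder):

    import requests

    # requests builds on urllib3 and handles connection pooling for you.
    response = requests.get("http://example.com/")
    print response.status_code
    print response.text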



Use urllib2. It has very good Python documentation, while pycurl's documentation is mostly for the underlying C library. If you hit a wall, switch to mechanize or pycurl.
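If you do end up reaching for mechanize, a basic fetch is roughly as follows (third-party package; placeholder URL):

    import mechanize

    # mechanize provides a browser-like interface on top of urllib2.
    br = mechanize.Browser()
    response = br.open("http://example.com/")
    print response.read()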

