
  • See the accepted answer here: stackoverflow.com/questions/19102966/… Commented Feb 9, 2017 at 15:36
  • 2
    @AustinWagner By default, yes. HTTP throttling is part of the HTTP specification, so technically disabling (or relaxing) it is violating the specification. That said, we're living in different times - multiple concurrent requests aren't as bad as when HTTP was first designed. In any case, if you expect to get (significantly) rate limited, you might want to implement your own throttling as well anyway - otherwise you're just wasting a bunch of memory doing things in parallel when you could stream them instead - assuming you don't need all the responses at the same time, of course. Commented Feb 9, 2017 at 15:42
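The "implement your own throttling" suggestion in the comment above can be sketched with a `SemaphoreSlim` gating the number of in-flight requests. This is a minimal sketch, not from the thread itself; the URL list, the concurrency level, and the `DownloadAllAsync` name are assumptions for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class ThrottledDownloader
{
    static readonly HttpClient client = new HttpClient();

    // Download all URLs, but allow at most maxConcurrency requests
    // in flight at once (illustrative helper, not from the thread).
    static async Task<string[]> DownloadAllAsync(
        IEnumerable<string> urls, int maxConcurrency)
    {
        using (var gate = new SemaphoreSlim(maxConcurrency))
        {
            var tasks = urls.Select(async url =>
            {
                await gate.WaitAsync();   // wait here until a slot frees up
                try
                {
                    return await client.GetStringAsync(url);
                }
                finally
                {
                    gate.Release();       // free the slot for the next request
                }
            });
            return await Task.WhenAll(tasks);
        }
    }
}
```

Gating with a semaphore means requests are started only as slots free up, rather than all being queued (and buffered) at once, which is the memory point the comment makes.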
  • 6
    This worked, but with two caveats: 1. The connection limit @Luaan mentioned (worked around with ServicePointManager.DefaultConnectionLimit = 8) 2. The HttpClient timeout seems to start from when the request is queued, not when it's sent to the server. Though not optimal, I worked around this by just setting a long timeout. With this solution in place I now have an ~50 minute download process completing in ~5 minutes. Commented Feb 9, 2017 at 17:12
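The two workarounds described in the comment above can be sketched together (classic .NET Framework, where `ServicePointManager` applies; the limit of 8 comes from the comment, while the 30-minute figure is an assumed placeholder for "a long timeout"):

```csharp
using System;
using System.Net;
using System.Net.Http;

class ClientSetup
{
    static HttpClient CreateClient()
    {
        // Caveat 1: lift the default per-host connection limit
        // (2 by default on .NET Framework), as in the comment.
        ServicePointManager.DefaultConnectionLimit = 8;

        // Caveat 2: HttpClient.Timeout is measured from when the request
        // is queued, not when it goes on the wire, so use a generous value
        // to keep queued requests from timing out while they wait.
        return new HttpClient { Timeout = TimeSpan.FromMinutes(30) };
    }
}
```

Note that `DefaultConnectionLimit` must be set before the first request to a given host; on .NET Core and later, the per-host limit is configured on the handler (`SocketsHttpHandler.MaxConnectionsPerServer`) instead.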