  • One should consume HTTP response messages as a content stream without buffering the entire response body in memory. Then what should I do? Use entity.getContent? Thanks. Commented Mar 1, 2012 at 0:57
  • I doubt that answer. The URLs are all different; there are millions of them. Every thread opens a different URL. If URL A denies me, why would URL B deny me too? And I used IE and Firefox to open one of the URLs through the same proxy at the same time, and it succeeded. So I think the logic is right. Maybe there is something I should do to clean up resources after opening each URL. Commented Mar 1, 2012 at 1:22
  • @Rusty: yes, you should be using the InputStream returned by HttpEntity#getContent and reading only enough data to get the work done (see the sketch after this thread). Commented Mar 1, 2012 at 10:28
  • @Rusty: I suspect if you open ten instances of IE and script them to execute 1000 requests in a tight loop, you will start seeing 503 as well. I double-checked your code snippet (1) and could not spot any issues with resource deallocation. Commented Mar 1, 2012 at 10:31
  • Thanks, pal. I finally figured it out: I am being denied. To avoid being denied, I have to use the second approach: create a new HttpClient in every loop iteration of every thread. And Java's gc() is too slow, so I think a sleep may be a good idea, or I can restart the program with a flag to indicate progress. Thanks anyway. Commented Mar 2, 2012 at 0:58
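
To make the streaming advice above concrete, here is a minimal sketch of reading a response through HttpEntity#getContent and then releasing the connection. It assumes Apache HttpClient 4.x (DefaultHttpClient, which was current at the time); the URL, buffer size, and class name are placeholders, not taken from the original code.

    import java.io.InputStream;

    import org.apache.http.HttpEntity;
    import org.apache.http.HttpResponse;
    import org.apache.http.client.HttpClient;
    import org.apache.http.client.methods.HttpGet;
    import org.apache.http.impl.client.DefaultHttpClient;
    import org.apache.http.util.EntityUtils;

    public class StreamingFetch {
        public static void main(String[] args) throws Exception {
            // One client shared across requests; create it once, not per request.
            HttpClient client = new DefaultHttpClient();
            try {
                HttpGet get = new HttpGet("http://example.com/"); // placeholder URL
                HttpResponse response = client.execute(get);
                HttpEntity entity = response.getEntity();
                if (entity != null) {
                    InputStream in = entity.getContent(); // stream the body, do not buffer it all
                    try {
                        byte[] buffer = new byte[4096];
                        int n;
                        while ((n = in.read(buffer)) != -1) {
                            // Process up to 'n' bytes here; stop reading early once you have enough.
                        }
                    } finally {
                        // Consumes/closes the entity content so the underlying
                        // connection is returned to the connection manager.
                        EntityUtils.consume(entity);
                    }
                }
            } finally {
                // Release all resources held by the client when completely done.
                client.getConnectionManager().shutdown();
            }
        }
    }

Note that this only covers resource handling on the client side; as the thread concludes, a server can still answer 503 to a tight loop of requests even when every response is consumed and released correctly, which points to throttling rather than a resource leak.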