All github action tests of Pywikibot fails due to 429 status code (TOO MANY REQUESTS)
Open, Needs Triage, Public

Description

Since yesterday, all GitHub action tests of pywikibot-ci and windows have been failing with a 429 status code (TOO MANY REQUESTS). The HTTP response text looks like this:

Non-JSON response received from server wikipedia:en for url https://en.wikipedia.org/w/api.php
The server may be down. Status code: 429
The text message is:
Wikimedia Error
Error
Too many requests (f061ab2)
If you report this error to the Wikimedia System Administrators, please include the details below.
Request served via cp3066
cp3066, Varnish XID 373424790
Upstream caches: cp3066 int
Error: 429, Too many requests (f061ab2) at Fri, 09 Jan 2026 10:01:41 GMT
Sensitive client information
IP address: 93.207.188.83
ERROR

Here are some of the details.

Here is a sample of the received headers:

{'content-length': '1965',
 'content-type': 'text/html; charset=utf-8',
 'date': 'Fri, 09 Jan 2026 11:10:06 GMT',
 'nel': '{ "report_to": "wm_nel", "max_age": 604800, "failure_fraction": 0.05, "success_fraction": 0.0}',
 'report-to': '{ "group": "wm_nel", "max_age": 604800, "endpoints": [{ "url": "https://intake-logging.wikimedia.org/v1/events?stream=w3c.reportingapi.network_error&schema_uri=/w3c/reportingapi/network_error/1.0.0" }] }',
 'retry-after': '1',
 'server': 'Varnish',
 'server-timing': 'cache;desc="int-front", host;desc="cp3066"',
 'strict-transport-security': 'max-age=106384710; includeSubDomains; preload',
 'x-analytics': '',
 'x-cache': 'cp3066 int',
 'x-cache-status': 'int-front',
 'x-client-ip': '93.207.188.83',
 'x-request-id': '902116cb-5506-4037-ab57-d98bb5246ad6'}


Event Timeline

Xqt renamed this task from All github action tests of Pywikibot fauls due to 429 status code (TOO MANY REQUESTS) to All github action tests of Pywikibot fails due to 429 status code (TOO MANY REQUESTS).Jan 9 2026, 10:45 AM

What user agent are you using?

I'm not sure having your CI depend on external resources is a good policy; I encourage you to change that long-term. In any case, we don't want to block the work on pywikibot right now.

If you set your user agent in CI to one that respects our UA policy, that should already provide a higher rate-limit.

If that's not enough, we can discuss granting a time-limited exception while you revise how your CI works (and stop depending on production API for it).

What user agent are you using?

pwb.py version for a sample test task gives:

Pywikibot: [https] wikimedia-pywikibot (ecd9fbc, g1, 2026/01/09, 11:08:22, master)
Release version: 11.0.0.dev10
packaging version: 25.0
mwparserfromhell version: 0.7.2
wikitextparser version: 0.56.4
requests version: 2.32.5
cacerts: /opt/hostedtoolcache/Python/3.9.25/x64/lib/python3.9/site-packages/certifi/cacert.pem
  certificate test: ok
Python: 3.9.25 (main, Nov 3 2025, 15:16:36) [GCC 13.3.0]
User-Agent: version (wikipedia:en; User:Pywikibot-test) Pywikibot/11.0.0.dev10 (g1) requests/2.32.5 Python/3.9.25.final.0
PYWIKIBOT_DIR: Not set
PYWIKIBOT_DIR_PWB: /home/runner/work/pywikibot/pywikibot/pywikibot/scripts
PYWIKIBOT_NO_USER_CONFIG: Not set
PYWIKIBOT_TEST_NO_RC: 0
PYWIKIBOT_TEST_RUNNING: 1
PYWIKIBOT_USERNAME: Pywikibot-test
Config base dir: /home/runner/work/pywikibot/pywikibot

The format string looks like

user_agent_format = ('{script_product} ({script_comments}) '
                     '{pwb} ({revision}) {http_backend} {python}')

with

USER_AGENT_PRODUCTS = {
    'python': 'Python/' + '.'.join(str(i) for i in sys.version_info),
    'http_backend': 'requests/' + requests.__version__,
    'pwb': 'Pywikibot/' + pywikibot.__version__,
}

See the function interface and the source
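As a sketch of how those pieces combine, the expansion below fills the format string with version strings quoted elsewhere in this task as illustrative placeholders; the real values are computed at runtime from sys.version_info and the installed packages.

```python
# Illustrative expansion of Pywikibot's user_agent_format; the
# version strings below are placeholders copied from this task,
# not values computed at runtime.
USER_AGENT_PRODUCTS = {
    'python': 'Python/3.13.0',
    'http_backend': 'requests/2.32.5',
    'pwb': 'Pywikibot/11.0.0.dev10',
}

user_agent_format = ('{script_product} ({script_comments}) '
                     '{pwb} ({revision}) {http_backend} {python}')

ua = user_agent_format.format(
    script_product='version',
    script_comments='wikipedia:de; User:Xqtest',
    revision='g20136',
    **USER_AGENT_PRODUCTS,
)
print(ua)
```

With these inputs the result has the same shape as the UA strings quoted in this thread, which shows why the leading `script_product` renders as the literal word "version" when no script name is supplied.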

I have the same problem with a local account which has this UA:
version (wikipedia:de; User:Xqtest) Pywikibot/11.0.0.dev10 (g20136) requests/2.32.5 Python/3.13.0.final.0

I also get that blocker for my normal bot during running redirect.py script:

>>> Talk:Licentiate (Pontifical Degree) <<<
Links to: [[en:Talk:Licentiate (pontifical degree)]].
Skipping: Redirect target [[en:Talk:Licentiate (pontifical degree)]] is not a redirect.
.WARNING: Http response status 429
WARNING: Non-JSON response received from server wikipedia:en for url https://en.wikipedia.org/w/api.php
The server may be down. Status code: 429
The text message is:
Wikimedia Error
Error
Too many requests (f061ab2)
If you report this error to the Wikimedia System Administrators, please include the details below.
Request served via cp3067
cp3067, Varnish XID 58626031
Upstream caches: cp3067 int
Error: 429, Too many requests (f061ab2) at Fri, 09 Jan 2026 12:06:32 GMT
Sensitive client information
IP address: 217.86.198.228
Set geilimit = ['2500']
WARNING: Waiting 5.0 seconds before retrying.
WARNING: Http response status 403
Set geilimit = ['1250']
WARNING: Waiting 10.0 seconds before retrying.

Also for other scripts. The UA looks like this:
User-Agent: version (wikipedia:de; User:Xqbot) Pywikibot/11.0.0.dev10 (g20137) requests/2.32.5 Python/3.11.2.final.0

This user agent is not compliant with our User-Agent policy:

https://foundation.wikimedia.org/wiki/Policy:Wikimedia_Foundation_User-Agent_Policy

which requires you to provide either an email address or a specific URL for your bot.

If you change that, it should allow you a higher rate limit.

> This user agent is not compliant with our User-Agent policy:
>
> https://foundation.wikimedia.org/wiki/Policy:Wikimedia_Foundation_User-Agent_Policy
>
> which requires you to provide either an email address or a specific URL for your bot.
>
> If you change that, it should allow you a higher rate limit.

Pywikibot's UA wasn't changed for 12 years but I'll try to fix it. Where can I check whether the new format string matches?

Hi @Xqt, the page linked above (https://foundation.wikimedia.org/wiki/Policy:Wikimedia_Foundation_User-Agent_Policy) provides some examples of "working" User-Agent headers; specifically, the required format is:

<client name>/<version> (<contact information>) <library/framework name>/<version> [<library name>/<version> ...]
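That shape can be assembled mechanically. The following sketch is purely illustrative: the bot name, URL, email address, and version numbers are invented placeholders, not values from this task.

```python
def build_user_agent(client, version, contacts, libraries):
    """Build a UA string of the policy's shape:
    <client>/<version> (<contact>[; <contact>...]) <lib>/<ver> ...
    Every argument is caller-supplied; nothing is looked up."""
    contact = '; '.join(contacts)
    libs = ' '.join(f'{name}/{ver}' for name, ver in libraries)
    return f'{client}/{version} ({contact}) {libs}'

# Placeholder bot name, user page URL, and email address.
ua = build_user_agent(
    'ExampleBot', '1.0',
    ['https://example.org/wiki/User:ExampleBot', 'bot@example.org'],
    [('Pywikibot', '11.0'), ('requests', '2.32.5')],
)
print(ua)
```

The contact field can hold a URL, an email address, or both separated by a semicolon, which is the format discussed further down in this thread.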

@Fabfur: Currently we have this UA:
version (wikipedia:de; User:Xqtest) Pywikibot/11.0.0.dev10 (g20136) requests/2.32.5 Python/3.13.0.final.0
Would this be appropriate:
version (https://de.wikipedia.org/User:Xqtest) Pywikibot/11.0.0.dev10 (g20136) requests/2.32.5 Python/3.13.0.final.0

> @Fabfur: Currently we have this UA:
> version (wikipedia:de; User:Xqtest) Pywikibot/11.0.0.dev10 (g20136) requests/2.32.5 Python/3.13.0.final.0
> Would this be appropriate:
> version (https://de.wikipedia.org/User:Xqtest) Pywikibot/11.0.0.dev10 (g20136) requests/2.32.5 Python/3.13.0.final.0

I think this is better; if you want to add an email as contact you can do it right after the URL, separating the two with a ;.
I'd also replace the version string with something more meaningful that identifies your work/bot for future reference.

> I think this is better, if you want to add an email as contact you can do it right after the URL, separating the two with a ;.

xx.wikipedia.org/wiki/User:Botname can easily be obtained from the current user configuration. There is no operator email anywhere, and this framework is used by hundreds of users. This bug/change is breaking most bots.

> xx.wikipedia.org/wiki/User:Botname can easily be obtained from the current user configuration. There is no operator email anywhere, and this framework is used by hundreds of users. This bug/change is breaking most bots.

An email address is not mandatory for this; the URL is sufficient. I was only suggesting the string format just in case.

@Joe, @Tgr: Could you please consider postponing the newly introduced restriction until the Pywikibot User-Agent has been updated? As far as I know, the current implementation breaks a large number of bots that rely on it. Postponing the restriction for a short period would allow us maintainers to fix the User-Agent without causing widespread disruptions. Thank you for your consideration.

> @Fabfur: Currently we have this UA:
> version (wikipedia:de; User:Xqtest) Pywikibot/11.0.0.dev10 (g20136) requests/2.32.5 Python/3.13.0.final.0

IMO the wikipedia:de; User:Xqtest part is sufficient to identify the user: it points to de.wikipedia.org/wiki/User:Xqtest, whose operator can be contacted quickly via the talk page or Special:EmailUser if necessary. That is identifying enough…

The policy says:

> If you run a bot, please send a User-Agent header identifying the bot with an identifier that isn't going to be confused with many other bots, and supplying some way of contacting you (e.g. a userpage on the local wiki, a userpage on a related wiki using interwiki linking syntax, a URI for a relevant external website, or an email address)

“Send a User-Agent header identifying the bot … and supplying some way of contacting you (e.g. a userpage on the local wiki, …)” -> the userpage should be sufficient; most of the time the bot operator is required to have a userpage linking their bot account with the main account.

> @Joe, @Tgr: Could you please consider postponing the newly introduced restriction until the Pywikibot User-Agent has been updated? As far as I know, the current implementation breaks a large number of bots that rely on it. Postponing the restriction for a short period would allow us maintainers to fix the User-Agent without causing widespread disruptions. Thank you for your consideration.

You mean that users of pywikibot can't set the user-agent to a string they prefer? I assumed it was always possible to overwrite the UA string in pywikibot, per https://www.mediawiki.org/wiki/Manual:Pywikibot/User-agent
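For reference, the override described on that manual page comes down to a couple of settings in user-config.py. The setting names below mirror the configuration snippet quoted elsewhere in this thread; the description string with its URL and email is a placeholder to be replaced per bot.

```python
# Hypothetical user-config.py fragment. The two setting names mirror
# the configuration quoted elsewhere in this thread; the URL and
# email in the description string are placeholders, not real values.
user_agent_format = ('{script_product} ({script_comments}) '
                     '{pwb} ({revision}) {http_backend} {python}')
user_agent_description = 'ExampleBot/1.0 (https://example.org; bot@example.org)'
```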

To clarify:

  • Users on Toolforge or Cloud VPS are exempt from the limit
  • I only see about 5% of all requests with UA containing User:XXX being rate-limited

But I agree with @revi's point that the string respects the spirit, if not the letter, of the policy. We'll try to add a patch that treats that kind of string as identifying information.

This will mean that your bots will have a higher rate limit, in accordance with the Robot Policy.

Change #1224977 had a related patch set uploaded (by Fabfur; author: Fabfur):

[operations/puppet@production] cache:haproxy: add new contact type

https://gerrit.wikimedia.org/r/1224977

Change #1224977 merged by Fabfur:

[operations/puppet@production] cache:haproxy: add new contact type

https://gerrit.wikimedia.org/r/1224977

We're now allowing this new type of contact information in the User-Agent string; this change should be propagated shortly. Please notify us on this ticket if the situation remains unchanged.

Maybe it's somehow related. I use siddharthvp/mwn for deploying my gadget. But a couple of days ago my deploy started failing due to "ratelimited", even though my queue is less than 50 edits.


I have used this user agent before and it works fine:

InstantDiffsDeployScript/5.0 (https://www.mediawiki.org/wiki/Instant_Diffs)

Even if I add an email, I get the same failed results:

InstantDiffsDeployScript/5.0 (https://www.mediawiki.org/wiki/Instant_Diffs; serdidg@gmail.com)

> Maybe it's somehow related. I use siddharthvp/mwn for deploying my gadget. But a couple of days ago my deploy started failing due to "ratelimited", even though my queue is less than 50 edits.

The error you see is from MediaWiki-level API rate limiting and is thus completely unrelated to the CDN-level rate limits that this task is about.

@Fabfur: I still have the 429 problem with my bot. Here is the HTML content:

WARNING: Http response status 429
WARNING: Non-JSON response received from server wikipedia:de for url https://de.wikipedia.org/w/api.php
The server may be down. Status code: 429
User agent: Xqbot/0.0 (https://de.wikipedia.org/xqbot/; info@gno.de) Pywikibot/11.0
The text message is:
Wikimedia Error
Error
Your bot is making too many requests. Please reduce your request rate or contact bot-traffic@wikimedia.org (f263c81)
If you report this error to the Wikimedia System Administrators, please include the details below.
Request served via cp3067
cp3067, Varnish XID 996851931
Upstream caches: cp3067 int
Error: 429, Your bot is making too many requests. Please reduce your request rate or contact bot-traffic@wikimedia.org (f263c81) at Sat, 10 Jan 2026 13:50:54 GMT
Sensitive client information
IP address: 217.86.198.228
Joe assigned this task to Fabfur.

> @Fabfur: I still have the 429 problem with my bot.

Hi, this is by design. You are exceeding the maximum number of requests allowed to non-identified bots under the Robot Policy; concretely, the code allows 100 non-cached requests over 10 seconds, and exceeding that limit requires you to back off for 10 seconds. If your bot respects the Retry-After HTTP header, you should only get rate-limited on occasion.

If you need to make more requests, you have three options to avoid these limits:

  • Run your bot from Toolforge
  • Authenticate your requests, if your bot has the bot flag set on some wiki
  • Follow the steps indicated here

In your case, I would say the second option is the better one.
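The 100-requests-per-10-seconds budget described above can also be respected client-side with a small sliding-window throttle. The sketch below is a minimal illustration under the limits stated in this comment (which may change); it is not Pywikibot's actual throttle implementation.

```python
import collections
import time

class WindowThrottle:
    """Block before a request would exceed `limit` calls in any
    sliding `window`-second interval. Defaults follow the
    100-requests-per-10-seconds figure quoted in this task."""

    def __init__(self, limit=100, window=10.0):
        self.limit = limit
        self.window = window
        self.stamps = collections.deque()  # timestamps of recent calls

    def wait(self, now=None, sleep=time.sleep):
        """Call once before each request; sleeps if the budget is spent.
        `now` and `sleep` are injectable for testing."""
        now = time.monotonic() if now is None else now
        # Drop timestamps that have fallen out of the window.
        while self.stamps and now - self.stamps[0] >= self.window:
            self.stamps.popleft()
        if len(self.stamps) >= self.limit:
            delay = self.window - (now - self.stamps[0])
            if delay > 0:
                sleep(delay)
                now += delay
            self.stamps.popleft()  # oldest slot has now expired
        self.stamps.append(now)
```

Calling `throttle.wait()` before every API request keeps the client just under the budget instead of hitting the 429 and the mandatory 10-second back-off.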

I assume from your answer that the issue with the CI for pywikibot is fixed. Therefore I will resolve this task.

@Fabfur any idea why my pywikibot script is getting 429 errors every 200 pages it's pulling down? It pulls down about 7 pages a second, which is what pywikibot seems limited to. This is not a lot of pages. Every 200 pages I get this:

WARNING: Http response status 429
ERROR: Traceback (most recent call last):
  File "/opt/homebrew/lib/python3.11/site-packages/pywikibot/data/api/_requests.py", line 702, in _http_request
    response = http.request(self.site, uri=uri,
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/pywikibot/comms/http.py", line 289, in request
    site.throttle.retry_after = int(r.headers.get('retry-after', 0))
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ValueError: invalid literal for int() with base 10: '11.000'
WARNING: Waiting 5.0 seconds before retrying.
WARNING: Http response status 429
ERROR: (same traceback as above)
WARNING: Waiting 10.0 seconds before retrying.

I don't know why pywikibot is having issues with retry-after or why it's ending up as a float. My bot has its user-agent set to include my email, and I'm using a bot account on enwiktionary that has the bot flag set. This only started happening yesterday.
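The ValueError in the traceback comes from int() rejecting the float-formatted value '11.000'. A defensive parse could look like the sketch below; the helper name is invented for illustration, and this is not the fix Pywikibot actually shipped.

```python
def parse_retry_after(value, default=0):
    """Parse a Retry-After header value that may arrive as '5' or
    '11.000'. Returns `default` for missing or unparseable values
    (e.g. the HTTP-date form, which this sketch does not handle)."""
    if value is None:
        return default
    try:
        # int(float(...)) accepts both '11' and '11.000'.
        return max(0, int(float(value)))
    except (ValueError, OverflowError):
        return default
```

With this, a header of '11.000' yields an 11-second wait instead of an unhandled exception in the retry loop.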

> But I agree with @revi's point that the string respects the spirit if not the letter of the policy. We'll try to add a patch to consider that kind of strings identifying information.

It's easy to automatically detect whether a UA has a URL or an email address, but impossible to reliably detect whether it "respects the spirit of the policy". It makes sense to add unofficial exceptions for widely used existing clients, but IMO, policy-wise, it would be better to make a URL or email address a clearer requirement.

@Benwing2 - Thanks for calling our attention to the Retry-After response header format issue. We've made a change that we believe should address this, which should be live everywhere as of roughly 20:30 UTC today. Please let us know if you continue to see the unexpected float-like format.

This is not solved yet for the Pywikibot tests. A significant number of tests are still failing, and I have not been able to find a short-term workaround. It is evident that a read-request throttle is required for any bot running outside Toolforge. Additionally, the current Retry-After value of 1 second (see the header details in this task) does not seem meaningful or sufficient.

See failing CI runs:

Please note that this change also affects bot owners, not only CI. Even a 5% reduction in allowed edits can be significant for some production bots.

> I don't know why pywikibot is having issues with retry-after or why it's ending up as a float. My bot has its user-agent set to include my email and I'm using a bot account on enwiktionary that has the bot flag set. This only started happening yesterday.

@Benwing2: The breaking retry-after change of the backend has been solved on the master branch already, see T414197.

@Xqt we're rolling out a change that should lift the current ratelimiting and impact Pywikibot too; could you please check in ~30 minutes whether you still see the same number of errors?
Thanks

@Fabfur: I can’t reproduce this issue locally, but it still occurs in the Pywikibot tests, though less frequently; see https://github.com/wikimedia/pywikibot/actions/runs/20957207898/job/60237399808 for example. I guess this is caused by a missing user name in the UA, though. This can happen if no username is given in user-config.py. The UA will be improved with T414201 or a subtask thereof.

I found examples with a valid UA that trigger the 429 error code in
https://github.com/wikimedia/pywikibot/actions/runs/20957207898/job/60237399894

For my local bot I use a URL instead of (<site>; User:<user>). Was this patch also reverted?

Hello,

I am encountering the same error with my bot, based on Pywikibot...
I am still unable to fetch more than a few pages from Wikipedia. The requests are authenticated. The bot has been running for about 13 years, but most of my tasks are now blocked...
I added a "user_agent_description" parameter in user-config.py (and also edited pywikibot/comms/http.py to add the "/1.0" string) to customize the user agent.
I currently use the following string:

customscript/1.0 (https://www.wikidata.org/wiki/User:Peter17-Bot; peter017+bot@gmail.com; wikipedia:arz) Pywikibot/10.7.4 (-1 (unknown)) requests/2.32.3 Python/3.10.18.final.0

Can you please explain what I have to change to comply with https://foundation.wikimedia.org/wiki/Policy:Wikimedia_Foundation_User-Agent_Policy ?

Thanks in advance. Best regards

I've been having the same issue this week. My script doesn't edit wikidata at all, just pulls demographic information about various entities. It was working previously, but started failing this week. I've now updated to the new version of pywikibot and ensured my UA is up-to-date re: the policy, but am still running into the 429 error after a few requests. Any help is appreciated!

FWIW I have not been having this issue recently with enwiktionary at least. My Pywikibot settings look like

user_agent_format = ('{script_product} ({script_comments}) '
                     '{pwb} ({revision}) {http_backend} {python}')
user_agent_description = 'WingerBot/1.0 (https://XXXXXX.com; ben@XXXXXX.com)'

where XXXXXX is my own domain. I don't know if having your own domain matters.

@Peter17, @Lupascriptix, @Benwing2: Do your issues involve GitHub action tests? What is the full error message output?

Hi, no to the GitHub action tests. I'm sorry, I thought this was the right place to put this post, but I can make a new task and delete my previous comment if that's a better way to do it.

The error message I'm getting is

WARNING: Http response status 429
WARNING: Non-JSON response received from server wikidata:wikidata for url https://www.wikidata.org/w/api.php?ids=Q6581097&action=wbgetentities&maxlag=5&format=json
The server may be down. Status code: 429

The text message is:

Wikimedia Error
Error
Too many requests (f061ab2)
If you report this error to the Wikimedia System Administrators, please include the details below.
Request served via cp1114
cp1114, Varnish XID 222213378
Upstream caches: cp1114 int
Error: 429, Too many requests (f061ab2) at Fri, 30 Jan 2026 03:34:27 GMT

> Hi, no to the GitHub action tests. I'm sorry, I thought this was the right place to put this post

This ticket is only about GitHub action tests of Pywikibot.
If you get Error: 429, Too many requests, then I'd recommend taking a look at https://www.mediawiki.org/wiki/API:Ratelimit/Wikimedia_sites. If that does not help, please file a new ticket. Thanks a lot!

Change #1237694 had a related patch set uploaded (by Xqt; author: Xqt):

[pywikibot/core@master] UA: use URL to user page in user agent

https://gerrit.wikimedia.org/r/1237694