Hi guys,
I'm pretty sure Pywikibot is not affected by this unless you're using an ancient version or you have forced the bot to HTTP in your configuration. Brandon, do you see any pywikibot-based bots in your logs that would be affected by this?
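For anyone who wants to check their setup: in a pywikibot family file, the protocol() method decides whether the bot talks HTTP or HTTPS. A hedged sketch of what a correct custom family looks like (the family name and host below are made up, and the exact family API varies a bit between pywikibot versions):

    from pywikibot import family

    class Family(family.Family):
        name = 'mywiki'                     # hypothetical family
        langs = {'en': 'wiki.example.org'}  # hypothetical host

        def protocol(self, code):
            # A config that returns 'http' here is exactly the
            # "forced to http" case mentioned above.
            return 'https'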
Maarten
-------- Forwarded Message --------
Subject: [Wikitech-l] Insecure (non-HTTPS) API Requests to become unsupported starting 2016-06-12
Date: Fri, 13 May 2016 22:34:20 +0000
From: Brandon Black bblack@wikimedia.org
Reply-To: Wikimedia developers wikitech-l@lists.wikimedia.org
To: mediawiki-api-announce@lists.wikimedia.org, mediawiki-api@lists.wikimedia.org, Wikimedia developers wikitech-l@lists.wikimedia.org
TL;DR:

* All access to Wikimedia production sites/APIs should use https:// URLs, not http:// -- your bot/tool will break in the near future if it does not!
* 2016-06-12 - insecure access is unsupported; starting on this date we plan to break (deny with 403) 10% of all insecure requests randomly as a wake-up call.
* 2016-07-12 - we plan to break all insecure requests.
Hi all,
As you may remember, all production Wikimedia wikis switched to HTTPS-only for all canonical domain names nearly a year ago: https://blog.wikimedia.org/2015/06/12/securing-wikimedia-sites-with-https/
Since way back then, we've been forcing insecure HTTP requests to our canonical domains over to HTTPS by using redirects and Strict-Transport-Security, which is effective for the vast majority of access from humans using browsers and apps.
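You can see this working for GET with any HTTP client; for example, a quick sketch with the Python requests library (nothing Wikimedia-specific is assumed here):

    import requests

    # The insecure GET is redirected, and the secure response carries a
    # Strict-Transport-Security header telling browsers to stay on HTTPS.
    resp = requests.get("http://en.wikipedia.org/wiki/Main_Page")
    print(resp.url)                                       # https://...
    print(resp.headers.get("Strict-Transport-Security"))  # max-age=...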
In the time since, we've been chasing down various corner-case issues where loopholes may arise in our HTTPS standards and enforcement. One of the most-difficult loopholes to close has been the "Insecure POST" loophole, which is discussed in our ticket system here: https://phabricator.wikimedia.org/T105794 .
To briefly recap the "Insecure POST" issue:
* Most of our humans using browser UAs are not affected by it. They start out doing GET traffic to our sites, their GETs get redirected to HTTPS if necessary, and any POSTs their browsers then issue use protocol-relative URIs, which are also HTTPS.
* However, many automated/code UAs (bots, tools, etc.) access the APIs with initial POST requests to hardcoded service URLs using HTTP rather than HTTPS.
* For all of the code/library UAs out there in the world, there is no universally-compatible way to redirect them to HTTPS. There are different ways that work for some UAs, but many UAs used for APIs don't handle redirects at all.
* Regardless of the above, even if we could reliably redirect POST requests, that doesn't fix the security problem like it does with GET: the private data has already been leaked in the initial insecure request before we have a chance to redirect it (see the short illustration after this list). If we did some kind of redirect first, we'd still just be putting off the inevitable future date where we have to go through a breaking transition to secure the data.
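To make that last point concrete, here is a small illustration in Python with the requests library (the endpoint is real; what it returns will depend on when you run this relative to the dates below):

    import requests

    # The POST body below leaves the client in cleartext *before* any
    # response -- redirect or otherwise -- can arrive, so redirecting
    # afterwards cannot un-leak it.
    resp = requests.post(
        "http://en.wikipedia.org/w/api.php",
        data={"action": "query", "meta": "siteinfo", "format": "json"},
        allow_redirects=False,
    )
    print(resp.status_code, resp.headers.get("Location"))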
Basically, we're left with no good way to upgrade these insecure requests without breaking them. The only way it gets fixed is if all of our API clients in the world use explicit https:// URLs for Wikimedia sites in all of their code and configuration, and the only way we can really force them to do so is to break insecure POST requests by returning a 403 error to tools that don't.
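The client-side fix is a one-line change once found; a minimal sketch, again with requests:

    import requests

    # Hardcode the secure scheme in code and configuration.
    API_URL = "https://en.wikipedia.org/w/api.php"

    resp = requests.post(API_URL, data={
        "action": "query",
        "meta": "siteinfo",
        "format": "json",
    })
    resp.raise_for_status()
    print(resp.json()["query"]["general"]["sitename"])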
Back in July 2015, I began making some efforts to statistically sample the User-Agent fields of clients doing "Insecure POST" and to track down the most-prominent offenders. We were able to find and fix many clients along the way.
A few months ago Bryan Davis got us further when he committed a MediaWiki core change to let our sites directly warn offending clients. I believe that went live on Jan 29th of this year ( https://gerrit.wikimedia.org/r/#/c/266958 ). It allows insecure POSTs to still succeed, but sends the clients a standard warning that says "HTTP used when HTTPS was expected".
This actually broke some older clients that weren't prepared to handle warnings at all, and caused several clients to upgrade. We've been logging offending UAs and accounts which trigger the warning via EventLogging since then, but after the initial impact the rate flattened out again; clients and/or users that didn't notice the warning fairly quickly likely never will.
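A hedged sketch of how a client can surface that warning instead of ignoring it (the shape of the "warnings" object assumes format=json with the default format version):

    import requests

    resp = requests.post("https://en.wikipedia.org/w/api.php",
                         data={"action": "query", "format": "json"})
    # Warnings arrive alongside the normal result rather than as errors.
    for module, warning in resp.json().get("warnings", {}).items():
        print("API warning from %s: %s" % (module, warning.get("*", warning)))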
Many of the remaining UAs we see in logs are simply un-updated. For example, https://github.com/mwclient/mwclient switched to HTTPS-by-default in 0.8.0, released in early January, but we're still getting lots of insecure POST from older mwclient versions installed out there in the world. Even in cases where the code is up to date and supports HTTPS properly, bot/tool configurations may still have hardcoded http:// site config URLs.
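For mwclient specifically, upgrading gets you HTTPS by default, and even older releases can be pinned to HTTPS in configuration by passing an explicit (scheme, host) tuple:

    import mwclient

    # mwclient 0.8.0+ defaults to HTTPS:
    site = mwclient.Site('en.wikipedia.org')

    # Older releases accept an explicit scheme:
    site = mwclient.Site(('https', 'en.wikipedia.org'))
    print(site.site['sitename'])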
We're basically out of "soft" ways to finish up this part of the HTTPS transition, and we've stalled long enough on this.
** 2016-06-12 is the selected support cutoff date **
After this date, insecure HTTP POST requests to our sites are officially unsupported. This date is:
* A year to the day after the public announcement that our sites are HTTPS-only
* ~11 months after we began manually tracking down top offenders and getting them fixed
* ~4 months after we began sending warning messages in the response to all insecure POST requests to the MW APIs
* ~1 month after this email itself
On the support cutoff date, we'll begin emitting a "403 Insecure POST Forbidden - use HTTPS" failure for 10% of all insecure POST traffic (randomly-selected). Some clients will retry around this, and hopefully the intermittent errors will raise awareness more-strongly than the API warning message and this email did.
A month later (two months out from this email), on 2016-07-12, we plan to break insecure access completely (all insecure requests will get the 403 response).
In the meantime, we'll be trying to track down offending bots/tools from our logs and to contact owners who haven't seen these announcements. Our Community team will be helping us communicate this message more-directly to affected Bot accounts as well.
Thank you all for your help during this transition!

--
Brandon Black
Sr Operations Engineer
Wikimedia Foundation
Hi folks,
Looking at the current Travis builds [1], the core 2.0 branch looks deprecated. A lot of bug fixes aren't being backported, and the last commits are a few months old. It is getting more and more difficult to keep 2.0 synchronized with the master branch, or to keep the bot up to date. Any ideas?
Best xqt
[1] https://travis-ci.org/wikimedia/pywikibot-core/builds/129633211
Can you provide some background info on what the 2.0 branch is and what it's for?
I've been doing all of my development off master, and in general, I prefer a single-mainline development structure.
On Sat, May 14, 2016 at 6:39 AM info@gno.de wrote: