Just wanted to give everyone a heads-up that Bingle/Bugello are broken after yesterday's BZ upgrade. At a quick glance it appears related to BZ's move to a shared host - the tools are dying with this exception:
requests.exceptions.SSLError: hostname 'bugzilla.wikimedia.org' doesn't match either of '*.planet.wikimedia.org', 'planet.wikimedia.org'
I'll be digging into this shortly and will update when I have more info and/or resolution.
I can make the tools work by setting SSL verification to False, but this doesn't seem like the soundest approach. The SSL cert info from https://bugzilla.wikimedia.org looks valid to me - anyone know what might be causing this?
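For reference, the workaround is just turning off certificate verification on the requests calls - a minimal sketch (the URL is the only thing taken from the tools; everything else is illustrative):

import requests

# verify=False silences the hostname-mismatch SSLError, but it also disables
# certificate checking entirely, so it is only a stopgap, not a fix.
resp = requests.get('https://bugzilla.wikimedia.org', verify=False)
print(resp.status_code)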
Even setting verify=False on requests to https://bugzilla.wikimedia.org is resulting in some weirdness. I have a hunch this is because tool labs has an extremely outdated version of the Python Requests library, which has a variety of known SSL negotiation issues. I've filed a bugzilla ticket to get Requests upgraded on tool labs: https://bugzilla.wikimedia.org/show_bug.cgi?id=61334
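For anyone curious, checking what is actually installed is quick to do from the tool's environment (a sketch; it only assumes the stock interpreter and the system-wide requests package):

import sys
import requests

# Anything pre-2.x predates a number of upstream SSL-related fixes.
print('python:', sys.version.split()[0])
print('requests:', requests.__version__)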
Arthur,
I think I know the issue here. zirconium is indeed a shared host: it runs several miscellaneous web services over HTTPS on a single IP, so we rely on clients speaking SNI to get the correct virtual host.
Java 6 and IE on XP are among the few clients that don't.
I think your applications are Java and don't speak SNI, so they are getting the first virtual host, which is Planet.
This can be fixed in one of these ways:
Use Java 7, which should support SNI; see e.g. http://stackoverflow.com/questions/12361090/server-name-indication-sni-on-ja...
Quote: "on Java 7 use
new URL("https://cmbntr.sni.velox.ch/").openStream()
until HTTPCLIENT-1119 is fixed"
Or I can cheat by changing the order in which Apache loads the site configs, e.g. make it
sites-enabled/001-Bugzilla, ./002-Planet, etc. Then clients that don't speak SNI get Bugzilla (Planet users wouldn't get their planet, but Bugzilla seems more important).
Or we would have to get an extra IP address just for Bugzilla.
Bingle is actually a Python tool: https://github.com/awjrichards/bingle
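If SNI support on the client side is the question, the Python stack can answer it directly - a quick diagnostic sketch (nothing here comes from the Bingle code; it only pokes at the interpreter's ssl module):

import ssl

# Whether this interpreter's ssl module was built with SNI support at all,
# plus the OpenSSL it links against.
print('OpenSSL:', ssl.OPENSSL_VERSION)
print('SNI supported:', getattr(ssl, 'HAS_SNI', False))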
Arthur, sorry - I spent a couple of minutes brainstorming and came up empty. Keep us updated and I'll take a more serious look if the problem persists.
Either way it's probably not a bad idea to change the order in which Apache loads the sites, to make Bugzilla the default now. If somebody doesn't get what they want, at least they get the bug tracker.
Thanks everyone, Daniel, that makes sense to me.
Some quick sleuthing shows that the requests library has some SSL negotiation issues resolved in 2.x (tool labs is currently running 1.1.0), though I'm not quickly finding specifics. I am hoping an upgrade will make things Just Work but changing the load order of the Apache sites sounds like a smart thing to do regardless. Do we need an RT ticket for that?
Should be fixed by this:
https://gerrit.wikimedia.org/r/#/c/113265/1
Try again?
That seems to have done it - thanks a ton Daniel!
Actually, I semi-take that back. I am still getting some exceptions like:
requests.exceptions.SSLError: [Errno 1] _ssl.c:504: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
But at least it's not complaining about a domain mismatch anymore.
At first I thought this might be an issue of CA keys not being on tool labs, but that does not appear to be the case. Run from the same host as the tool in question:
local-bingle@tools-login:~$ curl -v https://bugzilla.wikimedia.org
* About to connect() to bugzilla.wikimedia.org port 443 (#0)
*   Trying 208.80.154.41... connected
* successfully set certificate verify locations:
*   CAfile: none
    CApath: /etc/ssl/certs
* SSLv3, TLS handshake, Client hello (1):
* SSLv3, TLS handshake, Server hello (2):
* SSLv3, TLS handshake, CERT (11):
* SSLv3, TLS handshake, Server finished (14):
* SSLv3, TLS handshake, Client key exchange (16):
* SSLv3, TLS change cipher, Client hello (1):
* SSLv3, TLS handshake, Finished (20):
* SSLv3, TLS change cipher, Client hello (1):
* SSLv3, TLS handshake, Finished (20):
* SSL connection using RC4-SHA
* Server certificate:
*   subject: serialNumber=BhQHbaOWi1kF5o57ZgySvt3TVywIQOGI; OU=GT90855227; OU=See www.rapidssl.com/resources/cps (c)13; OU=Domain Control Validated - RapidSSL(R); CN=bugzilla.wikimedia.org
*   start date: 2013-11-03 18:52:33 GMT
*   expire date: 2017-11-05 19:36:25 GMT
*   subjectAltName: bugzilla.wikimedia.org matched
*   issuer: C=US; O=GeoTrust, Inc.; CN=RapidSSL CA
*   SSL certificate verify ok.
The only other thing I can think of is my original theory of an issue with the out of date Python Requests library. Anybody else have ideas?
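For what it's worth, the stopgap I'm considering in the meantime (just a sketch - the bundle path is an assumption about the tool labs Ubuntu image) is to point requests' verify argument at the system certificate store that curl is evidently happy with, which keeps verification enabled while sidestepping whatever bundle requests ships:

import requests

# Assumed location of the consolidated system CA bundle on Ubuntu/Debian.
SYSTEM_CA_BUNDLE = '/etc/ssl/certs/ca-certificates.crt'

# verify also accepts a path to a CA bundle, so certificate checking stays on.
resp = requests.get('https://bugzilla.wikimedia.org', verify=SYSTEM_CA_BUNDLE)
print(resp.status_code)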
python-requests bundles its own cacert list (although the Ubuntu .deb version might use the central certificate store - not sure about that), which might be outdated. Some older cacert lists have issues with RapidSSL certificates (this is an issue with the list bundled with httplib2, for example).
Merlijn
We use pywikibot's httplib2, which solves the issue Merlijn mentions above. For requests, though, the last commit message on their cacert.pem file makes me like your original thought that requests should be updated:
https://github.com/kennethreitz/requests/blob/master/requests/cacert.pem
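To double-check which bundle the installed copy actually consults, something like this should do it (assuming the installed requests exposes requests.certs.where(), which the releases I've looked at do):

import datetime
import os

import requests.certs

# Path and age of the CA bundle requests uses by default with verify=True.
bundle = requests.certs.where()
print('bundled cacert:', bundle)
print('last modified:', datetime.datetime.fromtimestamp(os.path.getmtime(bundle)))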