Back in May 2004, Gabriel Wicke was creating a neat new skin called Monobook. Unlike the old skins, it used good semantic markup with CSS 2 for style. Gabriel made sure to test in a lot of browsers, and wrote files of extensive fixes for the browsers that had problems.
One such browser was Konqueror, the default KDE browser, which even relatively web-savvy people have barely heard of, let alone used. He was nice and checked whether his new skin worked properly in it anyway. Since it didn't, he committed a quick fix in r3532 to eliminate some horizontal scrollbars. Then everyone forgot about it, because nobody uses KHTML.
It turns out there was a slight problem with his fix. He loaded it based on this code:
var is_khtml = (navigator.vendor == 'KDE' || ( document.childNodes && !document.all && !navigator.taintEnabled ));
The problem here is pretty straightforward. A bug fix is being loaded, without checking to see whether the bug exists. The fix is loaded for all versions of KHTML past, present, and *future*. If the KHTML devs fixed the bug, then they'd have a bogus stylesheet being loaded that would mess up their display, and they couldn't do anything about it.
Well, nobody much used or uses KHTML. But it just so happens that in 2003, Apple debuted a new web browser based on a fork of KHTML. And in 2008, Google debuted another browser based on the same rendering engine. Add them together and they now have 6% market share or more. And we've still been serving them this broken KHTMLFixes.css file, working around a problem that was fixed eons ago.
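To see why the sniff catches them, here's a minimal sketch of how it evaluates against WebKit-style navigator and document objects. The stubs below are illustrative stand-ins for a real browser's globals, relying on the same assumptions the sniff makes (WebKit implements document.childNodes but, like KHTML, lacks document.all and navigator.taintEnabled):

```javascript
// Illustrative stubs standing in for a WebKit browser's globals:
// an Apple vendor string, a childNodes property, and no document.all
// or navigator.taintEnabled -- just like KHTML.
var navigator = { vendor: 'Apple Computer, Inc.' };
var document = { childNodes: [] };

var is_khtml = ( navigator.vendor == 'KDE' ||
    ( document.childNodes && !document.all && !navigator.taintEnabled ) );

// is_khtml comes out truthy, so Safari and Chrome get the KHTML "fixes" too.
```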
Just recently, in WebKit r47255, they changed their code to better match other browsers' handling of "almost standards mode". They removed some quirk that was allowing them to render correctly despite the bogus CSS we were serving them. And so suddenly they're faced with the prospect of having to use a site-specific hack ("if path ends in /KHTMLFixes.css, ignore the file") because we screwed up. See their bug here: https://bugs.webkit.org/show_bug.cgi?id=28350
I had already killed KHTMLFixes.css in r53141, but it's still in every MediaWiki release since 1.5. And this isn't the only time this has happened. A while back, someone committed some fixes for right-to-left display in Opera. The fixes were loaded for, yes, Opera version 9 or greater, or some similar check. When I checked in Opera 9.6, I found that the fix was degrading the display, not improving it.
Sometimes we need to do browser sniffing of some kind, because sometimes browsers don't implement standards properly. There are two ways to do it that are okay:
1) Capability testing. If possible, just check directly whether the browser can do it. This works best with JS functionality, for instance in getElementsByClassName in wikibits.js:
if ( typeof( oElm.getElementsByClassName ) == "function" ) { /* Use a native implementation where possible FF3, Saf3.2, Opera 9.5 */
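In full, a capability test like that might look something like the sketch below. The native-detection line is the pattern from wikibits.js; the fallback loop is my own illustration of what the non-native branch could do, not MediaWiki's actual code:

```javascript
// A capability-tested getElementsByClassName: use the browser's native
// implementation when it exists, otherwise fall back to walking the DOM.
function getElementsByClassName( oElm, sClassName ) {
    if ( typeof( oElm.getElementsByClassName ) == 'function' ) {
        // Use a native implementation where possible (FF3, Saf3.2, Opera 9.5)
        return oElm.getElementsByClassName( sClassName );
    }
    // Fallback for older browsers: check every element's class attribute.
    var results = [];
    var all = oElm.getElementsByTagName( '*' );
    for ( var i = 0; i < all.length; i++ ) {
        var classes = ( all[i].className || '' ).split( /\s+/ );
        for ( var j = 0; j < classes.length; j++ ) {
            if ( classes[j] == sClassName ) {
                results.push( all[i] );
                break;
            }
        }
    }
    return results;
}
```

The point is that the branch is chosen by what the browser can actually do, so a future browser that gains the native method automatically stops getting the fallback.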
It can also be used in other cases sometimes. For instance, in r53347 I made this change:
- // TODO: better css2 incompatibility detection here
- if(is_opera || is_khtml || navigator.userAgent.toLowerCase().indexOf('firefox/1')!=-1){
- return 30; // opera&konqueror & old firefox don't understand overflow-x, estimate scrollbar width
+ // For browsers that don't understand overflow-x, estimate scrollbar width
+ if(typeof document.body.style.overflowX != "string"){
+ return 30;
Instead of using a hardcoded list of browsers that didn't support overflow-x, I checked whether the overflowX property existed. This isn't totally foolproof, but it sure beats assuming that no future version of Opera or KHTML will support overflow-x. (I'm pretty sure both already do, in fact.)
2) "Version <= X." If it's not reasonable to check capabilities, then at least allow browser implementers to fix their bugs in future versions. If you find that all current versions of Firefox do something or other incorrectly, then don't serve incorrect content to all versions of Firefox. In that case, during Firefox 3.6 development, they'll find out that their improvements to standards compliance cause Wikipedia to break! Instead, serve incorrect content to Firefox 3.5 or less, and standard markup to all greater versions. That way, during 3.6 development, they'll find out that their *failure* to comply with standards causes Wikipedia to break. With any luck, that will encourage them to fix the problem instead of punishing them. It's not as good as being able to automatically serve the right content if they haven't fixed things, but it's better than serving bad content forever.
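As a sketch of the version-capped approach (the function name, the regex, and the 3.5 cutoff below are all illustrative assumptions, not MediaWiki's actual code):

```javascript
// Apply a workaround only to Firefox versions known to have the bug,
// so future versions that fix it get the standard markup automatically.
// (Illustrative sketch: the regex and the 3.5 cutoff are assumptions.)
function needsFirefoxWorkaround( userAgent ) {
    var match = /Firefox\/(\d+)\.(\d+)/.exec( userAgent );
    if ( !match ) {
        return false; // not Firefox at all, serve standard content
    }
    var major = parseInt( match[1], 10 );
    var minor = parseInt( match[2], 10 );
    // Bug confirmed up through 3.5; assume 3.6 and later will fix it.
    return major < 3 || ( major == 3 && minor <= 5 );
}
```

Contrast this with the KHTML check above: the cap means the worst case for a fixed browser is seeing correct content, not broken content forever.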
I tried to remove some browser-sniffing from wikibits.js, but there's undoubtedly some I missed. Especially with the large amounts of JS being added recently for usability/new upload/etc., could everyone *please* check to make sure that there are no broken browser checks being committed? This kind of thing hurts our users in the long term (especially third parties who don't upgrade so often), and is really unfair to browser developers who are trying to improve their standards compliance. Thanks!