I visited en.wikivoyage.com by mistake, and discovered that
it appears to serve up Wikivoyage-old's content, but from
188.8.131.52, which whois identifies as being in the netblock
HETZNER-RZ10 operated by Hetzner Online AG.
The wikivoyage.com domain itself has a last-modified date of 2012-10-14.
Even more puzzlingly, the PTR record for 184.108.40.206 currently points
Can anyone cast any light on this? Is this officially sanctioned by the
Basics: We rolled out 1.21wmf5 to the non-Wikipedia sites today, after a
brief reversion and re-deployment to fix breakage in how we were
displaying some styling. We are on track to deploy 1.21wmf5 to English
Wikipedia on Monday, December 3 per
Below: why this happened, how it got fixed, and what we should change
to prevent problems like this in the future.
https://gerrit.wikimedia.org/r/#/c/30361/ changed the headings in the
Vector skin. The new code didn't take the WMF configuration into account,
as the author wasn't expecting styles and HTML to be cached for so long.
The headings were changed from "h4"/"h5", but the CSS used those tag
names to identify them (instead of using CSS classes). Which means, as
you would expect, that the page layout breaks for up to 30 days.
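To make the tag-vs-class problem concrete, here is a minimal sketch (the selectors and class name are hypothetical illustrations, not the actual Vector rules):

```css
/* Fragile: tied to the tag name. Cached HTML that still contains
   <h4> loses its styling as soon as new CSS targets a different tag. */
div#p-personal h4 { font-size: 0.9em; }

/* Robust: tied to a class, so the rule keeps matching whether the
   cached HTML uses <h4> or the new markup uses something else. */
div#p-personal .menu-heading { font-size: 0.9em; }
```

With class-based selectors, the CSS deployed via ResourceLoader stays compatible with whatever HTML is still sitting in the Squid cache.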
Page cache is controlled by the wiki page content. Unless the page is
modified, the cache is kept for up to 30 days for anonymous users.
Resource modules, however, are served by ResourceLoader, which has its
own much more efficient and rapidly deployable cache mechanism. But this
means that the resources for the skin are deployed globally and
site-wide within 5 minutes, whereas the HTML isn't refreshed for another 2 weeks.
The issues this caused were visible in beta labs for the last three
days, but none of us realized they were significant; we thought they
were caused by a misconfigured memcached; see
We knew that this particular change and the related change
https://gerrit.wikimedia.org/r/#/c/34702/ might be problematic and sent
out a note about it on Monday --
-- but it looks like we didn't test thoroughly enough on Monday and
Tuesday to catch it before the Wednesday deploy. Only anonymous users
would have been affected, since we don't cache pages for logged-in
users in Squid; so logged-in users didn't notice problems on
mediawiki.org and test2.wikipedia.org after the first deploy.
Problems popped up after the Phase 2 deployment to non-Wikipedia sites,
so we reverted the 1.21wmf5 deployment and then redeployed once fixes
were in. Gerrit changes: https://gerrit.wikimedia.org/r/#/c/35819 ,
What we should fix for the future:
* This is why client resources must always be backwards compatible.
"Don't change the HTML in incompatible ways" is probably a good
general rule to live by -- but having an easy way to say "start purging
all pages on $theseWikis from Squid/Varnish" would also be nice.
* Get more manual testing on test2.wikipedia.org and mediawiki.org
immediately after Phase I deployment, including as an anonymous reader
and editor, to ensure we catch Squid caching issues.
* Train more people to review code well, to reduce the backlog and
catch these kinds of problems.
* Get more people to +2 in core and in important extensions.
* Make beta labs trustworthy enough that this sort of breakage
immediately becomes a blocker.
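The "start purging all pages on $theseWikis from Squid/Varnish" idea above amounts to sending an HTTP PURGE request to each cache frontend for every affected URL. A minimal sketch of building one such request (the function name and example URL are illustrative only; MediaWiki's actual purge logic lives in its Squid-update code, not here):

```python
from urllib.parse import urlparse

def build_purge_request(url):
    """Build the raw HTTP PURGE request that a cache-invalidation
    pass would send to a Squid/Varnish frontend for one URL."""
    parsed = urlparse(url)
    path = parsed.path or "/"
    if parsed.query:
        path += "?" + parsed.query
    return (
        f"PURGE {path} HTTP/1.0\r\n"
        f"Host: {parsed.netloc}\r\n"
        f"Connection: close\r\n\r\n"
    )

# Example: the request that would be sent for one wiki page.
req = build_purge_request("http://test2.wikipedia.org/wiki/Main_Page")
print(req.splitlines()[0])  # PURGE /wiki/Main_Page HTTP/1.0
```

A real tool would loop over every page URL on the target wikis and send this to each frontend, which is exactly why a supported "purge everything on $theseWikis" switch would be more practical than waiting out the 30-day cache.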
Chris McMahon's take: (for what it's worth, this seems to me to be
a sign that beta labs is becoming more and more trustworthy all the
time. The more we actually use it, the more we'll understand what does
and does not work there. We fixed the memcached problem, which restored
the ability to log in, but didn't investigate the display problems
because we're used to beta not being very reliable. In this case, beta
was reliable, and we didn't understand that. Even with a bug report in
Bugzilla with 9 subscribers, no one recognized it as a real issue.)
Chris McMahon said: I think this could be framed as an issue of signal,
noise, and bandwidth. Beta labs being broken a lot, review backlog in
gerrit, false failures in tests are all noise. Given the constraints of
ongoing projects, it is difficult to pick out the signal from the noise.
We can take steps to reduce the noise so that the signal stands out
more by reducing technical debt: make the tests green, make the test
environment robust, keep up with code review.
(I assembled this just now from IRC & mailing list chatter from several
people, and errors are mine -- sorry for missing attributions here.
Drafting was on http://etherpad.wmflabs.org/pad/p/nov-28-2012-deploysnafu )
Engineering Community Manager
Tomorrow, we have two topics on the agenda for the weekly tech chat / brown bag:
* a mobile QA guest speaker: Pete Hodgson, with Khali Young from
ThoughtWorks, will talk about automated testing (and cross-platform
development strategies). See Pete's blog at: http://blog.thepete.net/
* Sumana is planning a walkthrough on "how to fix a bug" from start to
finish, suitable for newbie developers.
As always, it will be live-streamed, recorded, transmogrified, etc.
Sign up here, come in person, or join #wikimedia-dev tomorrow:
VP of Engineering and Product Development, Wikimedia Foundation
Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
FYI. I will not be posting release announcements here, but feedback on
the implementation is welcome.
Code is at: https://gerrit.wikimedia.org/r/gitweb?p=translatewiki.git;a=tree;f=melange
Maybe there is interest in making it more generic and using it to build
other extension bundles too. In that case we could create a separate
repo for it. Some known TODOs off the top of my head:
* Signing the archive
* Running the PHPUnit tests is currently broken
* QUnit tests need to be run manually
---------- Forwarded message ----------
The Wikimedia Language Engineering team is pleased to announce the
first release of the MediaWiki Language Extension Bundle. The bundle
is a collection of selected MediaWiki extensions needed by any wiki
that wants to be multilingual.
This first bundle release (2012.11) is compatible with MediaWiki 1.19,
1.20 and 1.21alpha.
Get it from https://www.mediawiki.org/wiki/MLEB
The Universal Language Selector is a must-have, because it provides
essential functionality for any user regardless of the number of
languages they speak: language selection, font support for displaying
scripts poorly supported by operating systems, and input methods for
typing in languages that don't use the Latin (a-z) alphabet.
Maintaining multilingual content in a wiki is a mess without the
Translate extension, which is used by Wikimedia, KDE and
translatewiki.net, where hundreds of pieces of documentation and
interface translations are updated every day; with Localisation Update
your users will always have the latest translations freshly out of the
oven. The Clean Changes extension keeps your recent changes page
uncluttered from translation activity and other distractions.
Don't miss the chance to practice your rusty language skills and use
the Babel extension to mark the languages you speak and to find other
speakers of the same language in your wiki. And finally the cldr
extension is a database of language and country translations.
We are aiming to make a new release every month, so that you can easily
stay on the cutting edge of the constantly improving language support.
The bundle comes with clear installation and upgrade instructions. The
bundle is tested against MediaWiki release versions, so you can avoid
most of the temporary breakage you would hit if you were using the
latest development versions instead.
Because this is our first release, there may be some rough edges.
Please give us plenty of feedback so that we can improve for the next release.
FYI, for developers who might be interested.
---------- Forwarded message ----------
From: Steven Walling <swalling(a)wikimedia.org>
Date: Wed, Nov 28, 2012 at 8:50 PM
Subject: IRC office hours about account creation and login redesign
To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
As you might have noticed, especially if you're an English Wikipedian, the
Editor Engagement Experiments team has been working on redesigning the
user experience of account creation, A/B testing new designs and
functionality for the past couple of months.
We finished our final A/B test last week, and we're now moving on to
make the features which tested well permanent. In order to make sure
that the experience of signup and login are consistent, we also plan to
make some changes to the design of login.
In order to answer any questions people might have and gather feedback,
we're holding the first office hours about our redesign work. We also plan
to enable the test version of the new account creation experience at 100%
(rather than 50/50, as previously) so that people can give it a try.
When: Saturday December 1, 2012. 19:00-20:00 UTC. Time conversion links
etc. are on Meta.
Here is a first stab at a draft proposal to organize our volunteer QA
activities. Written after some discussions with Chris and Sumana, and
after reading a bunch of related wiki pages. Your feedback is welcome.
Ideally this _theory_ will be immediately applicable to some pilots that
we can run in the upcoming weeks. The Language and Mobile teams seem
ready to give it a try - maybe even before the end of the year. Visual
Editor and Editor Engagement teams might come next in January.
The door is open for any other project willing to run QA activities with
volunteers. Just let me know.
Technical Contributor Coordinator
These questions are prompted by Gerrit review comments, but Gerrit
seems like a hidden place to discuss them.
The specific background:
A task I am on uses an external library licensed under the Apache 2.0
license. MediaWiki core is shipped under a GPL2 license, so I'm guessing
that is the default licence Foundation work is released under.
According to http://en.wikipedia.org/wiki/Apache_License#GPL_compatibility ,
Apache licenses are compatible with GPL3 but not GPL2.
Is my assumption that by default we put everything under GPL2 right?
Are other FS/OSS licences OK to use?
How paranoid are we? I.e., do we make a good faith effort at getting it right,
or do we refer questions to internal counsel for a slower but safer answer?
In this case my inclination is to licence the whole extension (containing
the external library) as Apache 2.0, but I'm happy to defer to normal process
if there is one.