I'm happy to announce the availability of the second beta release of the
new MediaWiki 1.19 release series.
Please try it out and let us know what you think. Don't run it on any
wikis that you really care about, unless you are both very brave and
very confident in your MediaWiki administration skills.
MediaWiki 1.19 is a large release that contains many new features and
bug fixes. This is a summary of the major changes of interest to users.
You can consult the RELEASE-NOTES-1.19 file for the full list of changes
in this version.
Five security issues were discovered.
It was discovered that the API had a cross-site request forgery (CSRF)
vulnerability in the block/unblock modules. It was possible for a user
account with block privileges to block or unblock another user without
providing a token.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34212
It was discovered that the resource loader can leak certain kinds of private
data across domain origin boundaries, by providing the data as an executable
JavaScript file. The leaked data can include CSRF protection tokens. This
allows compromise of the wiki's user accounts, say by changing the user's
email address and then requesting a password reset.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34907
Jan Schejbal of Hatforce.com discovered a cross-site request forgery (CSRF)
vulnerability in Special:Upload. Modern browsers (since at least as early as
December 2010) are able to post file uploads without user interaction,
violating previous security assumptions within MediaWiki.
Depending on the wiki's configuration, this vulnerability could lead to
compromise, especially on private wikis where the set of allowed file types
is broader than on public wikis. Note that CSRF allows a wiki to be
compromised from an external website even if the wiki is behind a firewall.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35317
George Argyros and Aggelos Kiayias reported that the method used to generate
password reset tokens is not sufficiently secure. MediaWiki now instead uses
various secure random number generators, depending on what is available on
the platform. Windows users are strongly advised to install either the
openssl extension or the mcrypt extension for PHP so that MediaWiki can take
advantage of the cryptographic random number facility provided by Windows.
Any extension developers using mt_rand() to generate random numbers in
contexts where security is required are encouraged to instead make use of
the MWCryptRand class introduced with this release.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35078
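As an illustration, here is a minimal sketch of the replacement, assuming
the MWCryptRand interface introduced in 1.19 (generateHex() and
wasStrong()):

    // Instead of a token derived from mt_rand(), draw 32 hex characters
    // from the strongest random source available on the platform.
    $token = MWCryptRand::generateHex( 32 );

    // Optionally check whether a cryptographically strong source was used.
    if ( !MWCryptRand::wasStrong() ) {
        wfDebug( "Random token was generated from a weak entropy source.\n" );
    }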
A long-standing bug in the wikitext parser (bug 22555) was discovered to
have security implications. In the presence of the popular CharInsert
extension, it leads to cross-site scripting (XSS). XSS may be possible with
other extensions, or perhaps even with the MediaWiki core alone, although
this is not confirmed at this time. A denial-of-service attack (infinite
loop) is also possible regardless of configuration.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35315
MediaWiki 1.19 brings the usual host of bug fixes and new features. A
comprehensive list of what's new is in the release notes.
* Bumped MySQL version requirement to 5.0.2.
* Disabled the partial HTML and MathML rendering options for Math;
it now renders as PNG by default.
* MathML mode was so incomplete most people thought it simply didn't work.
* New skins/common/*.css files usable by skins, instead of having to copy
generic styles from MonoBook or Vector's CSS.
* The default user signature now contains a talk link in addition to the
user page link.
* Searching for blocked usernames in the block log is now clearer.
* Better timezone recognition in user preferences.
* Extensions can now participate in the extraction of titles from URL paths.
* The command-line installer supports various RDBMSes better.
* The interwiki links table can now be accessed also when the interwiki
cache is used (used in the API and the Interwiki extension).
* More gender support (for instance in user lists).
* Added language: Canadian English.
* Language converter improved, e.g. it now works depending on the page
content language.
* Time and number-formatting magic words also now depend on the page
content language.
* Bidirectional support further improved after 1.18.
Full release notes:
Coinciding with these security releases, the MediaWiki source code has
moved from SVN (at https://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3)
to Git (https://gerrit.wikimedia.org/gitweb/mediawiki/core.git). So the
commits for these releases will not be appearing in our SVN repository. If
you are using SVN checkouts of MediaWiki for version control, you will need
to migrate these to Git.
If you are using tarballs, there should be no change in the process for you.
Please note that all WMF-deployed extensions have been migrated to Git as
well, along with some other non-WMF-maintained ones.
Please bear with us: some of the Git-related links for this release may not
work yet, but should later on.
To do a simple Git clone, the command is:
git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git
More information is available at https://www.mediawiki.org/wiki/Git
For more help, please visit the #mediawiki IRC channel on irc.freenode.net
or email the MediaWiki-l mailing list.
Patch to previous version (1.19.0beta1), without interface text:
Interface text changes:
I’m delighted to announce that Ken Snider is joining the Wikimedia
operations team. He will start as an international contractor working
remotely from Toronto, Canada on June 10, and will be visiting SF in
the week of June 17. We’re currently in the process of seeking work
authorization in the United States for the Director of TechOps role.
CT has graciously agreed to support the ops leadership transition
full-time through June, and part-time through July. We’ll be starting
the handover while Ken is working remotely.
A bit more about Ken: Ken was apparently genetically predisposed to
become a sysadmin since he joined one of Canada’s first large ISPs,
Primus, straight out of school in 1997 and helped build their
infrastructure til 2001. He then joined a startup called OpenCOLA in
2001 which was co-founded by Cory Doctorow and developed early P2P
precursors to tools like BitTorrent and Steam. It’s best known today
for the development of an open source (GPL’d) cola recipe which is
still in use (more than 150,000 cans sold, if Wikipedia is to be believed).
Ken got involved in one of Cory’s pet projects, BoingBoing.net (which
some of you may have heard of ;-)), and has been their sysadmin since
2003. After a stint from 2001-2005 at DataWire, Ken became Director of
Tech Ops at Federated Media, a role he held from 2005-2012.
Federated Media is an ad network that was founded to support high
traffic blogs and sites that want to stay independent of large
publishers, with a network that supports more than 1B requests/day.
One of the unusual challenges at FM was that the company grew through
acquisitions of various blogging and publishing networks. This led to
the challenge of integrating very heterogeneous operations and
engineering infrastructure, including multiple geographically
distributed ops teams and data-center locations. As DTO, Ken led these
efforts, such as OS standardization, development of a unified
deployment infrastructure, etc. Ken also ensured that the operations
group partnered effectively with the various engineering teams
developing site features and enhancements.
I want to again take this opportunity to thank CT Woo for his tireless
operations leadership since December 2010. I’d also like to thank
everyone who’s participated in the Director of TechOps search process.
Please join me in welcoming Ken to the Wikimedia Foundation and the
operations team.
VP of Engineering and Product Development, Wikimedia Foundation
The new version of git-review released today (1.22) includes a patch I
wrote that makes it possible to work against a single 'origin' remote. This
amounts to a workaround for git-review's tendency to frighten you into
thinking you're about to submit more patches than the ones you are working
on. It makes git-review more pleasant to work with, in my opinion.
To enable this behavior, you first need to upgrade to the latest version of
git-review, by running "pip install -U git-review". Then you need to create
a configuration file: either /etc/git-review/git-review.conf (system-wide)
or ~/.config/git-review/git-review.conf (user-specific).
The file should contain these two lines:
[gerrit]
defaultremote = origin
Once you've made the change, any new Gerrit repos you clone using an
authenticated URI will just work.
You'll need to perform an additional step to migrate existing repositories.
In each repository, run the following commands:
git remote set-url origin $(git config --get remote.gerrit.url)
git remote rm gerrit
git review -s
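If you want to verify the migration, running "git remote -v" afterwards
should list only the single origin remote.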
Hope you find this useful.
TL;DR: A few ideas follow on how we could possibly help legit editors
contribute from behind Tor proxies. I am just conversant enough with
the security problems to make unworkable suggestions ;-), so please
correct me, critique & suggest solutions, and perhaps volunteer to help.
The current situation:
We generally don't let anyone edit or upload from behind Tor; the
TorBlock extension stops them. One exception: a person can create an
account, accumulate lots of good edits, and then ask for an IP block
exemption, and then use that account to edit from behind Tor. This is
unappealing because then there's still a bunch of in-the-clear editing
that has to happen first, and because then site functionaries know that
the account is going to be making controversial edits (and could
possibly connect it to IPs in the future, right?). And right now
there's no way to truly *anonymously* contribute from behind Tor. As
for Tor users, I'm not sure how much editing from Tor -- vandalism or
legit -- is actually happening. (I hope for analytics on this, and thus
added it to https://www.mediawiki.org/wiki/Analytics/Dreams .) We know
at least that there are legitimate editors who would prefer to use Tor.
People have been talking about how to improve the situation for some
time -- see http://cryptome.info/wiki-no-tor.htm , among others. It'd be
nice if this could actually move forward.
I've floated this problem past Tor and privacy people, and here are a
few ideas:
1) Just use the existing mechanisms more leniently. Encourage the
communities (Wikimedia & Tor) to use
https://en.wikipedia.org/wiki/Wikipedia:Request_an_account (to get an
account from behind Tor) and to let more people get IP block exemptions
even before they've made any edits (< 30 people have gotten exemptions
on en.wp in 2012). Add encouraging "get an exempt account" language to
the "you're blocked because you're using Tor" messaging. Then if
there's an uptick in vandalism from Tor then they can just tighten up again.
2) Encourage people with closed proxies to re-vitalize
https://en.wikipedia.org/wiki/Wikipedia:WOCP . Problem: using closed
proxies is okay for people with some threat models but not others.
3) Look at Nymble - http://freehaven.net/anonbib/#oakland11-formalizing
and http://cgi.soic.indiana.edu/~kapadia/nymble/overview.php . It would
allow Wikimedia to distance itself from knowing people's identities, but
still allow admins to revoke permissions if people acted up. The user
shows a real identity, gets a token, and exchanges that token over Tor
for an account. If the user abuses the site, Wikimedia site admins can
blacklist the user without ever being able to learn who they were or
what other edits they made. More: https://cs.uwaterloo.ca/~iang/ . Ian
Goldberg's, Nick Hopper's, and Apu Kapadia's groups are all working on
Nymble or its derivatives. It's not ready for production yet, I bet,
but if someone wanted a Big Project....
3a) A token authorization system (perhaps a MediaWiki extension) where
the server blindly signs a token, and then the user can use that token
to bypass the Tor blocks. (Tyler mentioned he saw this somewhere in a
Bugzilla suggestion; I haven't found it.)
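For the curious, the primitive behind (3a) is blind signing. Here is a
rough, illustrative PHP sketch of textbook RSA blinding (assuming the GMP
extension; no padding or hashing, and none of these function names come
from MediaWiki -- a real system would need a full blind-signature
protocol):

    // $n and $e are the server's public RSA modulus and exponent;
    // $d is its private exponent. All values are GMP numbers.

    // User: blind the token $m with a random factor $r before sending it.
    function blindToken( $m, $r, $e, $n ) {
        // m' = m * r^e mod n
        return gmp_mod( gmp_mul( $m, gmp_powm( $r, $e, $n ) ), $n );
    }

    // Server: sign the blinded value without ever learning $m.
    function signBlinded( $blinded, $d, $n ) {
        // s' = (m')^d = m^d * r mod n
        return gmp_powm( $blinded, $d, $n );
    }

    // User: remove the blinding factor; the result is a valid RSA
    // signature on $m, which the server has never seen.
    function unblind( $blindSig, $r, $n ) {
        // s = s' * r^{-1} = m^d mod n
        return gmp_mod( gmp_mul( $blindSig, gmp_invert( $r, $n ) ), $n );
    }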
4) Allow more users the IP block exemption, possibly even automatically
after a certain number of unreverted edits, but with some kind of
FlaggedRevs integration; Tor users can edit but their changes have to be
reviewed before going live. We could combine this with (3); Nymble
administrators or token-issuers could pledge to review edits coming from
Tor. But that latter idea sounds like a lot of social infrastructure to
set up and maintain.
Thoughts? Are any of you interested in working on this problem? #tor on
the OFTC IRC server is full of people who'd be interested in talking about
this.
Engineering Community Manager
Based on many ideas that were put forth, I would like to seek comments on
this ZERO design. This HTML will be rendered for both M and ZERO subdomains
if Varnish detects that the request is coming from a Zero partner. M and
ZERO will be identical except for the images - ZERO substitutes images with
links to the File:xxx namespace through a redirector.
When the redirector is accessed from users' devices, it will load the
carrier configuration and replace the link with a local one, and will
either silently 301-redirect or show confirmation HTML. Links to images on
ZERO.wiki and all external links are handled in a similar way.
* The banner is an ESI link to */w/api.php?action=zero&banner=250-99*,
which returns an HTML <div> blob of the banner. (Not sure if the banner
ID should be part of the URL.)
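To make the ESI flow concrete, here is a hypothetical sketch of such an
endpoint; none of these names come from the actual Zero extension
(ZeroBanner::getHtml() is an assumed helper), and the point is only that
the response is a bare HTML fragment rather than a skinned page:

    // Serve the carrier banner as a raw <div> blob for Varnish to splice
    // into the page via ESI.
    function serveZeroBanner( $bannerId ) {
        header( 'Content-Type: text/html; charset=utf-8' );
        // Short TTL so banner changes propagate through the caches.
        header( 'Cache-Control: s-maxage=300' );
        echo Html::rawElement(
            'div',
            array( 'id' => 'zero-banner', 'data-banner' => $bannerId ),
            ZeroBanner::getHtml( $bannerId ) // assumed carrier lookup
        );
    }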
Expected cache fragmentation for each wiki page:
* per subdomain (M|ZERO)
* if M - per "isZeroCarrier" (TRUE|FALSE). if ZERO - always TRUE.
Three variants is much better than one per carrier ID times two per
subdomain.
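One hypothetical way to keep it to three variants (this is a sketch under
the assumption that Varnish classifies requests before they reach PHP, not
actual Zero code) is to vary the cache on a normalized boolean flag rather
than the carrier ID:

    // Assume Varnish sets two request headers, X-Subdomain (M|ZERO) and
    // X-Is-Zero-Carrier (TRUE|FALSE). The backend then only tells the
    // cache to vary on those, so each page fragments into at most
    // M/FALSE, M/TRUE and ZERO/TRUE.
    header( 'Vary: X-Subdomain, X-Is-Zero-Carrier' );
    $isZeroCarrier = isset( $_SERVER['HTTP_X_IS_ZERO_CARRIER'] )
        && $_SERVER['HTTP_X_IS_ZERO_CARRIER'] === 'TRUE';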
The redirector is a Special:Zero page, but if speed is an issue, it could
be an API call (which seems to load much faster). The API call would
redirect to the target, or could redirect to the special page for
confirmation rendering, or output HTML itself (no skin support, but avoids
an extra redirect), depending on what our target platforms support now or
soon.
I wrote a tool that will import bugs from Bugzilla into Mingle and/or
Trello (two project management tools used by some teams at the Wikimedia
Foundation). The mobile web team was finding it difficult to keep track of
two separate tools - one for new feature development, the other for
tracking bugs - so Bingle helps bridge the gap and allows us to focus on
one tool. This has had the side effect of keeping visibility of reported
bugs high and has made it easier for us to quickly prioritize incoming bugs
against existing work, and quickly respond to open issues.
You can find the code and some rudimentary usage instructions here:
I hacked this together rather quickly - expedience was my goal rather than
perfection, so it's not well documented, a little quirky, and there's a lot
of room for improvement. I've been sitting on it for a while, hoping to
make improvements before announcing it, but I have not found the time to
make the changes I would like (e.g. for it to use the Bugzilla API rather
than Bugzilla atom feeds). So, I invite anyone interested and willing to
fork it, and pitch in and help make it awesome :)
Software Engineer, Mobile
As multilingual content grows, interlanguage links become longer on
Wikipedia articles. Articles such as "Barack Obama" or "Sun" have more than
200 links, and that becomes a problem for users who often switch among
languages.
As part of the future plans for the Universal Language Selector, we were
considering the following:
- Show only a short list of the relevant languages for the user, based on
geo-IP, previous choices and browser settings of the current user. The
language the user is looking for will be there most of the time.
- Include a "more" option to access the rest of the languages for which
the content exists with an indicator of the number of languages.
- Provide a list of the rest of the languages that users can easily scan
(grouped by script and region so that alphabetical ordering is possible)
and search (allowing users to search for a language name in another
language, using ISO codes, or even making typos; a rough sketch of such
matching follows below).
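As a toy illustration only (not ULS code; the language data and matching
thresholds are assumptions), lenient matching of a query against language
names and ISO codes might look like this:

    // Match a query against ISO codes and language names in several
    // languages, accepting prefixes and small typos.
    function matchLanguages( $query, array $languages ) {
        $query = strtolower( trim( $query ) );
        $matches = array();
        foreach ( $languages as $code => $names ) {
            foreach ( array_merge( array( $code ), $names ) as $candidate ) {
                $candidate = strtolower( $candidate );
                // Accept exact prefixes, or names within edit distance 2.
                // (levenshtein() is byte-based; real code would need a
                // multibyte-aware comparison.)
                if ( strpos( $candidate, $query ) === 0
                    || levenshtein( $query, $candidate ) <= 2
                ) {
                    $matches[$code] = true;
                    break;
                }
            }
        }
        return array_keys( $matches );
    }

    // matchLanguages( 'svedish', array( 'sv' => array( 'Swedish', 'svenska' ) ) )
    // returns array( 'sv' ).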
I have created a prototype <http://pauginer.github.io/prototype-uls/#lisa> to
illustrate the idea. Since this is not connected to the MediaWiki backend,
it lacks the advanced capabilities described above, but you can get the idea.
If you are interested in the missing parts, you can check the flexible
search and the list of likely languages ("common languages" section) in the
language selector used at http://translatewiki.net/ , which is connected to
the backend.
As part of the testing process for the ULS language settings, I included a
task to also test the compact interlanguage designs. Users seem to
understand their use, but I wanted to get some feedback for changes
affecting such an important feature.
Please let me know if you see any possible concerns with this approach.
In December I mentioned the idea of having a "PATCH_AVAILABLE" or
"PATCH_TO_REVIEW" status in Bugzilla, and that we should re-evaluate
the idea once we have automatic notifications from Gerrit into Bugzilla
in place. This is now the case.
From the Amsterdam Hackathon I know that some developers would like to
filter on bug reports that have or don't have a patch in Gerrit, and
easier finding of bug reports with a corresponding patch && lack of
recent changes might provide another entry point for new developers
(pick up the existing patch and finish it).
Hence I propose
* to remove the manually set and error-prone Bugzilla keyword
"patch-in-gerrit": every bug on its way to getting RESOLVED FIXED
has to pass this stage anyway, so a status feels more natural;
* to make the "Gerrit Notification Bot" automatically change the
bug report status to "PATCH_AVAILABLE"/"PATCH_TO_REVIEW" in
Bugzilla when a patch for that bug report has been committed
(not: merged) to Gerrit.
PS: Making the Gerrit notification bot automatically close bug reports
in Bugzilla after merging a patch in Gerrit, or differentiating in
Bugzilla between "RESOLVED FIXED" (fix merged) and "RELEASED" (fix
deployed on the Wikimedia wikisites) are also interesting topics to
discuss at some point, but not in this thread. One step at a time.
Andre Klapper | Wikimedia Bugwrangler
The developer team at Wikimedia is making some changes to how accounts
work, as part of our on-going efforts to provide new and better tools
for our users (like cross-wiki notifications). These changes will mean
users have the same account name everywhere, will let us give you new
features that will help you edit & discuss better, and will allow more
flexible user permissions for tools. One of the pre-conditions for
this is that user accounts will now have to be unique across all 900
Wikimedia wikis.
Unfortunately, some accounts are currently not unique across all our
wikis, but instead clash with other users who have the same account
name. To make sure that all of these users can use Wikimedia's wikis
in the future, we will be renaming a number of accounts to have "~" and
the name of their wiki added to the end of their accounts' name. This
change will take place on or around 27 May. For example, a user called
"Example" on the Swedish Wiktionary who will be renamed would become
"Example~svwiktionary".
All accounts will still work as before, and will continue to be
credited for all their edits made so far. However, users with renamed
accounts (whom we will be contacting individually) will have to use
the new account name when they log in.
It will now only be possible for accounts to be renamed globally; the
RenameUser tool will no longer work on a local basis - since all
accounts must be globally unique - therefore it will be withdrawn from
bureaucrats' tool sets. It will still be possible for users to ask on
Meta for their account to be renamed further, if they do not like
their new user name, once this takes place.
A copy of this note is posted to Meta for translation. Please
forward this to your local communities, and help get it translated.
Individuals who are affected will be notified via talk page and e-mail
notices nearer the time.
 - https://meta.wikimedia.org/wiki/Help:Unified_login
 - https://meta.wikimedia.org/wiki/Single_User_Login_finalisation_announcement
James D. Forrester
Product Manager, VisualEditor
Wikimedia Foundation, Inc.
jforrester(a)wikimedia.org | @jdforrester