I'm happy to announce the availability of the second beta release of the
new MediaWiki 1.19 release series.
Please try it out and let us know what you think. Don't run it on any
wikis that you really care about, unless you are both very brave and
very confident in your MediaWiki administration skills.
MediaWiki 1.19 is a large release that contains many new features and
bug fixes. This is a summary of the major changes of interest to users.
You can consult the RELEASE-NOTES-1.19 file for the full list of changes
in this version.
Five security issues were discovered.
It was discovered that the API had a cross-site request forgery (CSRF)
vulnerability in the block/unblock modules. It was possible for a user
account with block privileges to block or unblock another user without
providing a token.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34212
It was discovered that the resource loader can leak certain kinds of private
data across domain origin boundaries, by providing the data as an executable
JavaScript file. In a default configuration, this includes the leaking of CSRF
protection tokens. This allows compromise of the wiki's user accounts, say by
changing the user's email address and then requesting a password reset.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34907
Jan Schejbal of Hatforce.com discovered a cross-site request forgery (CSRF)
vulnerability in Special:Upload. Modern browsers (since at least as early as
December 2010) are able to post file uploads without user interaction,
violating previous security assumptions within MediaWiki.
Depending on the wiki's configuration, this vulnerability could lead to
compromise, especially on private wikis where the set of allowed file types
is broader than on public wikis. Note that CSRF allows an external website
to compromise a wiki even if the wiki is behind a firewall.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35317
George Argyros and Aggelos Kiayias reported that the method used to generate
password reset tokens is not sufficiently secure. Instead, we now use various
secure random number generators, depending on what is available on the
platform. Windows users are strongly advised to install either the openssl
extension or the mcrypt extension for PHP so that MediaWiki can take advantage
of the cryptographic random number facility provided by Windows.
Any extension developers using mt_rand() to generate random numbers in
contexts where security is required are encouraged to instead make use of the
MWCryptRand class introduced with this release.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35078
A long-standing bug in the wikitext parser (bug 22555) was discovered to have
security implications. In the presence of the popular CharInsert extension, it
leads to cross-site scripting (XSS). XSS may be possible with other extensions,
or perhaps even with the MediaWiki core alone, although this is not confirmed at
this time. A denial-of-service attack (infinite loop) is also possible
regardless of configuration.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35315
MediaWiki 1.19 brings the usual host of various bugfixes and new features.
A comprehensive list of what's new is in the release notes.
* Bumped MySQL version requirement to 5.0.2.
* Disable the partial HTML and MathML rendering options for Math,
and render as PNG by default. MathML mode was so incomplete most
people thought it simply didn't work.
* New skins/common/*.css files usable by skins instead of having to copy
generic styles from MonoBook or Vector's CSS.
* The default user signature now contains a talk link in addition to the
user page link.
* Searching for blocked usernames in the block log is now clearer.
* Better timezone recognition in user preferences.
* Extensions can now participate in the extraction of titles from URL paths.
* The command-line installer supports various RDBMSes better.
* The interwiki links table can now be accessed also when the interwiki
cache is used (used in the API and the Interwiki extension).
* More gender support (for instance in user lists).
* Added new language: Canadian English.
* Language converter improved, e.g. it now works depending on the page
content language.
* Time and number-formatting magic words also now depend on the page
content language.
* Bidirectional support further improved after 1.18.
Full release notes:
Coinciding with these security releases, the MediaWiki source code has
moved from SVN (at https://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3)
to Git (https://gerrit.wikimedia.org/gitweb/mediawiki/core.git). So the
commits for these releases will not be appearing in our SVN repository. If
you use SVN checkouts of MediaWiki for version control, you need to migrate
these to Git.
If you are using tarballs, there should be no change in the process for you.
Please note that any WMF-deployed extensions have also been migrated to Git,
along with some other non-WMF-maintained ones.
Please bear with us; some of the Git-related links for this release may not
work yet, but should later on.
To do a simple Git clone, the command is:
git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git
More information is available at https://www.mediawiki.org/wiki/Git
For more help, please visit the #mediawiki IRC channel on freenode.net
(irc://irc.freenode.net/mediawiki) or email the MediaWiki-l mailing list.
Patch to previous version (1.19.0beta1), without interface text:
Interface text changes:
TL;DR: A few ideas follow on how we could possibly help legit editors
contribute from behind Tor proxies. I am just conversant enough with
the security problems to make unworkable suggestions ;-), so please
correct me, critique & suggest solutions, and perhaps volunteer to help.
The current situation:
We generally don't let anyone edit or upload from behind Tor; the
TorBlock extension stops them. One exception: a person can create an
account, accumulate lots of good edits, and then ask for an IP block
exemption, and then use that account to edit from behind Tor. This is
unappealing because then there's still a bunch of in-the-clear editing
that has to happen first, and because then site functionaries know that
the account is going to be making controversial edits (and could
possibly connect it to IPs in the future, right?). And right now
there's no way to truly *anonymously* contribute from behind Tor.
I'm not sure how much editing from Tor -- vandalism or
legit -- is actually happening. (I hope for analytics on this and thus
added it to https://www.mediawiki.org/wiki/Analytics/Dreams .) We know
at least that there are legitimate editors who would prefer to use Tor.
People have been talking about how to improve the situation for some
time -- see http://cryptome.info/wiki-no-tor.htm .
It'd be nice if it could actually move forward.
I've floated this problem past Tor and privacy people, and here are a
few ideas:
1) Just use the existing mechanisms more leniently. Encourage the
communities (Wikimedia & Tor) to use
https://en.wikipedia.org/wiki/Wikipedia:Request_an_account (to get an
account from behind Tor) and to let more people get IP block exemptions
even before they've made any edits (< 30 people have gotten exemptions
on en.wp in 2012). Add encouraging "get an exempt account" language to
the "you're blocked because you're using Tor" messaging. If there's an
uptick in vandalism from Tor, they can just tighten up again.
2) Encourage people with closed proxies to re-vitalize
https://en.wikipedia.org/wiki/Wikipedia:WOCP . Problem: using closed
proxies is okay for people with some threat models but not others.
3) Look at Nymble - http://freehaven.net/anonbib/#oakland11-formalizing
and http://cgi.soic.indiana.edu/~kapadia/nymble/overview.php . It would
allow Wikimedia to distance itself from knowing people's identities, but
still allow admins to revoke permissions if people acted up. The user
shows a real identity, gets a token, and exchanges that token over Tor
for an account. If the user abuses the site, Wikimedia site admins can
blacklist the user without ever being able to learn who they were or
what other edits they did. More: https://cs.uwaterloo.ca/~iang/ Ian
Goldberg's, Nick Hopper's, and Apu Kapadia's groups are all working on
Nymble or its derivatives. It's not ready for production yet, I bet,
but if someone wanted a Big Project....
3a) A token authorization system (perhaps a MediaWiki extension) where
the server blindly signs a token, and then the user can use that token
to bypass the Tor blocks. (Tyler mentioned he saw this somewhere in a
Bugzilla suggestion; I haven't found it.)
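The blind-signing idea in (3a) can be made concrete with a textbook RSA blind signature. This toy sketch (tiny primes, illustration only, definitely not production crypto and not Nymble itself) shows how the server can sign a token without ever seeing it, so the signed token cannot be linked back to the requester:

```javascript
// Toy RSA blind signature in the spirit of (3a): the server signs a token
// without seeing it, so it cannot link the token to the requester.

function modPow(base, exp, mod) {
  // Square-and-multiply modular exponentiation for BigInt.
  let result = 1n, b = base % mod, x = exp;
  while (x > 0n) {
    if (x & 1n) result = (result * b) % mod;
    b = (b * b) % mod;
    x >>= 1n;
  }
  return result;
}

function modInv(a, m) {
  // Extended Euclid: returns a^-1 mod m (assumes gcd(a, m) = 1).
  let [oldR, r] = [a % m, m], [oldS, s] = [1n, 0n];
  while (r !== 0n) {
    const q = oldR / r;
    [oldR, r] = [r, oldR - q * r];
    [oldS, s] = [s, oldS - q * s];
  }
  return ((oldS % m) + m) % m;
}

// Server key pair (toy sizes: p=61, q=53).
const n = 3233n, e = 17n, d = 2753n;

// 1. User blinds token m with a random factor r before sending it.
const m = 42n, r = 7n;
const blinded = (m * modPow(r, e, n)) % n;

// 2. Server signs the blinded value without learning m.
const blindSig = modPow(blinded, d, n);

// 3. User unblinds; sig is now a valid signature on m.
const sig = (blindSig * modInv(r, n)) % n;

// Anyone can verify the token later: modPow(sig, e, n) recovers m,
// yet the server never saw m or sig during issuance.
```

On abuse, the site could blacklist the specific token rather than an identity, which is the property the Nymble line of work formalizes.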
4) Allow more users the IP block exemption, possibly even automatically
after a certain number of unreverted edits, but with some kind of
FlaggedRevs integration; Tor users can edit but their changes have to be
reviewed before going live. We could combine this with (3); Nymble
administrators or token-issuers could pledge to review edits coming from
Tor. But that latter idea sounds like a lot of social infrastructure to
set up and maintain.
Thoughts? Are any of you interested in working on this problem? #tor on
the OFTC IRC server is full of people who'd be interested in talking
about this.
Engineering Community Manager
How to load up high-resolution imagery on high-density displays has been an
open question for a while; we've wanted this for the mobile web site since
the Nexus One and Droid brought 1.5x, and the iPhone 4 brought 2.0x density
displays to the mobile world a couple years back.
More recently, tablets and a few laptops are bringing 1.5x and 2.0x density
displays too, such as the new Retina iPad and MacBook Pro.
A properly responsive site should be able to detect when it's running on
such a display and load higher-density image assets automatically...
Here's my first stab:
* adds $wgResponsiveImages setting, defaulting to true, to enable the feature
* adds jquery.hidpi plugin to check window.devicePixelRatio and replace
images with data-src-1-5 or data-src-2-0 depending on the ratio
* adds mediawiki.hidpi RL script to trigger hidpi loads after main images
have loaded
* renders images from wiki image & thumb links at 1.5x and 2.0x and
includes data-src-1-5 and data-src-2-0 attributes with the targets
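The core of the swap described above is a small decision function. A sketch of that logic (attribute names are the ones from the bullets; the exact jquery.hidpi plugin API, and the rounding thresholds, are my assumptions):

```javascript
// Pick the best image source for the current devicePixelRatio, using the
// data-src-1-5 / data-src-2-0 attributes described above. The thresholds
// (>= 2, > 1) are an assumption about how the plugin rounds the ratio.
function pickHidpiSource(attrs, ratio) {
  if (ratio >= 2 && attrs['data-src-2-0']) return attrs['data-src-2-0'];
  if (ratio > 1 && attrs['data-src-1-5']) return attrs['data-src-1-5'];
  return attrs['src'];
}
```

In the plugin this would run over each img after the initial load, writing the chosen URL back into src.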
Note that this is a work in progress. There are places that output their
imgs differently, where this doesn't work yet. If moving from a low-DPI to
a high-DPI screen on a MacBook Pro Retina display, you won't see images
load until you reload.
Confirmed basic images and thumbs in wikitext appear to work in Safari 6 on
a MacBook Pro Retina display. (Should work in Chrome as well.)
The same code loaded on a MobileFrontend display should also work, but I
have not yet attempted that.
Note this does *not* attempt to use native SVGs, which is another potential
tactic for improving display on high-density displays and zoomed windows.
This loads higher-resolution raster images, including rasterized SVGs.
There may be loads of bugs; this is midnight hacking code and I make no
guarantees of suitability for any purpose. ;)
i'm hopeful this is the appropriate venue for this topic - i recently
had occasion to visit #mediawiki on freenode, looking for help. i found
myself a bit frustrated by the amount of bot activity there and wondered
if there might be value in some consideration for this. it seems to
frequently drown out/dilute those asking for help, which can be a bit
discouraging/frustrating. additionally, from the perspective of those
who might help [based on my experience in this role in other channels],
constant activity can sometimes engender disinterest [e.g. the irc
client shows activity in the channel, but i'm less inclined to look as
it's probably just a bot].
to offer one possibility - i know there are a number of mediawiki and/or
wikimedia related channels - might there be one in which bot activity
might be better suited, in the context of less contention between the
two audiences [those seeking help vs. those interested in development,
etc]? one nomenclature convention that seems to be at least somewhat of
a defacto standard is #project for general help, and #project-dev[el]
for development topics. a few examples of this i've seen are android,
libreoffice, python, and asterisk. adding yet another channel to this
list might not be terribly welcome, but maybe the distinction would be
worth the addition?
as i'm writing this, i see another thread has begun wrt freenode, and i
also see a bug filed that relates at least to some degree
[https://bugzilla.wikimedia.org/show_bug.cgi?id=35427], so i may just be
repeating an existing sentiment, but i wanted to at least offer a brief note.
Is there already a schedule to update jQuery and jQuery UI to 1.9, or are
there problems at the moment? I want to use the new tooltip widget of
jQuery UI.
A number of people I know of have ideas and aspirations pertaining to a
DevOps-style deployment process, a.k.a. Continuous Deployment. In recent
times a number of pieces of such a system have become functional: Zuul,
Jenkins enhancements for tests, automated acceptance tests, etc.
But looking at mediawiki.org I don't see any sort of central discussion of
overall approach/design/process for DevOps/Continuous Deployment.
Is it time to start such a discussion? Or is this premature?
On Fri, Jun 15, 2012 at 8:48 AM, Sumana Harihareswara wrote:
> If you merge into mediawiki/core.git, your change is considered safe for
> inclusion in a wmf branch. The wmf branch is just branched out of
> master and then deployed. We don't review it again. Because we're
> deploying more frequently to WMF sites, the code review for merging into
> MediaWiki's core.git needs to be more like deployment/shell-level
> review, and so we gave merge access to people who already had deployment
> access. We have since added some more people. The current list:
Let me elaborate on this. As unclear as our process is for giving
access, it's even less clear what our policy is for taking it away.
If we can settle on a policy for taking access away/suspending access,
it'll make it much easier to loosen up about giving access.
Here's the situation we want to avoid: we give access to someone who
probably shouldn't have it. They continually introduce deployment
blockers into the code, making us need to slow down our frequent
deployment process. Two hour deploy windows become six hour deploy
windows as we need time to fix up breakage introduced during the
window. Even with the group we have, there are times where things
that really shouldn't slip through do. It's manageable now, but
adding more people is going to multiply this problem as we get back
into a situation where poorly conceived changes become part of core.
We haven't had a culture of making a big deal about the case when
someone introduces a breaking change or does something that brings the
db to its knees or introduces a massive security hole or whatever.
That means that if the situation were to arise that we needed to
revoke someone's access, we have to wait until it gets egregious and
awful, and even then the person is likely to be shocked that their
rights are being revoked (if we even do it then). To be less
conservative about giving access, we also need to figure out how to be
less conservative about taking it away. We also want to be as objective
about it as we reasonably can. It's always going to be somewhat
subjective, and we don't want to completely eliminate the role of
human judgment.
It would also be nice if we didn't have to resort to the nuclear
option to get the point across. One low-stakes way we can use to make
sure people are more careful is to have some sort of rotating "oops"
award. At one former job I had, we had a Ghostbusters Stay Puft doll
named "Buster" that was handed out when someone broke the build that
they had to prominently display in their office. At another job, it
was a pair of Shrek ears that people had to wear when they messed
something up in production. In both cases, it was something you had
to wear until someone else came along. Perhaps we should institute
something similar (maybe as simple as asking people to append "OOPS"
to their IRC nicks when they botch something).
Unlike in previous years, the big European Hackathon won't be in Berlin
but in Amsterdam. We're aiming to do the hackathon in May 2013, with a
preference for the weekend of Saturday the 25th. To make sure this is a
good weekend I've set up a straw poll at
Please fill it out so we can finalize the date!
PS: Please forward to any relevant lists I might have missed.
Sorry, I've replied to Sumana directly instead of the mailing list. So
now duplicating into the mailing list.
Sumana Harihareswara wrote on 2012-12-19 22:30:
> Try these tips:
Sumana, it's all very good but:
1) I think it's not so comfortable to personally push other developers
when adding them as reviewers... And I don't know whom to add as the
reviewer, so I just choose randomly. But what if that person doesn't want
to review that extension? For example, what if he is already very busy
working on the MediaWiki _core_, and I ask him to review a trivial change?
2) Who can verify changes in extensions? There is no CI. So, are the
people who can verify changes and the people who can put +2 the same
people? But that again leads to short-circuiting all the work to the
"core" people, and aren't they already busy? (I assume they are, as they
don't review all the changes.)
3) As a solution, I think it would be good if - at least in
not-so-important-as-the-core extensions - changes were merged
automatically after getting, for example, two "+1"s... Or will you end up
with changes reviewed but not merged by anyone? And also, maybe it would
be good if the system automatically added some reviewers - randomly
or based on some "ownership" rules...
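The auto-merge rule suggested in (3) is simple to state as code. A minimal sketch (all names hypothetical; this is not Gerrit's actual data model or API):

```javascript
// Sketch of the proposed policy: merge automatically once a change in a
// non-core extension has collected two "+1" votes and no negative votes.
// `change` is a hypothetical record, not a real Gerrit object.
function shouldAutoMerge(change) {
  const plusOnes = change.votes.filter(v => v === 1).length;
  const hasVeto = change.votes.some(v => v < 0);
  return !change.isCore && plusOnes >= 2 && !hasVeto;
}
```

Restricting the rule to non-core repositories keeps the "core" reviewers as the only +2 path for MediaWiki itself.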