The XML database dumps have been missing all through May, apparently
because of a memory leak that is being worked on, as described.
However, that information doesn't reach the person who wants to
download a fresh dump and looks here,
I think it should be possible to set a regular schedule for
when these dumps are produced, e.g. once a month or
once every two months, and treat any delay as a bug. The
process that produces them has been halted by errors many times
in the past, and even when it runs as intended, the interval
is unpredictable. Currently, when there is a bug, all dumps are
halted, i.e. much delayed. For a user of the dumps, this is
extremely frustrating. With proper release management, it
should be possible to run the old version of the process
until the new version has been tested, first on some smaller
wikis and gradually on the larger ones.
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se
This drag and drop from a CSV, and the other mentions of VE neatness, would make a great blog post. I would love to see a post about being productive in VisualEditor: tips and tricks, as it were.
package management usage in MediaWiki. I am basically talking about a JS
equivalent of what we have with Composer for PHP.
The "data-values/value-view" package defines
"jquery.event.special.eachchange.js".
Now, recently I needed the same functionality in one of my extensions, so
I just copied it over.
I know that this is the worst way one could do this, but as far as I can
see, we don't have much of a choice right now. Here are the alternative
options I can see:
Moving "jquery.event.special.eachchange.js" out of the
"data-values/value-view" package into its own "WMDE/jquery-eachchange"
package ...
1. ... and using it in my extension via Composer.
+ pro: two or more extensions or other packages requiring this package
will still result in only one MW-wide installation.
- con: requires MW-specific code, which is actually not related to the
MW-independent package, to feed ResourceLoader.
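As a sketch, option 1 would mean an extension declaring the dependency in its composer.json; the package name below is the hypothetical "WMDE/jquery-eachchange" from above, which does not exist on packagist.org:

```json
{
	"require": {
		"wmde/jquery-eachchange": "~1.0"
	}
}
```

Composer would then install it once per MediaWiki installation under vendor/, but the extension would still need MW-specific glue to register the file with ResourceLoader, which is exactly the con above.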
2. ... and having a build step in other packages using the package, pulling
"WMDE/jquery-eachchange" somewhere into the file structure of the
packages/extensions using it.
+ pro: no need to abuse Composer; we can use "npm", "Bower" or any
other arbitrary JS package manager here.
- con: got to tell ResourceLoader somehow... (didn't think so much about
this yet)
- con: if more than one extension or other package requires this package,
we still end up with the same code twice or more in one MW installation.
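For option 2 with npm, an extension's package.json might look roughly like this; the repository reference and the target path under resources/lib/ are assumptions for illustration, since no such npm package exists:

```json
{
	"name": "my-extension",
	"private": true,
	"dependencies": {
		"jquery-eachchange": "git+https://github.com/WMDE/jquery-eachchange.git"
	},
	"scripts": {
		"build": "cp node_modules/jquery-eachchange/jquery.event.special.eachchange.js resources/lib/"
	}
}
```

The copied file in resources/lib/ would then be committed or built per extension, which is how the same code can end up duplicated across extensions in one wiki.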
3. Combining 1 and 2: start with 2, using a JS package manager, then go
to 1, creating a Composer package and pulling the "WMDE/jquery-eachchange"
package in via some build script.
+ pro: the two pros from 1 and 2 combined.
- con: still got to tell ResourceLoader somehow...
- con: overhead; we now create two packages, where the Composer one is
just a bridge to the MW world, still polluting packagist.org. It is still
kind of ugly and more effort for publishing a package, and therefore
potentially scares programmers away from doing so, since they've got
better things to do than work that could be automated.
I have not seen approaches 2 and 3 used anywhere yet, though I could
imagine that the VisualEditor team has done something like that.
Approach 1 is the way the "data-values/value-view" package itself is being
handled. That package should actually be an MW-independent pure JS
package, but right now it contains MW-specific code and uses Composer for
distribution.
There is still another option, but it would have to be properly implemented
first:
4. Choose one native JS package manager for now and go with it (perhaps
add support for others later). Integrate it properly with MW (ResourceLoader
to begin with), document how to use it, and finally distribute JS code
coming from the MW world, but useful for other projects, in a way that it
can actually be used in a non-MW context.
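Whichever package manager is chosen, the ResourceLoader side could, as a sketch, simply point at wherever that manager puts the files. The module name and the node_modules/ path below are assumptions for illustration, not existing code:

```php
// Hypothetical registration in an extension's setup file; the path assumes
// a JS package manager installed the package under node_modules/.
$wgResourceModules['jquery.event.special.eachchange'] = array(
	'scripts' => array(
		'node_modules/jquery-eachchange/jquery.event.special.eachchange.js',
	),
	'dependencies' => array( 'jquery' ),
	'localBasePath' => __DIR__,
	'remoteExtensionPath' => 'MyExtension',
);
```

Proper integration would mean ResourceLoader (or a thin adapter) generating such definitions from the package manager's own metadata, so extensions don't have to hand-write them.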
This has already been bugging me when working on Wikidata. Now I'd like to
reuse some of the code I have written there without spending hours and
hours on option 3, because there should be support for option 4 sooner
rather than later.
So I am wondering: does anyone have any thoughts, any alternatives perhaps,
or is there a roadmap for anything like the option 4 I have outlined?
tl;dr: trying to make mw-core pass mw-codesniffer; expect large patches on
Gerrit, and please help.
A lot of work has been done on MediaWiki codesniffer (the
PHP_CodeSniffer standard for MediaWiki) over the last few months, and this
might be a good time to get core to pass our coding conventions.
Work on this has already started at T102609
<https://phabricator.wikimedia.org/T102609>. There were two primary reasons
to send this email:
1. This is a pretty big task and any help would be very welcome! If we can
get phpcs to pass against core's master, it would make every patch
contributor's work a little easier.
2. Work is being organized as subtasks of T102609 and has been divided on
the basis of sniffs. Because of this, every patch is going to change a lot
of unrelated files, and a lot of reviewers are going to get added on Gerrit.
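For anyone wanting to pick up a subtask, running the sniffs locally looks roughly like this (assuming a Composer-based core checkout; mediawiki/mediawiki-codesniffer is the package name on packagist.org):

```shell
# Install the MediaWiki sniffs as a development dependency.
composer require --dev mediawiki/mediawiki-codesniffer

# Run phpcs over the tree: -p prints progress, -s names the sniff
# responsible for each violation (useful for picking a subtask).
vendor/bin/phpcs -p -s --standard=vendor/mediawiki/mediawiki-codesniffer/MediaWiki .

# phpcbf can auto-fix many of the purely mechanical violations.
vendor/bin/phpcbf --standard=vendor/mediawiki/mediawiki-codesniffer/MediaWiki .
```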
Let's make this happen!
Vivek Ghaisas (polybuildr)
Are there any stable APIs for an application to get a parse tree in
machine-readable format, manipulate it and send the result back without
I'm sorry if this question doesn't make any sense.
There's a pretty hilarious American police procedural TV show from 2015
called "CSI: Cyber", featuring mostly cybercrime. Obviously they have to
dredge up snippets of code from places for screenshots on the show.
Episode 4 happened to include a tidbit from MediaWiki 1.25/wmf3. Supposedly
the code was a hack to make your printer blow up.
Original lulz and screenshots via
We haven't announced this survey on wikitech-l yet, so if you run a wiki
outside of those run by the WMF, please take the time to fill out the
survey at <http://hexm.de/MWSurvey>. More information about the survey
and its purpose can be found at
That said, we received a report (T104010) from the Analytics team today
that most downloads of MediaWiki are coming from China. The report
indicated there were twice as many downloads from China as the U.S. in
I don't know of a good way to reach these users -- I wasn't even aware
of them until today. Through our outreach efforts so far we've
uncovered a number of private wikis that we wouldn't have been able to
discover otherwise, but I'd like to extend our reach even further.
Can we add a link to the survey to the top of
https://www.mediawiki.org/wiki/Download until the end of July?
Mark A. Hershberger