I ran into an issue trying to save MediaWiki:Common.js: I get a "page
not available" error (index.php?title=MediaWiki:Common.js&action=submit).
MediaWiki:Common.css saves fine.
Any idea what I'm doing wrong?
The XML database dumps are missing all through May, apparently
because of a memory leak that is being worked on, as described
However, that information doesn't reach the person who wants to
download a fresh dump and looks here,
I think it should be possible to make a regular schedule for
when these dumps should be produced, e.g. once each month or
once every second month, and treat any delay as a bug. The
process to produce them has been halted by errors many times
in the past, and even when it runs as intended the interval
is unpredictable. Now, when there is a bug, all dumps are
halted, i.e. much delayed. For a user of the dumps, this is
extremely frustrating. With proper release management, it
should be possible to run the old version of the process
until the new version has been tested, first on some smaller
wikis, and gradually on the larger ones.
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se
This drag-and-drop from a CSV and the other mentions of VE neatness would make a great blog post. I would love to see a post about being productive in VisualEditor: tips and tricks, as it were.
I see that there's an active workboard in Phabricator at
https://phabricator.wikimedia.org/project/board/225/ for CAPTCHA issues.
Returning to a subject that has been discussed several times before: the
last I heard is that our current CAPTCHAs do block some spambots, but they
also present problems for humans and aren't especially difficult for more
sophisticated spambots to solve. Can someone share a general update on
what's happening with regard to improving usability for humans and
increasing the difficulty for bots? I'm particularly concerned about the
former issue, since CAPTCHAs might be filtering out some good-faith
humans.
Regarding JS package management usage in MediaWiki: I am basically
talking about an equivalent for JS to what we have with Composer for
PHP.
The "data-values/value-view" package defines
"jquery.event.special.eachchange.js".
Now, recently I needed the same functionality in one of my extensions, so
I just copied it over. 
I know that this is the worst way one could do this, but as far as I can
see we don't have that much of a choice right now. Here are the alternative
options I can see:
Moving "jquery.event.special.eachchange.js" out of the
"data-values/value-view" package into its own "WMDE/jquery-eachchange"
package, and then:
1. ... using it in my extension via Composer.
+ pro: two or more extensions or other packages requiring this package
will still result in only one MW-wide installation.
- con: requires MW-specific code (to feed ResourceLoader) that is
actually unrelated to the otherwise MW-independent package.
2. ... and having a build step in other packages using the package, pulling
the "WMDE/jquery-eachchange" somewhere into the file structure of the
packages/extensions using it.
+ pro: no need to abuse Composer; we can use "npm", "Bower" or any
other arbitrary JS package manager here.
- con: got to tell ResourceLoader somehow... (didn't think so much
about this yet)
- con: if more than one extension or other package requires this
package, we still end up with the same code twice or more often in one
MW installation.
3. Combining 1 and 2: start with 2, using a JS package manager, then go
to 1, creating a Composer package and pulling the "WMDE/jquery-eachchange"
package in via some build script.
+ pro: both pros from 1 and 2 above.
- con: still got to tell ResourceLoader somehow...
- con: overhead; we now create two packages, where the Composer one is
just a bridge to the MW world, still polluting packagist.org. Still
kind of ugly, and more effort for publishing a package, potentially
scaring programmers away from doing so, since they've got better things
to do than work that could be automated.
I have not seen approaches 2 and 3 in use yet, though I could imagine
that the VisualEditor team has used something like that.
Approach 1 is the way the "data-values/value-view" package itself is
being handled. That package should actually be an MW-independent pure
JS package, but right now it contains MW-specific code and uses
Composer for distribution.
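To illustrate option 1: a consuming extension's composer.json might look roughly like this. This is only a sketch, assuming the split-out package were published to Packagist under the currently hypothetical name "wmde/jquery-eachchange":

```json
{
    "name": "example/my-extension",
    "require": {
        "wmde/jquery-eachchange": "~1.0"
    }
}
```

Composer would then deduplicate the package across extensions (the pro noted above), but the extension would still need MW-specific glue to register the shipped file with ResourceLoader (the con).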
There is still another option, but it would have to be properly
implemented:
4. Choose one native JS package manager for now and go with it (add support
for others later perhaps). Integrate it properly with MW (resource loader
to begin with), document how to use it and finally distribute JS code
coming from the MW world but useful for other projects in a way where it
can actually be used in a non-MW context.
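As a rough sketch of what option 4 could look like, assuming npm were the chosen package manager: the dependency lives in package.json, and a small piece of glue registers the installed file with ResourceLoader. All package, path and module names below are invented for illustration:

```php
<?php
// Hypothetical glue for option 4. Assumes "npm install" has placed the
// package under node_modules/; registers its script as a ResourceLoader
// module so MW-side code can depend on it without copying the file.
$wgResourceModules['jquery.eachchange'] = array(
	'scripts' => 'node_modules/jquery-eachchange/jquery.event.special.eachchange.js',
	'dependencies' => array( 'jquery' ),
	'localBasePath' => __DIR__,
	'remoteExtPath' => 'MyExtension',
);
```

Integrating this properly, rather than hand-writing such glue per extension, is exactly the part of option 4 that would need real implementation work.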
This has already been bugging me when working on Wikidata. Now I'd like to
reuse some of the code I have written there without spending hours and
hours with option 3, because there should be support for option 4
sooner rather than later.
So I am wondering: does anyone have any thoughts or alternatives, or is
there a roadmap for anything like the option 4 I have sketched above?
tl;dr trying to make mw-core pass mw-codesniffer, expect large patches on
Gerrit, and please help
A lot of work has been done on MediaWiki codesniffer (a
PHP_CodeSniffer standard for MediaWiki) over the last few months, and
this might be a good time to get core to pass our coding conventions.
Work on this has already started at T102609
<https://phabricator.wikimedia.org/T102609>. There were two primary reasons
to send this email:
1. This is a pretty big task and any help would be very welcome! If we can
get phpcs to run against core's master, it would make every patch
contributor's work a little easier.
2. Work is being organized as subtasks of T102609, and has been divided on
the basis of sniffs. Because of this, every patch is going to change a lot
of unrelated files and a lot of reviewers are going to get added on Gerrit.
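For anyone who wants to run the sniffs locally before picking up a subtask, a minimal composer.json fragment along these lines should work. This is a sketch: the exact version constraint and invocation used for core may differ, so check T102609 for the canonical setup.

```json
{
    "require-dev": {
        "mediawiki/mediawiki-codesniffer": "*"
    },
    "scripts": {
        "phpcs": "phpcs -p -s --standard=vendor/mediawiki/mediawiki-codesniffer/MediaWiki includes/"
    }
}
```

Running "composer phpcs" with the -s flag reports the failing sniff names, which makes it easy to match failures to the per-sniff subtasks.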
Let's make this happen!
Vivek Ghaisas (polybuildr)
I'm happy to announce a Gerrit Cleanup Day on Wed, September 23.
It's an experiment to reduce Wikimedia's code review backlog, which
hurts the growth of our long-term code contributor base.
Development/engineering teams at the Wikimedia Foundation are expected
to join and use the day primarily to review recently submitted open
Gerrit changesets that have not yet received a review, focusing on
volunteer contributions. Developers from other organizations and
individual developers are of course also very welcome to join and
help! :)
https://phabricator.wikimedia.org/T88531 provides more information,
steps, and links. Note it's still a work in progress.
Your questions and feedback are welcome.
Andre Klapper | Wikimedia Bugwrangler