tl;dr: I am going to break your workflow and your wiki. Skip to the
last section and read on.
== Background ==
As you know, I've been working on a GSoC project to better separate
skins and core MediaWiki. Moving Vector and MonoBook to separate
repositories is the final step of it, and it's exactly as scary as it
sounds.
Until recently MediaWiki was heavily interconnected with the core
skins, particularly Vector – right now it is only slightly
interconnected, and I have some patches pending to remove the
remaining interconnection entirely.
== The plan ==
* Fix up some remaining issues with Vector being required
(I have already fixed most)
* Stop always loading MonoBook and Vector
[patch pending: https://gerrit.wikimedia.org/r/148509 + dependencies]
* Use a special fallback when no skins are installed
[patch pending: https://gerrit.wikimedia.org/r/148508]
* Move MonoBook and Vector to separate repositories
* Ship MonoBook, Vector and some more skins with the installer tarball
* Enable them during the installation
[patch merged: https://gerrit.wikimedia.org/r/138652]
== What it means for you ==
If you're upgrading a wiki or you're a developer working on master,
THIS IS A BACKWARDS INCOMPATIBLE CHANGE. Your wiki will continue to
work, it will just look ugly (no styles).
When we do this and you upgrade, you're going to get a big helpful
message asking you to install and enable some skins and explaining how
to do this (see https://gerrit.wikimedia.org/r/148508 for the
implementation; basically, put the files/repository under skins/ and
require_once it in LocalSettings.php).
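As a rough sketch of what that will look like (assuming the skins keep
PHP entry point files named after themselves; the exact file names are
up to the pending patches):

    // In LocalSettings.php, after copying or cloning the skins into skins/:
    require_once "$IP/skins/Vector/Vector.php";
    require_once "$IP/skins/MonoBook/MonoBook.php";
    // The default skin must name one of the installed skins:
    $wgDefaultSkin = 'vector';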
Try it out with this patch: https://gerrit.wikimedia.org/r/148509
After you do this, you'll be able to switch to slightly older
MediaWiki versions without things breaking, as the new skins will work
with old MediaWiki (the new directory names are different from the old
ones, so they won't clash… unless you're using a case-insensitive OS).
I hope this sounds reasonable to everyone, and I hope to have this
done around the time of Wikimania (possibly during it; I'll be there).
I don't think there is a less disruptive way to do this, other than
not doing it at all; if you come up with one, please share it.
== OAuth callback changes ==
For those who run one of our 76(!) approved OAuth apps, or are using
the OAuth extension on their own wiki:
We have a patch from Mitar to allow OAuth apps to pass a configurable
callback during the OAuth handshake. This will probably make a lot of
app authors' lives easier, but it can also open up a couple of avenues
of abuse; specifically, it's a prerequisite for covert redirect
attacks. If OAuth app authors choose loose callback requirements,
which we can assume will happen if we make approvals automatic (bug
65750), and we ever allow public consumers (Huggle has been asking for
that for a long time), then it would be possible for attackers to
abuse our wikis as redirectors in such attacks.
So far, I've been really conservative about how we use OAuth (there
are two other features we would have to enable to make this attack
likely). I'd like to hear others' thoughts about:
* Assuming we implement one or two of dynamic callbacks, automatic
approval of apps, or public consumers, but not all three, which are
the safest combinations?
* If we do implement all three, we can limit how the callback can
differ from what is registered. I put some suggestions on the gerrit
patch, but would that cause more confusion than help?
 - https://gerrit.wikimedia.org/r/153983
 - http://tetraph.com/covert_redirect/oauth2_openid_covert_redirect.html
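To make the second question concrete, here is a minimal sketch of one
possible restriction (same scheme and host, prefix-matched path); the
function name and exact rules are illustrative assumptions, not the
actual Extension:OAuth code:

    // Hypothetical validation: accept a dynamic callback only if it keeps
    // the registered scheme and host and extends the registered path.
    function isCallbackAllowed( $registered, $proposed ) {
        $reg = parse_url( $registered );
        $prop = parse_url( $proposed );
        if ( !$reg || !$prop ||
            !isset( $reg['scheme'], $reg['host'], $prop['scheme'], $prop['host'] )
        ) {
            return false;
        }
        // No scheme downgrade and no host change (blocks covert redirects).
        if ( $prop['scheme'] !== $reg['scheme'] || $prop['host'] !== $reg['host'] ) {
            return false;
        }
        // The proposed path must begin with the registered path.
        $regPath = isset( $reg['path'] ) ? $reg['path'] : '/';
        $propPath = isset( $prop['path'] ) ? $prop['path'] : '/';
        return strpos( $propPath, $regPath ) === 0;
    }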
== Maintenance releases on 27 August ==
This is a notice that on 27 August between 20:00 and 22:00 UTC we will
release maintenance updates for the current and legacy branches of the
MediaWiki software. Downloads and patches will be available at that
time.
Wiki Release Team
== Jenkins job for mediawiki/vendor ==
We have a mediawiki/vendor repository that holds third-party libraries.
Since MediaWiki core is eventually going to rely on them, I have crafted
a new Jenkins job (mediawiki-vendor-integration) which clones both
repositories, checks out the appropriate patch / branch and runs the
whole MediaWiki PHPUnit test suite.
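For context, core picks these libraries up through Composer's
generated autoloader, so a mediawiki/vendor clone and a local composer
install look the same to it. Roughly (the exact bootstrap code lives
in core's setup files, so treat this as a sketch):

    // During setup, MediaWiki loads third-party libraries from vendor/,
    // whether that directory came from `composer install` or from a
    // clone of the mediawiki/vendor repository.
    if ( is_readable( "$IP/vendor/autoload.php" ) ) {
        require_once "$IP/vendor/autoload.php";
    }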
The job is triggered when a patch is proposed on either mediawiki/core
or mediawiki/vendor, but only for the master and REL* branches for now.
I have made it non-voting.
The job takes roughly 6 minutes, which will slightly delay the report
back to Gerrit.
If the job looks fine, the next steps are:
- make it pass on wmf branches
- trigger a run on CR+2
Then I will be able to phase out the grouped PHPUnit jobs and the
sqlite installer tester.
Long term: have the tests run using HHVM as well.
Antoine "hashar" Musso
== Gerrit SSH port ==
gerrit.wikimedia.org is the default git gateway for all Wikimedia
projects, but it still has a number of issues compared to other (IMHO
better) providers, like GitHub.
One of the major issues I am having now is that git needs to be
accessed on a non-standard port (29418, i.e.
ssh://gerrit.wikimedia.org:29418/...), because regular ssh access is
still being served on port 22. This is a problem on restricted
networks where random ports are prohibited and blocked for various
reasons, while standard ports like 21, 22 and 80 are open.
I propose moving the current ssh port (22) to some non-standard port,
since the staff who need it are far fewer than the developer community
that needs to access git (the server itself is for git only), and
opening port 22 as the gerrit (git) port. The current port 29418 could
be kept as well, so that nobody needs to update the repositories they
have already cloned.
This is technically possible and I think it would make usage of gerrit
much less of a pain in the ****.
Thanks for your considerations :P
== GSoC: MassMessage delivery lists ==
For my Google Summer of Code project, I worked on improving the way
delivery lists are stored and modified in the MassMessage extension. To
that end, I have implemented a ContentHandler and JSON-based backend for
storing delivery lists, in addition to a frontend for creating and
managing them.
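For illustration, a delivery list stored with the new backend might
look something like this; the exact schema is whatever the pending
patches define, so treat the field names as assumptions:

    {
        "description": "Example newsletter recipients",
        "targets": [
            { "title": "User talk:Example" },
            { "title": "Project:Noticeboard", "site": "en.wikipedia.org" }
        ]
    }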
Compared to my stated deliverables, I have largely accomplished what I set
out to do, with the following being the larger differences:
* For editing lists, instead of an interface similar to the prototype,
I implemented a more Wikidata-like UI for quick additions and removals.
* Instead of an API for adding or removing one's own talk page from
delivery lists, I worked on a more general API for adding and removing
pages.
The code has yet to be merged or deployed, pending some final code review
and database changes for WMF wikis, but it should be doable in the near
future. In the meantime, there is a Labs instance set up running the new
code; please try it out and provide any feedback.
I’d like to thank Legoktm for being an extremely helpful and responsive
mentor, Prtksxna for his UX guidance and help, and MZMcBride for testing
the new features and pointing out issues and things to improve. Working on
the project has been a valuable experience, and I appreciate all the
assistance I received that helped make it a success.