Yesterday there was a conversation about code review on irc and among other
things, how sometimes patches can get "stuck".
I had an idea for a way to improve things. I'm not sure if it is a good
idea, but there's only one way to find out.
So without further ado, announcing the Code Review Patch Board:
In short: each person is allowed to list one of their patches on the board
that they would really like to see reviewed. You can only list one patch at
a time, and it should be a patch that you have been unable to get reviewed
for at least a week through normal means. See the page for the full list of
guidelines.
I encourage people to give it a try. Add a patch you wrote that you cannot
get a review for. Or if you have +2 rights, try giving some love to the
patches listed there.
I would also love to hear feedback on the general idea.
To repeat, the url is:
TL;DR: The legacy Mobile Content Service is going away in July 2023. Please
switch to Parsoid or another API before then to ensure service continuity.
I'm writing about a service decommission we hope to complete mid-July 2023.
The service to be decommissioned is the legacy Mobile Content Service
("MCS"), which is maintained by the Wikimedia Foundation's Content
Transform Team. We will be marking this service as deprecated soon.
We hope that with this notice, people will have ample time to update their
systems to use other endpoints such as Parsoid.
The MCS endpoints are the ones with the relative URL path pattern
/page/mobile-sections* on the Wikipedias. For examples of the URLs, see the
"Mobile" section of the online Swagger (OpenAPI) specification
documentation with matching URLs here:
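For illustration, the two URL shapes can be compared side by side. The paths below are a sketch based on the public REST API layout; the domain and title used are placeholders:

```python
from urllib.parse import quote

def mcs_sections_url(domain: str, title: str) -> str:
    """Legacy MCS endpoint (being decommissioned):
    /page/mobile-sections/{title}."""
    return f"https://{domain}/api/rest_v1/page/mobile-sections/{quote(title, safe='')}"

def parsoid_html_url(domain: str, title: str) -> str:
    """Parsoid HTML via the REST API: /page/html/{title}."""
    return f"https://{domain}/api/rest_v1/page/html/{quote(title, safe='')}"

# A legacy MCS request and its recommended Parsoid replacement:
print(mcs_sections_url("en.wikipedia.org", "Earth"))
print(parsoid_html_url("en.wikipedia.org", "Earth"))
```

Note that the two endpoints return differently shaped payloads (MCS returns aggregated JSON sections, Parsoid returns HTML), so clients need more than a URL swap.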
== History ==
The Mobile Content Service ("MCS") is the historical aggregate service that
originally provided support for the article reading experience on the
Wikipedia for Android native app, as well as some other experiences. We
have noticed that there are other users of the service, but we are not able
to determine all of them with confidence from the available data.
The Wikimedia Foundation had already transitioned the Wikipedia for
Android and iOS apps to the newer Page Content Service ("PCS") several
years ago. PCS has some similarities with MCS in terms of its mobility
focus, but it also has different request-response signatures in practice.
PCS, like MCS, is intended primarily to serve Wikimedia
Foundation-maintained user experiences, and so it is classified as
"unstable".
== Looking ahead ==
Generally, as noted in the lead, we recommend that folks who use MCS (or
PCS, for that matter) switch over to Parsoid for accessing Wikipedia
article content programmatically for the most predictable service.
The HTML produced by Parsoid has a versioned specification, and because
Parsoid is accessed regularly by a number of components across the globe,
its responses tend to be fairly well cached. However, please note that
Parsoid may be subject to stricter rate limits under certain conditions.
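Clients that care about the versioned spec can read the version from the profile parameter of the content-type header returned with Parsoid HTML. A small sketch; the header value and version number below are illustrative, not a guarantee of what any particular wiki serves today:

```python
import re

# Example content-type header as returned alongside Parsoid HTML.
# The version number here is illustrative only.
content_type = ('text/html; charset=utf-8; '
                'profile="https://www.mediawiki.org/wiki/Specs/HTML/2.8.0"')

match = re.search(r'Specs/HTML/(\d+)\.(\d+)\.(\d+)', content_type)
major, minor, patch = (int(g) for g in match.groups())

# Spec versions follow semver-style numbering, so a client would
# typically check the major version for compatibility.
print(major, minor, patch)
```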
At this point, I do also want to note that, in order to keep up with
contemporary HTML standards, particularly those favoring accessibility and
machine-readability enhancements, Parsoid HTML will undergo change as we
further converge parsing stacks. Generally, you should expect iteration on
the Parsoid HTML spec, and, as you may have come to appreciate, the shape
of the HTML in practice can vary nontrivially wiki-by-wiki as editing
practices differ across wikis.
You may also want to consider the Wikimedia Enterprise API options, which
range from no-cost tiers to paid higher-volume access.
== Forking okay, but not recommended ==
Because MCS acts as a service aggregate and makes multiple backend API
calls, caveats can apply for those subresources - possibility of API
changes, deprecation, and the like. We do not recommend a plain fork of MCS
code because of the subresource fetch behavior. This said, of course you
are welcome to fork in a way compatible with MCS's license.
== Help spread the word ==
Although we are aware of the top two remaining consumers of MCS, we also
are not sure who else is accessing MCS and anticipate that some downstream
tech may break when MCS is turned off. As we are cross-posting this
message, we hope most people who have come to rely upon MCS will see this
message. Please feel free to forward this message to contacts if you know
they are using MCS.
== Help ==
Although we intend to decommission MCS in July 2023, we would like to share
resources if you need some help. We plan to hold office hours in case you
would like to meet with us to discuss this or other Content Transform Team
matters. We will host these events on Google Meet. We will provide notice
of these office hours on the wikitech-l mailing list in the coming weeks.
Additionally, if you would like to discuss your MCS transition plans,
please visit the Content Transform Team talk page:
Finally, some Content Transform Team members will also be at the Wikimedia
Hackathon  if you would like some in-person support.
Adam Baso (he/him/his/Adam), on behalf of the Content Transform Team
Director of Engineering
I'm proposing we introduce a new namespace on mediawiki.org.
The namespace name will be Archived (numerical id to be determined), and
its purpose will be to hold pages like "Subversion" that have the template
"historical" applied to them. These pages would move into that namespace,
so you would get Archived:Subversion, Archived:Manual:Small padlock icon,
and so on.
This will give us a place outside of the main content namespaces to keep
information about configurations, manuals, extensions and skins that we
want to keep, but where it will no longer pollute our set of currently
relevant information. The namespace will not be part of
$wgContentNamespaces and $wgNamespacesToBeSearchedDefault namespaces.
Hopefully this will allow for a more searchable and better functioning
mediawiki.org when it comes to documentation for new users, while
preserving history in the spirit of wikis.
I've discussed this with various people at various points in time who all
seemed to think this was a good idea, but we've never really had an open
discussion about it that could result in action.
Please take our annual* *Developer Satisfaction Survey*!
The survey is open until Fri, 17 Feb 2023—two weeks from today.
This survey is for members of the *Wikimedia Developer Community* and
covers the following topics:
Code review tooling and process
MediaWiki development environments
Beta cluster / Staging
Please take the survey if you’ve used the above tools as part of your role
developing software for the Wikimedia community.
We’re soliciting your feedback to:
Measure developer satisfaction, and
Determine where to invest resources in the future
We will anonymize, explore, and report the data we gather on mediawiki.org.
View previous years' survey results:
Privacy statement: This survey will be conducted via a third-party service,
which may subject it to additional terms. For more information on privacy
and data-handling, see the survey privacy statement
Tyler Cipriani (he/him)
Engineering Manager, Release Engineering
*: “annual,” except we missed 2022 🙁
As part of work to make storage of external links in MediaWiki continue to
scale without risking site stability (T312666
<https://phabricator.wikimedia.org/T312666>), we are deprecating most of
the special functionalities around proto-relative URLs (URLs that start
with // instead of https:// or http://).
Proto-relative URLs were beneficial a decade ago, when Wikimedia projects
were being served in both encrypted and unencrypted traffic (http and
https). However, since 2015, all of our traffic has been served encrypted
only, and this functionality doesn’t provide much user benefit any more for
Wikimedia wikis. With HTTP/2, a similar calculus applies to all websites.
As well as being low-value, our external links storage (the externallinks
table) has grown to be one of the biggest tables for each production wiki.
This is due to many duplications of URL information, added to serve
different use cases. With the changes, we are removing these duplications,
and some of the functionality. You can read more about the work in T312666
Storage of proto-relative URLs has changed to only store HTTPS URLs
Previously, if a proto-relative URL was added in an edit, MediaWiki
internally treated it as two links: one with http:// and one with
https://. From this
week forward, for all Wikimedia wikis, the storage will change to store
only https:// URLs. Once those wikis are switched to read the new database
schema, the links will be presented as https only in Special:LinkSearch and
their API counterparts. This means a proto-relative external link will
effectively be treated as an HTTPS one. This change will also apply to
non-Wikimedia wikis using MediaWiki 1.41+.
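As a rough sketch of the new storage behavior (this is an illustration, not the actual MediaWiki code), the normalization amounts to:

```python
def normalize_external_link(url: str) -> str:
    """Sketch of the new storage rule: a proto-relative URL is kept
    only in its https:// form; all other URLs are stored unchanged."""
    if url.startswith('//'):
        return 'https:' + url
    return url

# A proto-relative link is now stored as a single https:// row:
print(normalize_external_link('//example.org/page'))
# Explicit-protocol links are unaffected:
print(normalize_external_link('http://example.org/'))
```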
The expandurl option is deprecated and ignored in the exturlusage and
extlinks MediaWiki action API modules
This means the “expandurl” argument in the exturlusage and extlinks
<https://www.mediawiki.org/wiki/API:Extlinks> API modules will be ignored
and proto-relative URLs will always be expanded to HTTPS. This will happen
any time a wiki is switched to read from the new externallinks fields. (You
can track the progress in T335343
<https://phabricator.wikimedia.org/T335343>) This change will also apply to
non-Wikimedia wikis using MediaWiki 1.41+.
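In practice, callers can simply stop sending the parameter. The request below is a sketch of a typical exturlusage query; the euquery value is a placeholder, and no expandurl argument is needed since it is now ignored:

```python
from urllib.parse import urlencode

# Sketch of an exturlusage action API query. Note that "expandurl"
# is simply omitted: the server ignores it either way now.
params = {
    'action': 'query',
    'list': 'exturlusage',
    'euquery': 'example.org',  # placeholder search string
    'format': 'json',
}

query_string = urlencode(params)
print(query_string)
```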
If your wiki heavily uses proto-relative URLs in articles' wikitext, we
recommend changing them to https:// instead, which also improves storage,
as every proto-relative URL takes up two rows.
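If you want to script such a cleanup, here is a minimal sketch for the bracketed external-link case. Real wikitext has more edge cases (bare links, templates, links inside comments, and so on), so treat this as a starting point only:

```python
import re

def https_ify(wikitext: str) -> str:
    """Rewrite proto-relative bracketed external links ([//... label])
    to explicit https://. Rough sketch; does not handle bare // links
    or URLs produced by templates."""
    return re.sub(r'(?<=\[)//', 'https://', wikitext)

print(https_ify('See [//example.org the example site].'))
```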
Amir Sarabadani, Staff Database Architect
James Forrester, Staff Software Engineer
Timo Tijhof, Principal Performance Engineer
The maps and article image features have been good additions to the
multimedia capabilities of Wikipedia in the last decade, and are widely
used on my home wiki.
For now, maps in Wikipedia are rendered as images and become interactive
only when clicked. This makes them potential candidates for being
displayed as article images in articles without a photo.
I would like to find out how technically complicated it would be to make
the article image extension also capable of using map images. I know this
is probably nowhere on the roadmap; I'm only interested in the technical
part of the idea.
For about a week, MediaWiki Codesearch was serving outdated and
incomplete results (and was later partially down outright) because of a
full disk. This has now been fixed; you may wish to re-run searches as
necessary.
The full disk was accidentally caused by switching between gerrit and
gerrit-replica (as previously discussed on this list), so I
deleted all the old gerrit-replica repository copies and also added ~20G
more disk space to the instance. Monitoring did correctly pick up that
there was some issue once the disk was full; it just wasn't properly
examined because of hackathon travel and activities.
-- Kunal / Legoktm