Following the recent outage, we've had a new series of complaints
about the lack of improvements in CX, especially related to
server-side activities like saving/publishing pages.
Now, I know the team is involved in a long-term effort to merge the
editor with the VE, but is there an end in sight for that effort? Can
I tell people who ask "look, 6 more months then we'll have a much
better translation tool"?
Is there a publicly available roadmap for this project and more
generally, for CX?
(Reposting https://www.mediawiki.org/wiki/Topic:Tyvfh19mba4pway9 here to
garner more input.)
I'm working on Extension:GlobalPreferences and trying to figure out how
best to do things with all preferences, after they've been defined (in
order to show various extra Preferences-form bits and pieces that depend
on knowing about all preferences). At the moment, we're using
$wgExtensionFunctions and hacking the $wgHooks global to add a new
callback at the end of $wgHooks['GetPreferences'].
One idea is to add a new MediaWiki service called 'PreferencesFactory',
that can be used to retrieve a new Preferences object. Extensions would
then be able to use the MediaWikiServices hook to redefine the
PreferencesFactory (with MediaWikiServices::redefineService()). Of
course, only one extension would be able to do that (which may or may not
be a problem).
Apart from being able to override the Preferences class, a service for
this would also mean the Preferences class could be refactored
(gradually?) to not be such a collection of static methods.
The proposed patch is: https://gerrit.wikimedia.org/r/#/c/374451/
I'd love to hear anyone's ideas about this, including completely
different and better ways to do things. :-)
Another idea is to add a new hook, after GetPreferences. This wouldn't
be as flexible as the PreferencesFactory idea, but is a lot simpler.
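To make the service idea concrete, here is a minimal Python analogue of the pattern being proposed. Only the name "PreferencesFactory" and the redefine-a-service behaviour come from the proposal; the container class, method names, and the two factory classes are hypothetical stand-ins for MediaWikiServices, not actual MediaWiki code:

```python
# Sketch of a service container supporting one-time redefinition,
# loosely modeled on the MediaWikiServices::redefineService() idea.
# Everything here is illustrative, not MediaWiki's real API.

class ServiceContainer:
    def __init__(self):
        self._factories = {}
        self._instances = {}

    def define_service(self, name, factory):
        if name in self._factories:
            raise ValueError(f"service '{name}' already defined")
        self._factories[name] = factory

    def redefine_service(self, name, factory):
        # Only known services can be redefined, and only before
        # they have been instantiated.
        if name not in self._factories:
            raise ValueError(f"unknown service '{name}'")
        if name in self._instances:
            raise RuntimeError(f"service '{name}' already instantiated")
        self._factories[name] = factory

    def get_service(self, name):
        # Lazily instantiate and cache the service.
        if name not in self._instances:
            self._instances[name] = self._factories[name](self)
        return self._instances[name]


class DefaultPreferencesFactory:
    def get_preferences(self, user):
        return {"skin": "vector"}


class GlobalPreferencesFactory(DefaultPreferencesFactory):
    def get_preferences(self, user):
        prefs = super().get_preferences(user)
        prefs["skin-is-global"] = True  # the extension's extra form bits
        return prefs


services = ServiceContainer()
services.define_service("PreferencesFactory",
                        lambda c: DefaultPreferencesFactory())
# An extension redefines the factory; only one extension can win this race.
services.redefine_service("PreferencesFactory",
                          lambda c: GlobalPreferencesFactory())
print(services.get_service("PreferencesFactory").get_preferences("Example"))
```

This also shows why the "only one extension" limitation arises: redefinition replaces the factory wholesale, whereas a hook after GetPreferences would let any number of extensions decorate the result.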
The voting phase of the 2017 Community Wishlist Survey has now started.
Read the proposals and support the ones you'd like to see make the top 10.
Click on the categories to find the proposals.
That's the important part of this email. Feel free to follow the link above
and start voting right now.
The longer version:
The Community Wishlist Survey decides what the Wikimedia Foundation
Community Tech team will work on over the next year. The team is responsible
for addressing the top 10 wishes on the list, as well as some wishes from
smaller groups and projects that are doing important work, but don't have
the numbers to get their proposal into the top 10. The Wishlist is also
used by volunteer developers and other teams, who want to find projects to
work on that the community really wants.
Come help set the agenda.
If you want to see what the team has done in 2017, see the status report
from last month:
What you can do now:
*) Vote. This is the most important thing.
*) Spread the word. We really want people to find this, of course, and
we'll work on finding the best balance between spreading the news to
everyone and not being annoying, but please do help to spread the
information in your local community – Village Pump equivalents, IRC
channels, social media groups and so on.
*) Help translate the pages. We want the process to be as accessible as
possible for everyone. It's not really available if it's only in English.
*) If you want to get short updates through the notification system, you
can sign up for the Community Tech Newsletter:
I'm working on the database schema for Multi-Content-Revisions (MCR)
<https://www.mediawiki.org/wiki/Multi-Content_Revisions/Database_Schema> and I'd
like to get rid of the rev_sha1 field:
Maintaining revision hashes (the rev_sha1 field) is expensive, and becomes more
expensive with MCR. With multiple content objects per revision, we need to track
the hash for each slot, and then re-calculate the sha1 for each revision.
That's especially expensive in terms of bytes per database row, which
impacts overall storage and performance.
So, what do we need the rev_sha1 field for? As far as I know, nothing in core
uses it, and I'm not aware of any extension using it either. It seems to be used
primarily in offline analysis for detecting (manual) reverts by looking for
revisions with the same hash.
Is that reason enough for dragging all the hashes around the database with every
revision update? Or can we just compute the hashes on the fly for the offline
analysis? Computing hashes is slow since the content needs to be loaded first,
but it would only have to be done for pairs of revisions of the same page with
the same size, which should be a pretty good optimization.
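The on-the-fly approach sketched above can be made concrete: since an identity revert must have exactly the same content size as the revision it restores, we only need to hash revisions whose sizes collide. A hedged Python sketch, with toy revision tuples standing in for the real schema:

```python
import hashlib
from collections import defaultdict

def find_identity_reverts(revisions, load_content):
    """revisions: list of (rev_id, size) for one page, oldest first.
    load_content(rev_id) -> bytes (the slow part we want to avoid).
    Returns (older, newer) rev_id pairs with identical content,
    loading and hashing only size-colliding candidates."""
    by_size = defaultdict(list)
    for rev_id, size in revisions:
        by_size[size].append(rev_id)

    matches = []
    for ids in by_size.values():
        if len(ids) < 2:
            continue  # unique size: cannot be an identity revert
        by_hash = defaultdict(list)
        for rev_id in ids:
            sha1 = hashlib.sha1(load_content(rev_id)).hexdigest()
            by_hash[sha1].append(rev_id)
        for same in by_hash.values():
            matches.extend(zip(same, same[1:]))
    return matches

# Toy usage: revision 3 restores revision 1's content.
content = {1: b"hello", 2: b"vandalism!", 3: b"hello"}
revs = [(r, len(c)) for r, c in sorted(content.items())]
print(find_identity_reverts(revs, content.__getitem__))  # → [(1, 3)]
```

Pages rarely have many same-size revisions, so the expensive content loads stay confined to a small candidate set, which is the optimization argued for above.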
Also, I believe Roan is currently looking for a better mechanism for tracking
all kinds of reverts directly.
So, can we drop rev_sha1?
Principal Platform Engineer
Gesellschaft zur Förderung Freien Wissens e.V.
Handling of usernames in imported edits in MediaWiki has long been weird
(T9240 was filed in 2006!).
If the local user doesn't exist, we get a strange row in the revision table
where rev_user_text refers to a valid name while rev_user is 0 which
typically indicates an IP edit. Someone can later create the name, but
rev_user remains 0, so depending on which field a tool looks at the
revision may or may not be considered to actually belong to that user.
If the local user does exist when the import is done, the edit is
attributed to that user regardless of whether it's actually the same user.
See T179246 for an example where imported edits got attributed to the
wrong account in pre-SUL times.
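The ambiguity described above can be illustrated with a small classifier over the two fields. Field names mirror the revision table; the helper itself and its return strings are hypothetical:

```python
def classify_revision(rev_user, rev_user_text, local_user_ids):
    """Illustrates why rev_user = 0 is ambiguous. local_user_ids maps
    existing local usernames to user IDs. Purely illustrative."""
    if rev_user != 0:
        return "local user"
    # rev_user == 0 historically means an IP edit, but imports also
    # leave rev_user = 0 with a valid username in rev_user_text.
    if rev_user_text in local_user_ids:
        return "ambiguous: import, or a name created after the import"
    if looks_like_username(rev_user_text):
        return "imported edit, user does not exist locally"
    return "anonymous (IP) edit"

def looks_like_username(text):
    # Crude stand-in for MediaWiki's real username/IP validation.
    return not text.replace(".", "").replace(":", "").isdigit()
```

A tool that trusts `rev_user` calls the 0-but-named rows anonymous; a tool that trusts `rev_user_text` attributes them to whoever now owns the name. Both answers can be wrong, which is the problem the proposal below addresses.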
In Gerrit change 386625 I propose to change that.
- If revisions are imported using the "Upload XML data" method, it will
be required to fill in a new field to indicate the source of the edits,
which is intended to be interpreted as an interwiki prefix.
- If revisions are imported using the."Import from another wiki" method,
the specified source wiki will be used as the source.
- During the import, any usernames that don't exist locally (and can't
be auto-created via CentralAuth) will be imported as an
otherwise-invalid name, e.g. an edit by User:Example from source 'en' would
be imported as "en>Example".
- There will be a checkbox on Special:Import to specify whether the same
should be done for usernames that do exist locally (or can be created) or
whether those edits should be attributed to the existing/autocreated local
user.
- On history pages, log pages, and the like, these usernames will be
displayed as interwiki links, much as might be generated by wikitext like "
[[:en:User:Example|en>Example]]". No parenthesized 'tool' links (talk,
block, and so on) will be generated for these rows.
- On WMF wikis, we'll run a maintenance script to clean up the existing
rows with valid usernames and rev_user = 0. The current plan there is to
attribute these edits to existing SUL users where possible and to prefix
them with a generic prefix otherwise, but we could just as easily prefix
them all.
- Unfortunately it's impossible to retroactively determine the actual
source of old imports automatically or to automatically do anything about
imports that were misattributed to a different local user in the past.
- The same will be done for CentralAuth's global suppression blocks.
In this case, on WMF wikis we can safely point them all at Meta.
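The naming scheme in the proposal is simple enough to sketch directly. The `prefix>name` format and the collision-freedom of ">" come from the proposal; the function and its parameters are illustrative:

```python
def imported_username(prefix, name, exists_locally, attribute_locally):
    """Sketch of the proposed scheme: an edit by User:Example imported
    from source 'en' becomes "en>Example". Parameters other than the
    'prefix>name' format are illustrative."""
    if exists_locally and attribute_locally:
        # Attribute to the existing/autocreated local account
        # (the Special:Import checkbox case).
        return name
    # '>' is invalid in all usernames and page titles, so the
    # prefixed form can never collide with a real local account.
    return f"{prefix}>{name}"

print(imported_username("en", "Example",
                        exists_locally=False,
                        attribute_locally=False))  # → en>Example
```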
If you have comments on this proposal, please reply here or on the Gerrit
change.
Background: The upcoming actor table changes require some change to the
handling of these imported names because we can't have separate attribution
to "Example as a non-registered user" and "Example as a registered user"
with the new schema. The options we've identified are:
1. This proposal, or something much like it.
2. All the existing rows with rev_user = 0 would have to be attributed
to the existing local user (if any), and in the future when a new user is
created any existing edits attributed to that name will be automatically
attributed to that new account.
3. All the existing rows with rev_user = 0 and an existing local user
would have to be re-attributed to different *valid* usernames, probably
randomly-generated in some manner, and in the future when a new user is
created any existing edits for that name would have to be similarly
re-attributed.
4. Like #2, except the creation (including SUL auto-creation) of the
same-named account would not be allowed. Thus, an import before the local
name exists would forever block that name from being used for an actual
account.
5. Some less consistent combination of the "all the existing rows" and
"when a new user is created" options from #2–4.
Of these options, this proposal seems like the best one.
: ">" was chosen rather than the more typical ":" because the former is
already invalid in all usernames (and page titles). While a colon is *now*
disallowed in new usernames, existing names created before that restriction
was added can continue to be used (and there are over 12000 such usernames
in WMF's SUL) and we decided it'd be better not to suddenly break them.
Brad Jorsch (Anomie)
Senior Software Engineer
The Technical Collaboration team proposes the creation of a developer
support channel focusing on newcomers, as part of our Onboarding New
Developer program. We are proposing to create a site based on Discourse
(starting with a pilot in discourse-mediawiki.wmflabs.org) and to point the
many existing scattered channels there.
This is an initial proposal for a pilot. If the pilot is successful, we
will move it to production. For that to happen we still need to sync well with
Wikimedia Cloud Services, Operations and the Wikimedia technical community.
Please check https://www.mediawiki.org/wiki/Discourse and share your
feedback.
Engineering Community Manager @ Wikimedia Foundation
(sorry for cross-posting ...)
we're really happy to announce that the new AdvancedSearch interface was
deployed as a beta feature to MediaWiki.org and the test wiki just two
hours ago.
AdvancedSearch enhances Special:Search through an advanced parameters form
and aims to make existing search options more visible and accessible for
everyone.
The feature is a project by WMDE's Tech team and originates from the
Technical Wishes project. Many other people and teams helped make this
project happen, so stay tuned for the thank you section at the end of this
email :-) 
The search has great options for performing advanced queries, but often even
experienced editors don't know about them - this is what we found out when we
were conducting a workshop series on advanced search in several cities in
Germany in 2016. Together with contributors from German Wikipedia we've
discussed their desired search queries, explained how the syntax works and
how keywords like "hastemplate", "filetype" or "intitle" can be used and
combined to get the desired results.
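As an illustration of how such keywords combine, an advanced-search form essentially assembles a query string from separate fields. The keyword names come from the text above; the helper function is hypothetical, not AdvancedSearch's actual code:

```python
def build_search_query(terms, hastemplate=None, intitle=None, filetype=None):
    """Assemble a CirrusSearch-style query string from separate form
    fields, mimicking what an advanced-search form does for the user.
    Illustrative sketch only."""
    parts = [terms]
    if hastemplate:
        # Quote the value so multi-word template names stay together.
        parts.append(f'hastemplate:"{hastemplate}"')
    if intitle:
        parts.append(f"intitle:{intitle}")
    if filetype:
        parts.append(f"filetype:{filetype}")
    return " ".join(p for p in parts if p)

print(build_search_query("solar", hastemplate="Infobox planet",
                         intitle="system"))
# → solar hastemplate:"Infobox planet" intitle:system
```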
The idea for the AdvancedSearch feature grew out of these workshops,
where we not only discussed search but also designed first mocks for an
advanced search interface. 
*The first version, more deployments and other next steps*
AdvancedSearch supports some of the special search options that the WMF's
search team has implemented over the last years. The way the interface
works, users don't
have to know the syntax behind each search field, but they can learn about
it if they want to. The first version of the feature comes with a first
selection of advanced search options, e.g. including support for
"hastemplate" or "intitle". The WMF's search team has started to work on a
'deepcat' functionality to make sub category search happen. We plan to add
support for this in the future, too.
If all goes well, we plan to deploy the beta feature on German and Arabic
Wikipedia by Wednesday, Nov 29. 
In the next 2-3 months we'd love to invite everyone to test the new feature
on those wikis: Comments, thoughts, bug reports ... - any feedback is much
appreciated!
Please see https://www.mediawiki.org/wiki/Help:AdvancedSearch for how to
use the feature.
Deployments to other wikis can follow later.
*Thank you (finally!)*
This project was developed with the support from many different people from
the start, and this makes it very special to us: We would like to thank the
folks from the German Wikipedia community who participated in the workshops
and who were discussing and designing first ideas together with us. We
would like to thank all other people who gave valuable feedback on
Phabricator, onwiki and in real life. We would like to thank the Arabic
Wikipedia community for getting interested in AdvancedSearch, and for
offering to test this feature in an early stage. The credit for the
development work on AdvancedSearch goes to our FUN team, supported by the
team that usually works on Technical Wishes. Last but really not least we
would like to thank the great folks from the WMF's search team who have
supported this project from the start and who have done (and still do) the
necessary backend work that is now surfaced by the AdvancedSearch interface.
Sorry for the long email, and thanks for reading :-)
Birgit (for the Technical Wishes team)
 https://phabricator.wikimedia.org/T180147 (Deployment ticket group 0)
 https://phabricator.wikimedia.org/T180128 (Deployment ticket arwiki and
dewiki)
Community Communications Manager
Software Development and Engineering
Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us make that happen!
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as a charitable
organization by the Finanzamt für Körperschaften I Berlin, tax number
27/681/51985.
== Callouts ==
* Reminder! This is your last few weeks of deployments for the
year/quarter! Deployment freeze starts the week of December 18th.
* Reminder! This is also the first week of the big fundraising campaign.
Please be extra careful not to break CentralNotice (js/skin changes,
resourceloader changes) or donate.wikimedia.org.
* s8 replica set of servers for wikidata (T177208) to be live on 9th
January - check scripts/config that assume wikidata is on s5! We will have
a small period of read-only that day.
* databases no longer guaranteed to be on port 3306 only (T178359) - check
also scripts that assume default/no port (should be transparent for most
people).
* Your performance metrics (esp first paint) may have changed on Nov 2nd.
This is due to a bug in our performance monitoring that we fixed on that
day.
* Please provide feedback on WikimediaUI (OOUI) icon overhaul at
https://phabricator.wikimedia.org/M229 – plan to settle on set by next week
* Last name of thiemowmde changed, can I get an LDAP update?
* Special:Preferences was migrated to OOjs UI in wmf.10 (this week's
train), this changes the prefs UI substantially
== Audiences ==
=== Readers ===
==== iOS native app ====
* Blocked by:
* Updates: still working on 5.8
==== Reading Infrastructure ====
* Blocked by:
** working on Reading Lists performance
** continuing work on media, summary, and references endpoints
** considering undeploying trending-edits service
==== Web ====
* Blocked by:
** Business as usual (Chromium-based PDF service)! Next up: Performance
** Started work on updating/tidying the design of special pages on mobile:
==== Multimedia ====
* Blocked by: N/A
* Blocking: N/A
** MP3 uploads enabled Soon™ on Commons
** 3D product nearly ship-ready, heads-up for deploy coming at some point
** Wikibase/MediaInfo work for SDoC is progressing slowly
===== Maps =====
* Blocked by: None
* Blocking: None
** Ops person back this week
===== Discovery =====
* Blocked by: None
* Blocking: None
** automated Jenkins build for wikipedia.org portal working well
** updating documentation
=== Contributors ===
==== Global Collaboration ====
* Blocked by: nobody
* Blocking: ops for Flow dumps (talked to Ariel and came to an agreement,
but didn't have time to work on it last week with the holidays)
** RCFilters "live update" graduating out of beta in this week's train
** On behalf of Editing: Special:Preferences migration to OOjs UI in this
week's train, UI of Special:Preferences will change substantially
==== Parsing ====
* <section> tag code being tested.
** Code in beta cluster.
** Some fixes in progress based on tests. Unlikely to be deployed this
week, unless we get everything merged and tested today.
=== Community Tech ===
* Blocked by TechCom on global preferences
* Not blocking
* Our wishlist survey is in voting stage
==== UI Standardization ====
* Blocked by: icon overhaul, please see below. A new iteration went out at
the end of the week before last; we plan to settle by end of this week –
please provide feedback.
** OOUI v0.24.3 released
** 7 new features, among them:
*** Allow adding virtual viewport spacing (Bartosz Dziewoński)
*** DropdownInputWidget: Generate a hidden `<select>` in JS (Bartosz
Dziewoński)
*** MenuSelectWidget, PopupWidget: Automatically change popup direction if
there is no space (Bartosz Dziewoński)
** 4 design fixes, plus removal of the special treatment for Opera < 15
** OOUI & based products:
*** icons: Work on icon set to be more harmonious and align to WikimediaUI
Style Guide https://phabricator.wikimedia.org/T177432
- Provide specific feedback at https://phabricator.wikimedia.org/M229
*** Make Special:Preferences use OOUI – related UI/UX amendments
** Unify SVG markup across Foundation products
== Technology ==
==== Analytics ====
* Blocked by:
** Wikistats vetting almost complete, announcing next week
** Prometheus driven metrics on druid -
** Working on jsonrefine on EventLogging so that EventLogging data that was
up to now only available in MySQL is also available in Hadoop
** Productionising Superset; master data still being worked on.
==== Services ====
* Not attending personally
* Blockers: none
** Still reshaping cassandra cluster to migrate to Cassandra 3
** wikibase-addUsagesForPage job switched to kafka queue
=== Performance ===
** Callout above re: performance metrics changing on Nov 2nd.
** Working with Mozilla on some perf regressions related to FF57
** Working on identifying some issues related to Chrome 62
=== Release Engineering ===
** [SSD] blocked on ops updating nodejs-devel base image: <
** [MW Train] Reminder! This is your last few weeks of deployments for the
year/quarter! No non-emergency deploys starting the week of December 18th.
** [MW Train] The post mortem for T181006 (Watchlist and RecentChanges
failure due to ORES on frwiki and ruwiki) is scheduled for December 7th.
** [nodejs browser tests] CirrusSearch and Mobile are active with the port.
** [nodejs browser tests] Investigation of using mwbot instead of nodemw
(maintained by a Wikia engineer).
** [ruby] We will upgrade rubocop (Ruby linter/static analyzer) across the
corpus of extensions due to a security issue (pointed out by the new GitHub
security issue notification service). We (WMF production and CI) are not
affected by the issue AND we have officially deprecated our ruby browser
test stack (where 99% of all rubocop uses/dependencies are) BUT we'll do it
anyway.
** [techdebt] Wider conversations regarding SLAs/stewardship of services.
** [techdebt] 2nd of the 3 tech debt series of blog posts (after the 0th
introduction one on Code Health) is in drafting.
** [CI] We’ve migrated almost all of the tox (python) jobs to the new
container based CI infra.
=== Scoring Platform ===
* Blocked by:
** scap and network issues preventing us from continuing to test new ORES
** We've been having fun with heavy outages for the last day; things seem
to have stabilized a few hours ago.
=== Search Platform ===
* Blocked by: none
* Blocking: none
* Merged Wikidata description indexing, starting reindex after deploy
* Finished review of Serbian morphological libraries, some promise but
needs bugfixes https://phabricator.wikimedia.org/T178926#3790458
* Working on improvements to LTR training
* Improving completion suggester interaction with namespaced prefix search
* Working on porting Selenium tests from Ruby to JS
* Working on upgrade to Elastic 5.5
* Working on Wikidata fulltext search
=== Technical Operations ===
** [Releng] blocked on ops updating nodejs-devel base image: <
** s8 replica set of servers for wikidata (
https://phabricator.wikimedia.org/T177208 ) to be live on 9th January-
check scripts/config that assume wikidata is on s5! We will have a small
period of read only that day.
** databases no longer guaranteed to be on port 3306 only (
https://phabricator.wikimedia.org/T178359 )- check also scripts that assume
default/no port (should be transparent for most people)
** service-running was spamming statsd, incident report
== Wikidata ==
* Fiddling with the concept of "sub-entities" in the Wikibase storage
* We are aware of the blocker for the MediaInfo team, planning to tackle it
this week: https://phabricator.wikimedia.org/T177022
* Last name of thiemowmde changed, can I get an LDAP update?
=== Fundraising Tech ===
* Banners are up! Watching the money roll in
* Mostly working on visualizations - our funky internal dashboard and more
data for grafana
* Helping fr-not-tech get good info, debugging campaign configuration
* Deployed one low-level CiviCRM thing to deal better with database issues.
=== MediaWiki Platform ===
* Continuing on schema changes for MCR, etc...