On Tue, Mar 1, 2016 at 3:36 PM, David Strine <dstrine(a)wikimedia.org> wrote:
> We will be holding this brownbag in 25 minutes. The Bluejeans link has
I'm not familiar with Bluejeans and may have missed a transition
because I wasn't paying enough attention. Is this some kind of
experiment? Have all meetings transitioned to this service?
Anyway, my immediate question at the moment is: how do you join without
sharing your microphone and camera?
Am I correct in thinking that this is an entirely proprietary stack
that's neither gratis nor libre and has no on-premise (non-cloud)
hosting option? Are we paying for this?
Following the recent outage, we've had a new series of complaints
about the lack of improvements in CX, especially related to
server-side activities like saving/publishing pages.
Now, I know the team is involved in a long-term effort to merge the
editor with the VE, but is there an end in sight for that effort? Can
I tell people who ask "look, 6 more months then we'll have a much
better translation tool"?
Is there a publicly available roadmap for this project and more
generally, for CX?
(Reposting https://www.mediawiki.org/wiki/Topic:Tyvfh19mba4pway9 here to
garner more input.)
I'm working on Extension:GlobalPreferences and trying to figure out how
best to do things with all preferences, after they've been defined (in
order to show various extra Preferences-form bits and pieces that depend
on knowing about all preferences). At the moment, we're using
$wgExtensionFunctions and hacking the $wgHooks global to add a new
callback at the end of $wgHooks['GetPreferences'].
One idea is to add a new MediaWiki service called 'PreferencesFactory',
that can be used to retrieve a new Preferences object. Extensions would
then be able to use the MediaWikiServices hook to redefine the
PreferencesFactory (with MediaWikiServices::redefineService()). Of
course, only one extension would be able to do that (which may or may
not be a problem).
Apart from being able to override the Preferences class, a service for
this would also mean the Preferences class could be refactored
(gradually?) to not be such a collection of static methods.
The proposed patch is: https://gerrit.wikimedia.org/r/#/c/374451/
I'd love to hear anyone's ideas about this, including completely
different and better ways to do things. :-)
Another idea is to add a new hook, after GetPreferences. This wouldn't
be as flexible as the PreferencesFactory idea, but is a lot simpler.
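To illustrate the trade-off discussed above, here is a minimal, language-agnostic sketch (in Python, not MediaWiki's actual PHP implementation) of a service container where a named service is built lazily from a factory and a redefine operation swaps that factory out. It shows why only one extension can effectively redefine a service: the last redefinition wins. All names here are illustrative.

```python
# Minimal sketch of the service-container pattern under discussion
# (illustrative only; MediaWiki's MediaWikiServices works differently
# in detail). A service is created lazily from a factory callable, and
# redefine_service() replaces the factory before first use.

class ServiceContainer:
    def __init__(self):
        self._factories = {}
        self._instances = {}

    def define_service(self, name, factory):
        if name in self._factories:
            raise ValueError(f"service '{name}' already defined")
        self._factories[name] = factory

    def redefine_service(self, name, factory):
        # Replaces the factory; a second caller silently overrides the
        # first, which is why only one extension can do this cleanly.
        if name not in self._factories:
            raise KeyError(f"service '{name}' is not defined")
        if name in self._instances:
            raise RuntimeError(f"service '{name}' already instantiated")
        self._factories[name] = factory

    def get_service(self, name):
        # Lazy instantiation: the factory runs at most once.
        if name not in self._instances:
            self._instances[name] = self._factories[name](self)
        return self._instances[name]
```

A hypothetical 'PreferencesFactory' service would be defined by core and redefined by at most one extension before anything requests it.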
The voting phase of the 2017 Community Wishlist Survey has now started.
Read the proposals and support the ones you want to see happen.
Click on the categories to find the proposals. The voting will close on
That's the important part of this email. Feel free to follow the link above
and start voting right now.
The longer version:
The Community Wishlist Survey decides what the Wikimedia Foundation
Community Tech team will work on over the next year. The team is responsible
for addressing the top 10 wishes on the list, as well as some wishes from
smaller groups and projects that are doing important work, but don't have
the numbers to get their proposal into the top 10. The Wishlist is also
used by volunteer developers and other teams, who want to find projects to
work on that the community really wants.
Come help set the agenda.
If you want to see what the team has done in 2017, see the status report
from last month:
What you can do now:
*) Vote. This is the most important thing.
*) Spread the word. We really want people to find this, of course, and
we'll work on finding the best balance between spreading the news to
everyone and not being annoying, but please do help to spread the
information in your local community – Village Pump equivalents, IRC
channels, social media groups and so on.
*) Help translate the pages. We want the process to be as available as
possible for everyone, and it's hardly that if it's only in English.
*) If you want to get short updates through the notification system, you
can sign up for the Community Tech Newsletter:
I'm working on the database schema for Multi-Content-Revisions (MCR)
<https://www.mediawiki.org/wiki/Multi-Content_Revisions/Database_Schema> and I'd
like to get rid of the rev_sha1 field:
Maintaining revision hashes (the rev_sha1 field) is expensive, and becomes more
expensive with MCR. With multiple content objects per revision, we need to track
the hash for each slot, and then re-calculate the sha1 for each revision.
That's expensive especially in terms of bytes per database row, which
impacts overall database performance.
So, what do we need the rev_sha1 field for? As far as I know, nothing in core
uses it, and I'm not aware of any extension using it either. It seems to be used
primarily in offline analysis for detecting (manual) reverts by looking for
revisions with the same hash.
Is that reason enough for dragging all the hashes around the database with every
revision update? Or can we just compute the hashes on the fly for the offline
analysis? Computing hashes is slow since the content needs to be loaded first,
but it would only have to be done for pairs of revisions of the same page with
the same size, which should be a pretty good optimization.
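The on-the-fly approach described above can be sketched roughly as follows (a hypothetical offline-analysis helper, not existing MediaWiki code): group revisions by page and content size, and compute SHA-1 only within groups where a collision is even possible.

```python
import hashlib
from collections import defaultdict

# Sketch of revert detection without a stored rev_sha1 (illustrative,
# not MediaWiki code). Only revisions of the same page with the same
# byte length can have identical content, so we hash only those.
# `revisions` is assumed to be an iterable of (rev_id, page_id, content).

def find_identical_revisions(revisions):
    by_page_and_size = defaultdict(list)
    for rev_id, page_id, content in revisions:
        by_page_and_size[(page_id, len(content))].append((rev_id, content))

    matches = []
    for group in by_page_and_size.values():
        if len(group) < 2:
            continue  # a unique size can't match any other revision
        by_hash = defaultdict(list)
        for rev_id, content in group:
            sha1 = hashlib.sha1(content.encode("utf-8")).hexdigest()
            by_hash[sha1].append(rev_id)
        for rev_ids in by_hash.values():
            if len(rev_ids) > 1:
                matches.append(rev_ids)  # likely manual revert(s)
    return matches
```

In practice content would be loaded lazily per group, which is exactly the optimization the size filter buys.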
Also, I believe Roan is currently looking for a better mechanism for tracking
all kinds of reverts directly.
So, can we drop rev_sha1?
Principal Platform Engineer
Gesellschaft zur Förderung Freien Wissens e.V.
Handling of usernames in imported edits in MediaWiki has long been weird
(T9240 was filed in 2006!).
If the local user doesn't exist, we get a strange row in the revision table
where rev_user_text refers to a valid name while rev_user is 0, which
typically indicates an IP edit. Someone can later create the name, but
rev_user remains 0, so, depending on which field a tool looks at, the
revision may or may not be considered to actually belong to that user.
If the local user does exist when the import is done, the edit is
attributed to that user regardless of whether it's actually the same user.
See T179246 for an example where imported edits got attributed to the
wrong account in pre-SUL times.
In Gerrit change 386625 I propose to change that.
- If revisions are imported using the "Upload XML data" method, it will
be required to fill in a new field to indicate the source of the edits,
which is intended to be interpreted as an interwiki prefix.
- If revisions are imported using the "Import from another wiki" method,
the specified source wiki will be used as the source.
- During the import, any usernames that don't exist locally (and can't
be auto-created via CentralAuth) will be imported as an
otherwise-invalid name, e.g. an edit by User:Example from source 'en' would
be imported as "en>Example".
- There will be a checkbox on Special:Import to specify whether the same
should be done for usernames that do exist locally (or can be created) or
whether those edits should be attributed to the existing/autocreated local
user.
- On history pages, log pages, and the like, these usernames will be
displayed as interwiki links, much as might be generated by wikitext like "
[[:en:User:Example|en>Example]]". No parenthesized 'tool' links (talk,
block, and so on) will be generated for these rows.
- On WMF wikis, we'll run a maintenance script to clean up the existing
rows with valid usernames and rev_user = 0. The current plan there is to
attribute these edits to existing SUL users where possible and to prefix
them with a generic prefix otherwise, but we could as easily prefix them
all.
- Unfortunately, it's impossible to retroactively determine the actual
source of old imports automatically, or to automatically do anything about
imports that were misattributed to a different local user in the past.
- The same will be done for CentralAuth's global suppression blocks.
In this case, on WMF wikis we can safely point them all at Meta.
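The attribution rule proposed above can be sketched as a small function (names and signature are illustrative, not MediaWiki's actual API). Because ">" is invalid in real usernames, a prefixed name like "en>Example" can never collide with a local account.

```python
# Sketch of the proposed import-attribution rule (illustrative only).
# `local_users` stands in for the local user table; `use_local` mirrors
# the proposed Special:Import checkbox.

def imported_username(name, source_prefix, local_users, use_local=True):
    if use_local and name in local_users:
        return name  # attribute to the existing/autocreated local account
    # ">" is invalid in usernames, so this can't collide with any account.
    return f"{source_prefix}>{name}"
```

An edit by User:Example imported from 'en' with no matching local account would thus be stored as "en>Example".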
If you have comments on this proposal, please reply here or on
Background: The upcoming actor table changes require some change to the
handling of these imported names because we can't have separate attribution
to "Example as a non-registered user" and "Example as a registered user"
with the new schema. The options we've identified are:
1. This proposal, or something much like it.
2. All the existing rows with rev_user = 0 would have to be attributed
to the existing local user (if any), and in the future when a new user is
created any existing edits attributed to that name will be automatically
attributed to that new account.
3. All the existing rows with rev_user = 0 and an existing local user
would have to be re-attributed to different *valid* usernames, probably
randomly generated in some manner, and in the future when a new user is
created any existing edits for that name would have to be similarly
re-attributed.
4. Like #2, except the creation (including SUL auto-creation) of the
same-named account would not be allowed. Thus, an import before the local
name exists would forever block that name from being used for an actual
account.
"when a new user is created" options from #2–4.
Of these options, this proposal seems like the best one.
: ">" was chosen rather than the more typical ":" because the former is
already invalid in all usernames (and page titles). While a colon is *now*
disallowed in new usernames, existing names created before that restriction
was added can continue to be used (and there are over 12000 such usernames
in WMF's SUL) and we decided it'd be better not to suddenly break them.
Brad Jorsch (Anomie)
Senior Software Engineer
The Technical Collaboration team proposes the creation of a developer
support channel focusing on newcomers, as part of our Onboarding New
Developer program. We are proposing to create a site based on Discourse
(starting with a pilot in discourse-mediawiki.wmflabs.org) and to point the
many existing scattered channels there.
This is an initial proposal for a pilot. If the pilot is successful, we
will move it to production. For that to happen, we still need to sync well
with Wikimedia Cloud Services, Operations, and the Wikimedia technical
community.
Please check https://www.mediawiki.org/wiki/Discourse and share your
feedback.
Engineering Community Manager @ Wikimedia Foundation
(sorry for cross-posting ...)
We're really happy to announce that the new AdvancedSearch interface was
deployed as a beta feature to MediaWiki.org and the test wiki just two
hours ago.
AdvancedSearch enhances Special:Search with an advanced-parameters form
and aims to make existing search options more visible and accessible for
everyone.
The feature is a project by WMDE's Tech team and originates from the
Technical Wishes project. Many other people and teams helped make this
project happen, so stay tuned for the thank-you section at the end of this
email. :-)
The search has great options for performing advanced queries, but often
even experienced editors don't know about them. This is what we found out
while conducting a workshop series on advanced search in several German
cities in 2016. Together with contributors from the German Wikipedia, we
discussed their desired search queries and explained how the syntax works
and how keywords like "hastemplate", "filetype" or "intitle" can be used
and combined to get the desired results.
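As a rough illustration of what an advanced-search form does behind the scenes, a few form fields can be combined into a single keyword query string. The field names below are illustrative, not AdvancedSearch's actual parameters.

```python
# Hypothetical sketch: translate form fields into a combined search
# query using CirrusSearch-style keywords ("hastemplate:", "intitle:",
# "filetype:"). Not AdvancedSearch's actual implementation.

def build_search_query(terms, hastemplate=None, intitle=None, filetype=None):
    parts = [terms] if terms else []
    if hastemplate:
        parts.append(f'hastemplate:"{hastemplate}"')
    if intitle:
        parts.append(f'intitle:{intitle}')
    if filetype:
        parts.append(f'filetype:{filetype}')
    return " ".join(parts)
```

For example, searching pages titled "Museum" that use a template named "Infobox" would combine into one query string, sparing the user from memorizing the syntax.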
The idea for the AdvancedSearch feature resulted from these workshops,
where we not only discussed search but also designed the first mocks for
an advanced search interface.
*The first version, more deployments and other next steps*
AdvancedSearch supports some of the special search options the WMF's
search team has implemented over the last years. The way the interface
works, users don't have to know the syntax behind each search field, but
they can learn about it if they want to. The first version of the feature
comes with a first selection of advanced search options, including support
for "hastemplate" and "intitle". The WMF's search team has started to work
on a 'deepcat' functionality to make subcategory search happen. We plan to
add support for this in the future, too.
If all goes well, we plan to deploy the beta feature on German and Arabic
Wikipedia by Wednesday, Nov 29. 
In the next 2-3 months, we'd love to invite everyone to test the new
feature on those wikis: comments, thoughts, bug reports ... any feedback
is much appreciated.
Please see https://www.mediawiki.org/wiki/Help:AdvancedSearch for how to
use the feature.
Deployments to other wikis can follow later.
*Thank you (finally!)*
This project was developed with the support from many different people from
the start, and this makes it very special to us: We would like to thank the
folks from the German Wikipedia community who participated in the workshops
and who discussed and designed the first ideas together with us. We
would like to thank everyone else who gave valuable feedback on
Phabricator, on-wiki, and in real life. We would like to thank the Arabic
Wikipedia community for getting interested in AdvancedSearch, and for
offering to test this feature in an early stage. The credit for the
development work on AdvancedSearch goes to our FUN team, supported by the
team that usually works on Technical Wishes. Last but really not least we
would like to thank the great folks from the WMF's search team who have
supported this project from the start and who have done (and still do) the
necessary backend work that the AdvancedSearch interface now builds on.
Sorry for the long email, and thanks for reading :-)
Birgit (for the Technical Wishes team)
 https://phabricator.wikimedia.org/T180147 (deployment ticket, group 0)
 https://phabricator.wikimedia.org/T180128 (deployment ticket, arwiki and
dewiki)
Community Communications Manager
Software Development and Engineering
Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
Imagine a world in which every single person can freely share in the sum
of all knowledge. Help us make it happen!
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as charitable by
the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.