This is the weekly TechCom board review, usually sent in preparation for our
meeting. It is a few days late this week because I forgot. Apologies.
Activity since Monday 2020-10-06 on the following boards:
https://phabricator.wikimedia.org/tag/techcom/
https://phabricator.wikimedia.org/tag/techcom-rfc/
Committee inbox:
- T263904 <https://phabricator.wikimedia.org/T263904>: Are traits part
of the stable interface?
- T239742 <https://phabricator.wikimedia.org/T239742>: "Should npm
packages maintained by Wikimedia be scoped or unscoped?"
- Both have been sitting in the inbox for a month or so; Tim first
pointed this out a few weeks ago, but we haven't triaged them yet.
Committee board activity (none)
New RFCs (none)
Phase progression (none)
IRC meeting request (none)
Other RFC activity:
- T262946: <https://phabricator.wikimedia.org/T262946> Bump Firefox
version in basic support to 3.6 or newer
- Some agreement between Volker and Timo to keep the scope at just the
Firefox version bump and move this RFC forward. I think Volker is welcome
to move this back to P4, unless I missed some other ongoing discussion?
- T259771: <https://phabricator.wikimedia.org/T259771> RFC: Drop support
for database upgrade older than two LTS releases
- Discussion about upgrading from 1.16 (this humble reader says
upgrading from a version that's 10 years old should not be a priority)
Hello,
This email contains updates for October 14, 2020. For the HTML version,
see: https://www.mediawiki.org/wiki/Scrum_of_scrums/2020-10-14
Cheers,
Deb
------------------------
*= 2020-10-14 =*
== Callouts ==
* SRE: FYI, Datacenter switchback to eqiad happening on the 26th.
== Product ==
=== iOS native app ===
* Updates:
** Released minor bug fix version 6.7.2
*** Contains widget bug fixes (mainly widget midnight update smearing for
[[phab:T264881|T264881]])
** In development on [[phab:tag/ios-app-bonefish-on-a-balloon/|6.7.3]]
*** Article as a Living Document experiment
=== Structured Data ===
* Updates:
** Working towards a beta release of MediaSearch
=== Abstract Wikipedia ===
* Updates:
** Voting in first round of
[[metawiki:Abstract_Wikipedia/Wiki_of_functions_naming_contest|the
community naming contest]] for what to call the central wiki of functions
has now wrapped up.
** Landed initial work migrating the Vue editing interface into a hybrid
view/edit model (JS-only until we do server-side rendering).
** Continuing work on using ZType data to enforce structure when editing
ZObjects.
== Technology ==
=== Site Reliability Engineering ===
* Updates:
** Datacenter switchback to eqiad happening on the 26th.
--
deb tankersley (she/her)
sr program manager, engineering
Wikimedia Foundation
It has been a while since we had Thank You Tuesdays, but this one needs a
shout out.
Thank you, Manuel, for the epic work of cleaning up more than one thousand
schema drifts in s3 (small wikis), from missing indexes to uncleaned fields
and much more:
https://phabricator.wikimedia.org/T260111
Best
--
Amir (he/him)
Hello,
The wikis are still on 1.36.0-wmf.10, as we are still working on a user
authentication issue from last week that prevented us from rolling out
1.36.0-wmf.11 [T264370].
We are effectively cancelling this week's deployment of 1.36.0-wmf.12. We
will start rolling out 1.36.0-wmf.11 on Tuesday and reassess from there.
The changes scheduled for 1.36.0-wmf.12 will thus be deployed as part of
the next train: 1.36.0-wmf.13.
Recap:
1.36.0-wmf.10, fully deployed Thursday, Sep 24th
1.36.0-wmf.11, rolled back Thursday, Oct 1st
1.36.0-wmf.12, scheduled for this week and hereby canceled
We aim to deploy wmf.11 on Tuesday, October 13th, and wmf.13 in the
days that follow.
[T264370] User authentication security issue (Oct 1)
https://phabricator.wikimedia.org/T264370
--
Antoine "hashar" Musso & cie
Release Engineering
Dear all,
We’re really happy to announce the second edition of the Coolest Tool Award
<https://meta.wikimedia.org/wiki/Coolest_Tool_Award>!
Tools play an essential role at Wikimedia, and so do the many volunteer
developers who experiment with new ideas, develop & maintain local & global
solutions and enhance the experience for Wikimedia communities.
There are incredibly many great tools out there. It’s time to celebrate
this & to make the great work volunteer developers do more visible to
everyone :-)
The Coolest Tool Award ceremony will take place virtually this year, given
the current circumstances around events and travel. We will provide more
details soon about the specific logistics and dates.
The award is organized & selected by the *Coolest Tool Academy 2020*
<https://meta.wikimedia.org/wiki/Coolest_Tool_Award#Coolest_Tool_Award_2020>.
We plan to recognize the greatest tools in a variety of categories; for
examples, you can look at last year’s categories
<https://meta.wikimedia.org/wiki/Coolest_Tool_Award/2019>.
As no one can possibly know all the cool tools out there, we’re looking for
some help & inspiration: please point us to the tools that you think are
great - for any reason you can think of!
Please use this form:
https://docs.google.com/forms/d/e/1FAIpQLSf5ZmXXamn9sRsagEiiZcUZDn1Ga0sF3Xm…
to recommend tools *by October 14, 2020*. You can nominate as many tools as
you want by filling out the form multiple times.
This survey will be conducted via a third-party service, which may subject
it to additional terms. For more information on privacy and data-handling,
see the survey privacy statement:
https://foundation.wikimedia.org/wiki/Coolest_Tool_Award_2020_Survey_Privac…
Thank you very much for your ideas & recommendation(s)!
We will continue to spread the word over the next 1-2 days, but if you get
the chance, please feel welcome to share this information with others too!
Thanks :-)
Joaquin, for the Coolest Tool Academy 2020
--
Joaquin Oltra Hernandez
Developer Advocate - Wikimedia Foundation
Hello,
It has been a while since I gave an update on the state of abstracting
the schema and schema changes in MediaWiki
<https://phabricator.wikimedia.org/T191231>. So here's a really long one.
So far, around half of the MediaWiki core tables have been migrated to the
abstract schema (plus lots of extensions like Wikibase, Babel, Linter,
BetaFeatures, etc.). Special thanks to Tgr for reviewing most of the
patches, and to Sam Reed and James Forrester for doing the extensions.
With the growing number of schemas being abstracted, this is going to
affect your development if you work on schema and schema changes in core or
any of the extensions. So if you do, please read Manual:Schema changes
<https://www.mediawiki.org/wiki/Manual:Schema_changes> on mediawiki.org.
You might think that abstraction is just migrating SQL to JSON, but it's
much more: we are making the database schema of MediaWiki much more
consistent. We are basically addressing several long-standing issues, like
T164898 <https://phabricator.wikimedia.org/T164898> and T42626
<https://phabricator.wikimedia.org/T42626>, as well.
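To give a flavor of what the abstraction buys us, here is a rough sketch of
how one abstract column definition fans out into per-DBMS DDL. The table
and column names, and the exact generated output, are made up for
illustration; this is not a real generated file:

    -- Possible MySQL output for an abstract "binary" column, length 255,
    -- not null (using the usual MediaWiki placeholders):
    CREATE TABLE /*_*/demo (
      demo_name VARBINARY(255) NOT NULL
    ) /*$wgDBTableOptions*/;

    -- Postgres output from the same abstract source:
    -- CREATE TABLE demo (demo_name TEXT NOT NULL);

    -- Sqlite output from the same abstract source:
    -- CREATE TABLE /*_*/demo (demo_name BLOB NOT NULL);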
*Improvement aspects*
The first aspect is drifts between different DBMSes. The Sqlite schema is
produced by regex replacement (this code
<https://github.com/wikimedia/mediawiki/blob/c477bcf2c5c482d3189ec3579c5dee4…>),
which is less than great, but at least it comes from one place. The
Postgres schema, however, has drifted so drastically from MySQL/Sqlite that
fixing it has so far required 76 schema changes, addressing issues ranging
from missing indexes to missing PKs, extra AUTO_INCREMENT where it
shouldn't be, missing DEFAULT values, drifting data types and much more.
You can follow the Postgres fixes here
<https://phabricator.wikimedia.org/T164898>.
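To make that concrete, drift fixes of this kind typically boil down to one
small DDL statement per drift; something like the following (hypothetical
Postgres statements, not actual patches from the ticket):

    ALTER TABLE demo ADD PRIMARY KEY (demo_id);             -- missing PK
    CREATE INDEX demo_name ON demo (demo_name);             -- missing index
    ALTER TABLE demo ALTER COLUMN demo_flag SET DEFAULT 0;  -- missing DEFAULT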
The second aspect is inconsistency in the schema itself. How do we model
strings? VARCHAR? VARBINARY()? VARCHAR() BINARY? (All three are different
things.) You'd be surprised how inconsistent our MySQL schema is. We are
migrating all VARCHAR() BINARY fields to VARBINARY() (ten schema changes
so far).
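If the difference between the three isn't obvious, here's a minimal MySQL
illustration (hypothetical table):

    CREATE TABLE demo (
      a VARCHAR(255),         -- character data; compared using the column's
                              -- collation, usually case-insensitively
      b VARBINARY(255),       -- raw bytes; no character set, compared byte
                              -- by byte, length counted in bytes
      c VARCHAR(255) BINARY   -- still character data, but using the binary
                              -- collation of the column's character set
    );
    -- e.g. 'abc' = 'ABC' is true for column a but false for b and c.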
Another inconsistency is timestamps. In MySQL, around half of them are
BINARY(14) and the other half VARBINARY(14) (while in Postgres all are
TIMESTAMPTZ); there is even a ticket
<https://phabricator.wikimedia.org/T42626> about it. It makes sense to
migrate all of them to BINARY(14), but not all timestamps are 14
characters; e.g. expiry fields accept "infinity" as a value, and it's a
valid timestamp in Postgres ¯\_(ツ)_/¯ When you turn an expiry field into
BINARY(14), "infinity" gets padded to 14 bytes, and as a result MediaWiki
no longer recognizes it as infinity (the padded value != "infinity").
There are several ways to move forward handling expiry fields; you can
follow the discussion in this gerrit patch
<https://gerrit.wikimedia.org/r/c/mediawiki/core/+/631936>.
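The padding problem can be reproduced in a few lines of MySQL (hypothetical
table):

    CREATE TABLE demo (demo_expiry BINARY(14));
    INSERT INTO demo VALUES ('infinity');
    -- BINARY(14) right-pads the 8-byte value with 0x00 up to 14 bytes,
    -- so the stored value no longer equals the literal string:
    SELECT demo_expiry = 'infinity' FROM demo;  -- returns 0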
Another fun aspect: booleans. MySQL doesn't have a real boolean type; it
translates BOOL to TINYINT(1). Other DBMSes don't have TINYINT, though they
do have SMALLINT and BOOL (and we mostly use SMALLINT for them), so we
decided to go with SMALLINT for these cases. That's different from what
Doctrine DBAL does (it uses BOOL), so we introduced our own custom type for
booleans.
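Concretely, the same abstract boolean field ends up something like this per
DBMS (illustrative, not the exact generated DDL):

    -- MySQL (BOOL is just an alias for TINYINT(1) anyway):
    CREATE TABLE demo (demo_is_hidden TINYINT(1) NOT NULL DEFAULT 0);

    -- Postgres and Sqlite, via our custom Doctrine type:
    -- CREATE TABLE demo (demo_is_hidden SMALLINT NOT NULL DEFAULT 0);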
Last but not least: ENUMs. MySQL and Postgres support them, but Sqlite
doesn't. Doctrine DBAL doesn't support ENUM at all (as it's an
anti-pattern), while core has eight fields that are ENUMs. There's an RFC
<https://phabricator.wikimedia.org/T119173> to discourage using it in
general; feel free to comment on it.
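For reference, the portability problem in a nutshell (hypothetical column;
one portable alternative is a plain string or integer column with the
allowed values enforced in application code):

    -- Works in MySQL; Postgres needs a separate CREATE TYPE first;
    -- Sqlite has no ENUM at all:
    CREATE TABLE demo (
      demo_status ENUM('active', 'deleted') NOT NULL
    );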
A miscellaneous note: the directories that hold the archive of SQL patches
for schema changes are exploding (some of the SQL patches are even
orphaned, but we can't find them because there are so many). So I started
an RFC to clean that mess up: Drop support for database upgrade older than
two LTS releases <https://phabricator.wikimedia.org/T259771>
*What's next?*
- We continue to migrate more tables; hopefully we will get two thirds
of them done by the end of the year (fingers crossed). You can follow the
progress in its ticket <https://phabricator.wikimedia.org/T230428>.
- We will support abstract schema changes really soon, like in a
couple of weeks. Basically, you write a JSON file containing before and
after snapshots of a table, and then a maintenance script produces the
needed SQL patches for you for the different DBMSes. This will increase
developer productivity drastically, since 1- schema change SQL files become
more reliable, more consistent and less prone to errors like adding the
column to the wrong table in some DBMSes
<https://gerrit.wikimedia.org/r/c/mediawiki/core/+/328377/22/maintenance/mss…>
and 2- you don't need to know Postgres or Sqlite peculiarities to make
patches against them. The reason you need to provide the whole table even
for something like adding an index is that Sqlite doesn't support all types
of ALTER TABLE; in some cases you have to create temporary tables, move the
data around, and then rename and drop, producing beautiful SQL patches like
this
<https://gerrit.wikimedia.org/r/c/mediawiki/core/+/630341/1/maintenance/sqli…>
(see the sketch after this list).
- We are working on improving the script that reports drifts between
core and our production databases. I have already made it work with
abstract schemas as well, and I will continue working on it to report even
smaller differences like field size, type, etc., which is now much easier
thanks to the abstract schema. Slowly we will migrate that script to
production (as part of the SRE scripts), and we will do automated reports
and automated drift fixes (on small wikis). You can follow the work in
this ticket <https://phabricator.wikimedia.org/T104459>. So far, this
script has been run manually, but it has found many thousands of drifts
across the cluster, and all have been fixed thanks to our amazing DBAs
(look at the ticket).
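As promised above, here is a sketch of the Sqlite rebuild pattern the
generator has to emit for changes that Sqlite's limited ALTER TABLE can't
express, such as changing a column's type (all names are made up):

    CREATE TABLE demo_tmp (
      demo_id INTEGER NOT NULL PRIMARY KEY,
      demo_expiry BLOB NOT NULL          -- the column whose type changed
    );
    INSERT INTO demo_tmp SELECT demo_id, demo_expiry FROM demo;
    DROP TABLE demo;
    ALTER TABLE demo_tmp RENAME TO demo;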
*How can I help?*
Glad you asked! You can follow the abstract-schema
<https://gerrit.wikimedia.org/r/q/hashtag:%22abstract-schema%22+(status:open…>
hashtag in gerrit and review patches, or you can make them yourself
(familiarize yourself using the documentation
<https://www.mediawiki.org/wiki/Manual:Schema_changes>). If you maintain an
extension, feel free to migrate its table(s) (and track it in this ticket
<https://phabricator.wikimedia.org/T259374>). If you use Postgres for
MediaWiki, please help us with testing the improvements for Postgres.
Thanks for reading this long email!
Best
--
Amir (he/him)
I'm hoping to gather a bit of information on the current use of
MediaWiki-Vagrant by folks in the Wikimedia movement. I have started a
slowvote poll in Phabricator at
<https://phabricator.wikimedia.org/V24> that asks one question: "Do
you personally use MediaWiki-Vagrant to develop or test software for
the benefit of the Wikimedia movement?"
I would really appreciate folks taking a couple of minutes to click
through to the poll and select one of the possible responses:
* Yes, I use it daily
* Yes, I use it occasionally
* No, I used to use it but have switched to another dev environment
* No, I tried it but it never worked well
* No, I have never used it
Data collected in the poll will be used to help me and others decide
if it is worth putting in additional work to revitalize the group
maintaining and improving MediaWiki-Vagrant.
Bryan
--
Bryan Davis Technical Engagement Wikimedia Foundation
Principal Software Engineer Boise, ID USA
[[m:User:BDavis_(WMF)]] irc: bd808