Hello,
we are planning to change how Cloud VPS instances and Toolforge tools contact
WMF-hosted wikis, in particular the source IP address for the network connection.
The new IP address that wikis will see is 185.15.56.1.
The change is scheduled to go live on 2021-02-08.
More detailed information is available on Wikitech:
https://wikitech.wikimedia.org/wiki/News/CloudVPS_NAT_wikis
If you are a Cloud VPS user or Toolforge developer, check your tools after that
date to make sure they are running properly. If you detect a block, a rate limit
or anything similar, please let us know.
If you are a WMF SRE or an engineer involved with the wikis, please be aware
that this address could generate a significant traffic volume, perhaps about
30-40% of total wiki edits. We are trying to make the change as smooth as
possible, so please send us your feedback if you think there is something we
haven't accounted for yet.
Thanks, best regards.
--
Arturo Borrero Gonzalez
SRE / Wikimedia Cloud Services
Wikimedia Foundation
Hello,
There has been a lot of progress in the abstract schema and abstract schema
changes initiative since the last time
<https://lists.wikimedia.org/pipermail/wikitech-l/2020-October/093954.html>
I gave an update on it. So here's another one.
*Abstract Schema*
So far, more than 90% (51 out of 56) of the tables of MediaWiki core have been
migrated to abstract schema.
This means much smaller schema drifts between MySQL and Postgres. We have done
more than 250 schema changes in Postgres to fix these drifts, including 56
index renames, 66 data type changes, defaults set on 43 fields, and
nullability changes on 29 fields. For comparison, that's more schema changes
than were done on Postgres from 2014 to 2020. Once we have migrated all
tables, we can close this four-year-old ticket
<https://phabricator.wikimedia.org/T164898>.
A similar improvement has happened in standardizing timestamp fields in MySQL
<https://phabricator.wikimedia.org/T42626>; once all tables are migrated,
we can call that eight-year-old ticket done too.
One nice thing about having an abstract schema is that you can generate
documentation automatically. This page
<https://www.mediawiki.org/w/index.php?title=User:Ladsgroup/Test&oldid=43795…>
was made completely automatically from tables.json. We can have it regenerated
on doc.wikimedia.org on every merge, and we can also have the database layout
diagram
<https://www.mediawiki.org/w/index.php?title=Manual:Database_layout/diagram&…>
created automatically.
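To illustrate the idea, here is a minimal sketch of generating documentation from an abstract schema. The inline schema excerpt is a simplified, hypothetical approximation of the tables.json format, not the real file, and the rendering function is my own:

```python
import json

# Simplified, hypothetical excerpt in the spirit of MediaWiki's tables.json;
# the real abstract schema format has more fields and options.
SCHEMA = json.loads("""
[
  {
    "name": "sites",
    "columns": [
      {"name": "site_id", "type": "integer", "options": {"notnull": true}},
      {"name": "site_global_key", "type": "binary",
       "options": {"notnull": true, "length": 64}}
    ]
  }
]
""")


def render_doc(tables):
    """Render a plain-text documentation stub from an abstract schema."""
    lines = []
    for table in tables:
        lines.append(f"== {table['name']} ==")
        for col in table["columns"]:
            opts = col.get("options", {})
            nullable = "NOT NULL" if opts.get("notnull") else "NULL"
            lines.append(f"* {col['name']}: {col['type']} ({nullable})")
    return "\n".join(lines)


print(render_doc(SCHEMA))
```

Because the schema is plain data, the same input can feed a wiki-page generator, a doc.wikimedia.org build step, or a diagram tool without any hand-maintained duplication.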
Another nice thing: when you have an abstract schema, you can easily write
tests and enforce database conventions. For example, you could write a test
to make sure all tables have exactly five columns (because five is your
lucky number). We haven't written such a test, but there is now a test that
enforces a uniform prefix for columns and indexes of tables in core
<https://phabricator.wikimedia.org/T270033>. We are currently fixing its
violations to standardize our schema even more.
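The idea behind such a convention test can be sketched roughly like this. This is a toy approximation, not the actual test in core; the helper name and the "most common prefix wins" heuristic are my own:

```python
# Toy sketch of a uniform-prefix convention check in the spirit of T270033;
# the real test in MediaWiki core is more involved.

def check_uniform_prefix(table):
    """Return column names that don't share the table's dominant prefix."""
    prefixes = [c["name"].split("_", 1)[0] for c in table["columns"]]
    expected = max(set(prefixes), key=prefixes.count)  # most common prefix
    return [
        c["name"]
        for c in table["columns"]
        if not c["name"].startswith(expected + "_")
    ]


table = {
    "name": "page",
    "columns": [{"name": "page_id"}, {"name": "page_title"}, {"name": "pp_value"}],
}
print(check_uniform_prefix(table))  # -> ['pp_value']
```

Run over every table in tables.json, a check like this turns a naming convention into an enforced invariant rather than a code-review hope.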
I'm planning to make reporting on drifts between the abstract schema and
our production databases completely automated and make it accessible to DBAs
for further investigation, which is now much easier thanks to the abstract
schema. You can follow the progress of that work in this ticket:
<https://phabricator.wikimedia.org/T104459>
*Abstract Schema Changes*
We now have a new maintenance script that produces schema change SQL files
(a.k.a. ALTER TABLE files) based on snapshots of a table's abstract schema
before and after the change. Here's an example of an index rename.
<https://gerrit.wikimedia.org/r/c/mediawiki/core/+/651176> This makes
creating schema change patches much easier: there is still a little bit of
work, but you no longer need to know the internals of Postgres, and it's
less prone to mistakes.
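As a rough illustration of the approach (not the actual maintenance script, which handles many more change types and per-DBMS SQL dialects), diffing a before/after snapshot for index renames might look like:

```python
# Rough illustration of diffing two abstract-schema snapshots to emit an
# index-rename ALTER; the function and snapshot shapes here are assumptions
# for the sketch, not the real script's interface.

def diff_indexes(before, after):
    """Emit ALTER TABLE statements for indexes renamed between snapshots."""
    old = {tuple(i["columns"]): i["name"] for i in before["indexes"]}
    new = {tuple(i["columns"]): i["name"] for i in after["indexes"]}
    return [
        f"ALTER TABLE {before['name']} RENAME INDEX {old_name} TO {new[cols]};"
        for cols, old_name in old.items()
        if cols in new and new[cols] != old_name
    ]


before = {"name": "page",
          "indexes": [{"name": "name_title",
                       "columns": ["page_namespace", "page_title"]}]}
after = {"name": "page",
         "indexes": [{"name": "page_name_title",
                      "columns": ["page_namespace", "page_title"]}]}
print(diff_indexes(before, after))
```

The point is that the patch author only edits the declarative schema; the SQL is derived mechanically, which is where the reduction in mistakes comes from.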
With the approval of the RFC to drop support for upgrading from versions older
than two LTS releases, we can now drop hundreds and hundreds of SQL files.
That will give us room to breathe, let us audit our SQL files to find orphaned
ones, and improve the abstract schema change work. It is currently blocked on
this patch landing. <https://gerrit.wikimedia.org/r/c/mediawiki/core/+/648576>
We will also work on reshaping schema change handling in general: its current
checks system is less than optimal, its tests are outdated, and there is much
more to do.
*What can we do?*
Glad you asked :D The biggest request I have is for people to migrate
their extensions to abstract schema. There's a list of WMF-deployed
extensions whose schemas have not been migrated yet
<https://phabricator.wikimedia.org/T261912>. This is doubly important because
we want to build a reporting system for drifts in production, and it's not
possible to report these drifts for extensions whose schemas have not been
migrated. So if you or your team maintain an extension from that list, please
prioritize migrating it. Reedy wrote a great script
<https://github.com/Ladsgroup/db-analyzor-tools/blob/master/db_abstractor.py>
that takes an SQL file and produces its equivalent abstract schema, which
gives you a good starting point (PRs are welcome!). Feel free to add me as a
reviewer on patches migrating extensions to abstract schema.
Another thing: if you use Postgres for MediaWiki, you can help test our
Postgres schema by trying master (make sure to take a backup first) and
seeing if everything works.
*Thank you!*
I would really like to thank Ammarpad for migrating lots of core tables
to abstract schema, handling all sorts of edge cases, and doing most of
the work on the uniform-prefix tests and fixes. Thanks to James Forrester
for reviewing lots of patches. Thanks to Reedy for the script and for
abstracting lots of tables in extensions, and to Tgr for helping with
reviews and getting the project going. Also a big thank you to the DBAs for
doing a lot more schema changes in production
<https://phabricator.wikimedia.org/tag/blocked-on-schema-change/>. You rock!
An apology is also warranted for breaking update.php on master twice
(caused by yours truly).
Until next update!
--
Amir (he/him)
(now sharing beyond the WMF tech department)
Dear colleagues and MediaWiki contributors,
On behalf of the Platform Engineering Team, I am delighted to invite you to the
MediaWiki Authority interface[1][2] evaluation during the Platform Engineering
Office Hours[3] on Feb 04 1700 UTC. Dress Formal. Or not ;)
Before we commit to using Authority for permission checking throughout the
codebase, we want to make sure that we didn't miss anything. So if you are
working on code that needs to check user permissions, please join the PET office
hour and give us your feedback!
During the meeting, we will present the new Authority interface, which defines a
standard for checking user permissions, blocks, throttles, etc, and allows us to
easily make the relevant context for permission checks available where it is
needed.
We are going to explain the design and demonstrate its application in various
areas using exploratory patches. We would love to hear your opinion on the
approach we're taking. If you contribute to the MediaWiki or extension
codebase, you will likely have to use the new interface if it's accepted, and
this is your opportunity to raise concerns and objections before the interface
is finalized.
The meeting during Platform Engineering Office Hours will loosely correspond to
the second step of the brand new Technical Decision Process[4], for which we
made a "Decision Statement Overview"[5]. After the initial meeting, you will
have two weeks for feedback on the ticket[2]. At the end of the two-week
period we may schedule a followup meeting, depending on the feedback we
receive.
Cheers. PET.
1. https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+/refs/heads/…
2. https://phabricator.wikimedia.org/T231930
3. https://meet.google.com/pjo-xtxv-oea
4. https://www.mediawiki.org/wiki/Technical_Decision_Making_Process#2_Technica…
5. https://docs.google.com/document/d/1RT3mWt57RkGJdeV5kVH_eoVOBu-97w7sYheepKt…
--
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation
Hello,
We need to restart the MariaDB daemon on our wikitech master (
https://phabricator.wikimedia.org/T272388).
When: Thursday 28th January at 09:00AM UTC
Impact: wikitech will not be available (neither reads nor writes) for 2-5
minutes.
Sorry for any inconvenience.
Manuel.
Hello all,
I would like to announce the release of MediaWiki Language Extension
Bundle 2021.01. This bundle is compatible with MediaWiki 1.34 or above
and requires PHP 7.2 or above.
The next MLEB is expected to be released in 3 months. If there are very
important bug fixes, we will do an intermediate release. Please give
us your feedback at
[[Talk:MLEB|https://www.mediawiki.org/wiki/Talk:MLEB]].
* Download: https://translatewiki.net/mleb/MediaWikiLanguageExtensionBundle-2021.01.tar…
* sha256sum: 4360572704369e5c1f02ab9df831dd0a6258cbcbae28c61c3c4a551347000b00
* Signature: https://translatewiki.net/mleb/MediaWikiLanguageExtensionBundle-2021.01.tar…
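If you want to verify the checksum before installing, here is a small sketch in Python; the local filename in the commented line is a placeholder, so adjust it to whatever you downloaded:

```python
import hashlib

# The expected digest, from the release announcement above.
EXPECTED = "4360572704369e5c1f02ab9df831dd0a6258cbcbae28c61c3c4a551347000b00"


def sha256_of(path, chunk_size=8192):
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


# Hypothetical local filename -- use the tarball you actually downloaded:
# assert sha256_of("MediaWikiLanguageExtensionBundle-2021.01.tar.bz2") == EXPECTED
```

The GPG signature linked above gives a stronger guarantee than the checksum alone, so verifying both is preferable.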
Quick links:
* Installation instructions are at: https://www.mediawiki.org/wiki/MLEB
* Announcements of new releases will be posted to a mailing list:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
* Report bugs to: https://phabricator.wikimedia.org/
* Talk with us at: #mediawiki-i18n @ Freenode
Release notes for each extension are below.
-- Kartik Mistry
== Babel, cldr, CleanChanges and LocalisationUpdate ==
* Localisation and maintenance updates only.
== Translate ==
* Add script to find unsynchronized definitions ({{Gerrit|655860}})
* Various improvements to the transaction export.php script
* Add a script to find and delete equal translations ({{Gerrit|652543}})
* GoogleTranslateWebService: Add Chinese codes to code map ({{Gerrit|649652}})
* Convert Special:ManageMessageGroups to OOUI ({{Gerrit|647670}})
* Insertables have been moved to the
src/TranslatorInterface/Insertable folder ({{Gerrit|644460}})
* Stop hiding page heading on pages other than Special:Translate
({{Gerrit|644514}})
* Drop non-array based configuration support for Insertables ({{Gerrit|644464}})
* Remove FCFontFinder ({{Gerrit|641189}})
* Bump group loader cache version to re-trigger re-caching of groups
({{Gerrit|644843}})
* Remove Gettext post processing support from export.php script
({{Gerrit|641385}})
== UniversalLanguageSelector ==
* Code refactoring & performance improvements
* Improve handling of opening links to new tab/window for compact
language links ({{Gerrit|639474}})
* Allow skins to register their own button and disable compact
language links ({{Phabricator|T264824}})
=== Fonts ===
* Add Boyo Gagrai for the Ho language in the Warang Citi script
({{Phabricator|T233301}})
--
Kartik Mistry | કાર્તિક મિસ્ત્રી
Hello,
This email contains updates for January 27, 2021
<https://www.mediawiki.org/wiki/Scrum_of_scrums/2021-01-27>.
Cheers,
Deb
Callouts
- Search is blocked by [Analytics] merge of the side-output schema and
wikimedia-event utilities https://phabricator.wikimedia.org/T270371 /
https://phabricator.wikimedia.org/T269619
No updates
- Community Tech, Anti-Harassment Tools, Editing, iOS native app,
Android native app
*No notes provided*
- Growth, Parsing, Language, Inuka, Cloud Services, Fundraising Tech,
Platform, Quality and Test Engineering, Security
Product
Web
- Updates:
- Page Previews is now requesting larger thumbnails to fix blurry
thumbnails on low-DPR screens:
https://phabricator.wikimedia.org/T272169 (see patch:
https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Popups/+/658493
)
Product Infrastructure
- Updates:
- Client-side error logging was enabled on enwiki:
https://phabricator.wikimedia.org/T255585
Structured Data
- Updates:
- Preparing for security review of the MediaSearch UI, which will be
moved into a separate extension
- Focus on tackling remaining blockers for a wider release of
MediaSearch
- Figuring out how to wire up the MediaSearch elastic queries into a
Learning To Rank machine learning model
Abstract Wikipedia
- Updates:
- Vue refactoring to align our code with emerging standards
continues; thanks to Vue team!
- Thanks to Daniel for advice on how our code may want to change to
be closer to Platform future plans, and forthcoming meetings to explore
further.
- Still working on programmatic type management.
- Community contest to select the logo concept for Wikifunctions is
currently running; ideas welcome!
Library
- Updates:
- Starting work on setting up an exception logging service (Glitchtip)
- Added a new partner in the Library Bundle
- Starting a bug fix that will prevent being spammed by The Wikipedia
Library Extension
Vue.js
- Updates:
- Working on ES6 minification for ResourceLoader.
- [Probably other updates James doesn't know about.]
Technology
Analytics
- Updates:
- The machines for our backup Hadoop cluster are racked, we are
setting them up, testing them, and moving backup data to them.
After that,
we'll be able to move forward with cluster migration to BigTop.
- We continue with the EventLogging legacy schema migration to
EventPlatform.
- Working on calculation of session length metric.
- We improved HDFS data security through better conventions on user
groups and permissions.
- We're finalizing the corrections to canonical pageview dumps.
Engineering Productivity
Performance
- Updates:
- Kobiton has provided disappointing service for our mobile device
lab; we're going to do a trial with Bitbar
- Published two short videos on officewiki:
- Why web performance matters at Wikimedia https://w.wiki/w7z
- The role of the Performance Team https://w.wiki/w7$
- Speaking tomorrow in French and English at We Love Speed:
https://www.welovespeed.com/en/2020/
Release Engineering
- Updates:
- [All] Deployments/Covid-19
https://wikitech.wikimedia.org/wiki/Deployments/Covid-19
- Train Health
- Last week: 1.36.0-wmf.27 phab:T271341
<https://phabricator.wikimedia.org/T271341>
- This week: 1.36.0-wmf.28 phab:T271342
<https://phabricator.wikimedia.org/T271342>
- Next week: 1.36.0-wmf.29 phab:T271342
<https://phabricator.wikimedia.org/T271342>
Search Platform
- Blocked by:
- [Analytics] merge of the side-output schema and wikimedia-event
utilities https://phabricator.wikimedia.org/T270371 /
https://phabricator.wikimedia.org/T269619
- Blocking:
- N/A
- Updates:
- Failing HTTP check on WDQS servers after latest deployment -
https://phabricator.wikimedia.org/T272713
- A SPARQL query shows entries that should be filtered out; the number of
entries in the result set might change when executed repeatedly (possible
caching/indexing problem) - https://phabricator.wikimedia.org/T267175
- Extract a list of the 200 most viewed black historical figures from
WDQS - https://phabricator.wikimedia.org/T272447
Site Reliability Engineering
- Updates:
- Exposure of the similar-users and linkrecommendation services to
internal services is being finalized this week.
Cross-cutting
- Updates:
- No significant movement on PHP 8.0 this week.
- LibraryUpgrader is bumping all repos to eslint-config-wikimedia
0.18.0, which has proper JSON linting support
- There are some speed issues on very long JSON files, so we
worked with upstream to avoid this, and have released 0.18.1
with said fix
just now.
- If your repo doesn't get a patch automatically, it's probably
because LibraryUpgrader errors on your repo. We'll fix these
manually (or
file Phab tasks), but if you want to do it ahead of us, we
won't complain.
You can see the dashboard of such repos here:
https://libraryupgrader2.wmcloud.org/errors?branch=master
--
deb tankersley (she/her)
senior program manager, engineering
Wikimedia Foundation
Hey folks,
tl;dr: We're splitting PHP & HTTP containers in the MediaWiki-Docker[0]
development environment. If you're not currently using
MediaWiki-Docker, you can safely ignore this message.
---
We're planning to merge a change to the MediaWiki-Docker environment to
split PHP-FPM into a separate container from Apache.[1]
This should improve build efficiency for these images, and remove the
need to duplicate so much stuff (Apache, etc.) between versions of PHP.
It also unblocks support for PHP 7.3/7.4, and upgrades XDebug to the
3.x series.
What you'll need to change in existing setups:
In docker-compose-override.yaml:
* Linux users should specify a MW_DOCKER_UID & MW_DOCKER_GID for all
containers.
In .env:
* Set XDEBUG_ENABLE=true and XHPROF_ENABLE=true if you want the
corresponding extensions turned on.
* If you have an XDEBUG_CONFIG set, it may need to be updated to reflect
new configuration value names.[2]
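For Linux users, the changes might look something like the following. This is an illustrative sketch only: the service name and the UID/GID values are assumptions, so check the MediaWiki-Docker page for the exact layout; the variable names themselves are the ones from this announcement.

```yaml
# docker-compose-override.yaml -- hypothetical sketch; the service name and
# structure are assumptions, see the MediaWiki-Docker docs for specifics.
services:
  mediawiki:
    user: "${MW_DOCKER_UID}:${MW_DOCKER_GID}"
```

And the corresponding .env entries:

```
# .env -- the UID/GID values are examples (use `id -u` and `id -g` on your host)
MW_DOCKER_UID=1000
MW_DOCKER_GID=1000
XDEBUG_ENABLE=true
XHPROF_ENABLE=true
```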
Finally, I'd like to get it out the door this week, since it unblocks a
number of requested improvements, but feedback is of course welcome in
Gerrit.[1]
[0]. https://www.mediawiki.org/wiki/MediaWiki-Docker
[1]. https://gerrit.wikimedia.org/r/c/mediawiki/core/+/630988
[2]. https://xdebug.org/docs/upgrade_guide
Thanks!
--
Brennen Bearnes
Release Engineering
Wikimedia Foundation
Hello all!
This is the final TechCom digest. With the new Technical Decision Making Process
<https://www.mediawiki.org/wiki/Technical_Decision_Making_Process> in place, we
are spinning down the RFC process and shutting down the committee. A big Thank
You to all committee members past and present for their time and dedication!
On a closing note, two RFCs have been approved after Last Call, both of which
I'm personally very happy to see:
*Stable interface policy amendment <https://phabricator.wikimedia.org/T268326>*:
The policy was amended to include a definition of the "MediaWiki Ecosystem" of
extensions to be considered when deprecating obsolete code. The deprecation
process was overhauled to allow for a clear timeline from soft deprecation via
hard deprecation to removal.
*Drop support for upgrading from old releases (pre 1.31)
<https://phabricator.wikimedia.org/T259771>*: this frees us up to remove about
a thousand database patch files that are only needed for upgrading from very
old systems. Upgrading from old versions of MediaWiki will still be possible,
but will have to be performed in multiple steps.
--
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation
The minutes from TechCom's triage meeting on 13 January 2021.
Present: Dan A, Daniel K, Tim S., Tim T.
RFC: Introduce PageIdentity [Approved]
- https://phabricator.wikimedia.org/T208776
- Last Call is closed, approved.
RFC: Normalize MediaWiki link tables
- https://phabricator.wikimedia.org/T222224
- Amir shows that the enwiki pagelinks table has surpassed the size of the
revisions table.
- Jaime shows that (post-MCR compression) the revision tables have gotten
considerably smaller, and that on Commons the image/oldimage tables are
even bigger than pagelinks and pose a bigger operational risk (since link
data can be regenerated). He recommends executing the oldimage migration,
which was approved in 2017 as part of RFC T28741
<https://phabricator.wikimedia.org/T28741>.
RFC: Drop support for older database upgrades [Last Call]
- https://phabricator.wikimedia.org/T259771
- Remains on Last Call until next week. Are people quietly excited?
RFC: Stable interface policy, Nov 2020 amendment [Last Call]
- https://phabricator.wikimedia.org/T268326
- Remains on Last Call for one more week.
Next week IRC office hours
No IRC discussion scheduled for next week.
You can also find our meeting minutes at
https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Minutes
-- Timo