Hello all,
The Technical Decision-Making Forum Retrospective team
<https://www.mediawiki.org/wiki/Technical_decision_making> invites you to
complete a survey about Wikimedia's technical decision-making processes.
While there will be more ways to participate, this is the first and most
important step in our data collection. It aims to gather information about
your experience, thoughts, and needs regarding the process of making
technical decisions across the Wikimedia technical spaces.
This survey will be used for gathering information about the process and
the needs around technical decision-making that touches our production
systems.
You can find the survey link here:
https://wikimediafoundation.limesurvey.net/885471?lang=en
Who should take this survey?
People who do technical work that relies on software maintained by the
Wikimedia Foundation (WMF) or affiliates. If you contribute code to
MediaWiki or extensions used by Wikimedia, or you maintain gadgets or tools
that rely on WMF infrastructure, this survey is for you.
What is the deadline?
*August 7th, 2023*
What will the Retrospective team do with the information?
The retrospective team will synthesize the collected data and publish an
anonymized analysis that will help leadership make decisions about the
future of the process.
We will collect anonymized information that we will analyze in two main
ways:
- Sentiments based on demographic information: these will tell us whether
there are different needs and desires from different groups of people.
- General needs and perceptions about decision-making in our technical
spaces: this will help us understand what kinds of decisions happen in
those spaces, who is involved, and how to adjust our processes accordingly.
Is the survey the only way to participate?
The survey is the most important way for us to gather information because
it lets us collect input in a structured manner. But it will not be the
only way you can share your thoughts with us - we will have more
information soon about upcoming listening sessions where you can talk with
us live. In the meantime, you are always welcome to leave feedback on the
talk page:
talk page:
https://www.mediawiki.org/wiki/Talk:Technical_decision_making/Technical_Dec…
Where can I see more information?
There are several places where you can find more information about the
Technical Decision-Making Process Retrospective:
- The original announcement about the retrospective from Tajh Taylor:
https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/…
- The Technical Decision-Making Process general information page:
https://www.mediawiki.org/wiki/Technical_decision_making
- The Technical Decision-Making Process Retrospective on MediaWiki:
https://www.mediawiki.org/wiki/Technical_decision_making/Technical_Decision…
- Phabricator ticket: https://phabricator.wikimedia.org/T333235
How to contact the retrospective core team:
- Write to the core team mailing list: tdf-retro-2023@lists.wikimedia.org
- The Technical Decision-Making Process Retrospective talk page on
MediaWiki:
https://www.mediawiki.org/wiki/Talk:Technical_decision_making/Technical_Dec…
Thank you,
Moriel, on behalf of the TDMP Retro Core Group
Core group:
- Moriel Schottlender (chair)
- Daniel Kinzler
- Chris Danis
- Kosta Harlan
- Temilola Adeleye
--
Moriel Schottlender (she/her <https://pronoun.is/she>)
Principal Software Engineer
Wikimedia Foundation https://wikimediafoundation.org/
(If you don’t work with pagelinks table, feel free to ignore this message)
Hello,
Here is an update and reminder on the previous announcement
<https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/…>
regarding normalization of links tables that was sent around a year ago.
As part of that work, the pl_namespace and pl_title columns of the
pagelinks table will soon be dropped, and you will need to join on
pl_target_id with the linktarget table instead. This is basically identical
to the templatelinks normalization that happened a year ago.
Currently, MediaWiki writes to both schemas of pagelinks for new rows on
all wikis except English Wikipedia and Wikimedia Commons (we will start
writing the new schema on those two wikis next week). We have started to
backfill the data in the new schema, but it will take weeks to finish on
large wikis.
So if you query this table directly, or your tools do, you will need to
update them accordingly. I will send a reminder before dropping the old
columns once the data has been fully backfilled.
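To illustrate the query change, here is a minimal sketch using SQLite from Python. The schemas are heavily simplified for the example (the real tables have more columns); only the column names pl_from, pl_target_id, lt_id, lt_namespace, and lt_title are taken from the announcement, and the data is made up:

```python
import sqlite3

# Simplified stand-ins for the production pagelinks/linktarget tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE linktarget (
    lt_id INTEGER PRIMARY KEY,
    lt_namespace INTEGER NOT NULL,
    lt_title TEXT NOT NULL
);
CREATE TABLE pagelinks (
    pl_from INTEGER NOT NULL,
    pl_target_id INTEGER NOT NULL REFERENCES linktarget(lt_id)
);
INSERT INTO linktarget VALUES (1, 0, 'Main_Page');
INSERT INTO pagelinks VALUES (42, 1);
""")

# Old-style query (stops working once pl_namespace/pl_title are dropped):
#   SELECT pl_from FROM pagelinks
#   WHERE pl_namespace = 0 AND pl_title = 'Main_Page';

# New-style query: join to linktarget via pl_target_id instead.
rows = conn.execute("""
    SELECT pl.pl_from
    FROM pagelinks pl
    JOIN linktarget lt ON lt.lt_id = pl.pl_target_id
    WHERE lt.lt_namespace = 0 AND lt.lt_title = 'Main_Page'
""").fetchall()
print(rows)  # [(42,)]
```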
You can keep track of the general long-term work in T300222
<https://phabricator.wikimedia.org/T300222> and the specific work for
pagelinks in T299947 <https://phabricator.wikimedia.org/T299947>. You can
also read more on the reasoning in T222224
<https://phabricator.wikimedia.org/T222224> or the previous announcement
<https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/…>
.
Thank you,
--
*Amir Sarabadani (he/him)*
Staff Database Architect
Wikimedia Foundation <https://wikimediafoundation.org/>
Hi All,
Welcome to the monthly MediaWiki Insights email!
Enabling more people to learn MediaWiki and contribute effectively
In the last MW insights email
<https://www.mediawiki.org/wiki/MediaWiki_Product_Insights/Reports/October_2…>
we shared more about our approach to helping people contribute effectively
to MediaWiki
<https://www.mediawiki.org/wiki/MediaWiki_Product_Insights/Contributor_reten…>.
A few interesting data points:
The number of contributors to MediaWiki core who have more than 5 patches
continued to grow: since the start of the Foundation's fiscal year in July,
we hit the goal of 20% for the first time, compared to the July-November
time period last year. This is exciting to see - now it's about keeping the
momentum and continuing on that path.
Many thanks to all the people who have contributed to MediaWiki core!
The average and median time to first review for patches in MediaWiki core
decreased significantly in the period July 1st to Nov 30 compared to the
same time period one year earlier.
- Average time to first review dropped from previously 16.5 days to 4.5
days
- Median time to first review dropped from previously 1.2 days to 0.6
days
Many thanks to all the code reviewers of MediaWiki core patches!
Keep in mind that this data is only one signal. There are many factors
that play into the experience of contributors; a helpful comment may be
more relevant than a fast +1/-1, etc.
Over the past weeks, we have been planning initiatives to further support
people in onboarding and contributing to MediaWiki:
- We are preparing for a WMF internal MediaWiki code jam in December to
try out a few things and focus specifically on the needs of teams.
- One thing we wanted to test in practice at the code jam is the “MediaWiki
Quick Install <https://phabricator.wikimedia.org/T347347>” guide. This
has been a collaboration between the Tech Docs team and the MediaWiki
Platform team - you can find the latest version of this experiment here:
https://www.mediawiki.org/wiki/Local_development_quickstart
- We discussed a possible focus project in the next quarter on improving
first time MediaWiki (core) contributors’ experience. We’re exploring a few
simple, small ideas that we could implement/try out in the next quarter
(ticket follows!).
Project snapshot: Analysis of MediaWiki execution timings, fixing issues
with logging in on Mobile, progress on RESTBase deprecation and more!
Performance: Piotr and Timo conducted an analysis of MediaWiki execution
timings <https://phabricator.wikimedia.org/T350593> and identified areas
for improvement. One of the fixes promises a 50ms improvement
<https://phabricator.wikimedia.org/T351807>! Timo and Derick worked on
BagOStuff improvements <https://phabricator.wikimedia.org/T336004> (cache
layer), shipped in MW 1.42. This work aims to lower the barrier for
contributors by making interfaces leaner and more intuitive, and reduces
storage access cost from 10ms to ~1ms. Thank you for your work!
More highlights:
MediaWikiIntegrationTestCase now automatically tracks what database tables
get touched during the integration test, removing the need for developers
to keep track (T342301 <https://phabricator.wikimedia.org/T342301>). Many
thanks to Daimona and others for their work on this!
Work towards PHP 8.2 support continues, with one helpful outcome being a
new DynamicPropertyTestHelper feature (T326466
<https://phabricator.wikimedia.org/T326466>). Many thanks to TK-999 and all
reviewers!
Gergö worked on solving a variety of problems with logging in on mobile
(see https://phabricator.wikimedia.org/T257852#9347008 and below). Many
thanks to Gergö and everyone who provided support!
RESTBase sunset: Wikifeeds now calls the Parsoid endpoint in MediaWiki core
rather than RESTBase. Many thanks to Yiannis and Daniel for their hard work
on making this happen! Cxserver is preparing a deployment to the same soon
<https://gerrit.wikimedia.org/r/c/operations/deployment-charts/+/977983/>
(thank
you, Language team!).
Upcoming:
There is an OutputTransform
<https://www.mediawiki.org/wiki/Parsoid/OutputTransform> pipeline that is
being introduced to replace ParserOutput::getText(). This pipeline
initially targets content that comes from the ParserCache before it is
rendered (as a 1:1 getText() equivalent). The team is likely going to
introduce another layer of cacheability of this output so that we can store
richer canonical Parsoid content and use this pipeline to transform it for
final rendering. Many thanks to Isabelle, CScott and Daniel for this work
in progress (Gerrit:967449
<https://gerrit.wikimedia.org/r/c/mediawiki/core/+/967449>)!
As one puzzle piece of our product research efforts and platform design
explorations, Moriel and others have been working on mapping high level
essential user workflows such as edit and patrol against platform
components to explore workflow patterns and potential architectural
opportunities in the platform. One outcome of this is going to be to
describe the key challenges when trying to model our system. Many thanks to
Moriel for leading on this work, and Daniel, Timo, Subbu, James, Cindy,
Emanuele and Amir S for their support, great questions and ideas!
Up next: Presentations at Semantic MediaWikiCon
Semantic MediaWikiCon
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2023#Program> is
coming up, virtual and in person from Dec 11-13. We shared about the
updates to the rdbms library in the last MW Insights email - if you want to
learn more about this work, check out Amir’s presentation at Semantic
MediaWikiCon
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2023/Major_changes_on_i…>!
Subbu and C.Scott are also going to give their yearly update on the parser
unification work
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2023/Updates_from_the_W…>,
Chris will be talking about Codex
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2023/Codex,_the_Design_…>,
and Stef about automated testing for complex MediaWiki topologies
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2023/Automated_Testing_…>.
Since the theme of this edition is MediaWiki in the age of AI, Mike will be
presenting on the recent experiences with the experimental Wikipedia
ChatGPT plugin
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2023/The_Wikipedia_Chat…>.
The keynote speaker at this year's Semantic MediaWikiCon is Markus
Krötzsch <https://www.korrekt.org/page/Short_biography>.
That’s the last insights email for 2023. The deployment train pauses for
the end of the year break, and so does the monthly MW Insights email!
We’ll be following up with a double-edition in January.
Thanks all for reading,
Birgit
--
Birgit Müller (she/her)
Director of Product, MediaWiki and Developer Experiences
Wikimedia Foundation <https://wikimediafoundation.org/>
Recently I completed initial work to enable commit-message-validator
[0] to work with GitLab CI and GitLab merge requests. For those who
are unaware, commit-message-validator is a linter tool designed to
enforce Wikimedia's commit message guidelines. [1]
Soon after the feature was available, James Forrester added it to the
test suite for abstract-wiki/wikifunctions/function-orchestrator [2]
and found the first issue with the integration that I had not
anticipated, which he helpfully filed as T351253. [3]
In my estimation, the problem comes down to whether we should prioritize
having commit message footers render nicely in GitLab's merge request
interface, where they are treated as GitLab-flavored markdown. James' team
has developed a convention of appending a backslash (\) after footer lines
so that they render as individual lines when processed as markdown. This in
turn leads to commit-message-validator rejecting some footers, most
obviously "Bug: Tnnnn" footers, for having unwanted characters (the
trailing " \").
Reasonable people can disagree on the "best" solution here, but I
think it is likely that as a group we can reach consensus on what the
proper behavior of the commit-message-validator tool should be. The
most obvious options are:
* Change nothing in commit-message-validator and suggest folks live
with markdown rendering artifacts in GitLab merge request
descriptions.
* Change commit-message-validator to allow trailing " \" data for
commit message footers in GitLab repos.
* Change commit-message-validator to allow users (typically a CI
process) to configure allow/disallow of trailing " \" data for commit
message footers
Adding support for per-repo rules configuration would be a first for
commit-message-validator. Until now it has provided a single
opinionated ruleset based on its interpretation of the commit message
guidelines. [1]
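As a rough sketch of what a configurable rule could look like, here is some hypothetical code (not the actual commit-message-validator implementation; the function name, regex, and option name are invented for illustration):

```python
import re

# Hypothetical footer check: "Name: value" with an optional trailing " \"
# (the GitLab-flavored-markdown line-break convention discussed above).
FOOTER_RE = re.compile(r'^(?P<name>[A-Za-z-]+): (?P<value>.+?)(?P<slash> \\)?$')

def check_footer(line, allow_trailing_backslash=False):
    """Return True if the footer line passes this sketch's rules."""
    m = FOOTER_RE.match(line)
    if not m:
        return False
    # Reject the trailing " \" unless the (per-repo) config allows it.
    if m.group('slash') and not allow_trailing_backslash:
        return False
    return True

print(check_footer('Bug: T351253'))                                    # True
print(check_footer('Bug: T351253 \\'))                                 # False
print(check_footer('Bug: T351253 \\', allow_trailing_backslash=True))  # True
```

The third call shows the per-repo escape hatch: a CI process for a GitLab repo could flip the flag on, while Gerrit repos keep the strict default.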
Folks who actually care about these minutiae (how git commit message
footers look in and out of GitLab markdown rendering) are encouraged
to think carefully and provide their opinions and supporting data on
T351253. [3]
[0]: https://www.mediawiki.org/wiki/Commit-message-validator
[1]: https://www.mediawiki.org/wiki/Gerrit/Commit_message_guidelines
[2]: https://gitlab.wikimedia.org/repos/abstract-wiki/wikifunctions/function-orc…
[3]: https://phabricator.wikimedia.org/T351253
Bryan
--
Bryan Davis Wikimedia Foundation
Principal Software Engineer Boise, ID USA
[[m:User:BDavis_(WMF)]] irc: bd808
Hi there,
I'm User:Diskdance, and I'm currently developing a default gadget for Chinese Wikipedia that enhances MediaWiki's variant handling logic; under certain circumstances it shows a prompt at page load asking for the user's preferred variant. Consider it a conditional cookie notice; an English screenshot can be found at https://commons.wikimedia.org/wiki/File:VariantAlly-En.png.
I know this can be very disruptive to the UX, so I want to be careful about its negative impact on page views. If the gadget could collect telemetry data about the prompt's display frequency and user interactions (using e.g. WikimediaEvents), I could learn about its possible impact.
Is this possible? It would be much appreciated if anybody could provide assistance.
Best wishes,
Diskdance
Hi all,
I have asked for +2 access to the PageTriage extension
<https://mediawiki.org/wiki/Extension:PageTriage> to be able to participate
in reviewing code contributions made to the repository :) Please share any
comments at https://phabricator.wikimedia.org/T351972
Regards,
Sohom Datta
---
Open-source contributor @Wikimedia, (and sometimes @Chromium)
Do your eyes glaze over when trying to make sense of Phabricator tasks
with a long and troubled history of patches? Do you struggle to find the
real people in the sea of automated bot comments?
I've made myself a user style to compact the Gerritbot comments on
Phabricator. You might like it too:
https://userstyles.world/style/13140/wikimedia-phabricator-reduce-gerritbot…
(See screenshots and installation instructions there.)
--
Bartosz Dziewoński
Hello. Something weird started yesterday. I heavily use the global
watchlist, and open links on many wikis a couple of times every day.
Something that never happened before now happens dozens of times, in
about 90% of cases: if I'm working on wiki X and open some page on wiki Y,
I find myself logged out there. I can go back to X and press F5, and still
be logged in on X, which is the weird part, because anything that logs me
out should log me out everywhere simultaneously. The only thing that helps
for Y is clicking Log in; the page then refreshes automatically to the
logged-in state.
Some more people have complained about this. I would create a Phabricator
ticket, but I have no idea how to reproduce it.
Igal (User:IKhitron)
Hello, all!
Starting today we are kicking off the process to shut down Grid Engine and
we want to share the timeline with you.
== Background ==
WMCS made the Grid Engine available as a backend engine for hosting tools
on Toolforge - our Platform as a Service (PaaS) offering.
An additional backend engine, Kubernetes, was also made available on
Toolforge.
Over time, maintaining and securing the grid has proven difficult, and it
has made it harder to support the community in other ways, because a lot of
maintenance hours are spent on it.
This is mainly because there have been no new Grid Engine releases (bug
fixes, security patches, or otherwise) since 2016.[0]
Maintenance work on the grid continued because it was widely popular with
the community and the Kubernetes offering didn't yet have many grid-like
features that contributors came to love.
Once the Kubernetes platform could handle many of the workloads, we started
the grid deprecation process by asking maintainers to migrate off the
grid.[1]
Over the past year, we've been reaching out to our tool maintainers and
working with them to migrate their tools off the Grid to Kubernetes.
We have reached out directly to all maintainers with their Phabricator
ticket IDs.
The latest updates to Build Service[2] have addressed many of the issues
that prevented tool maintainers from migrating.
== Initial Timeline ==
The detailed grid shutdown timeline is available on wiki.[3] The important
dates have been copied below.
* 14th December, 2023: Any maintainer who has not responded on Phabricator
will have their tools shut down and crontabs commented out. Please plan to
migrate, or tell us your plans on Phabricator, before that date.
* 14th February, 2024: The grid is completely shut down. All tools are
stopped.
If you need further clarification or help migrating your tool, don't
hesitate to reach out to us on IRC, Telegram, Phabricator[4] or via any of
our support channels.[5]
Thank you.
[0]: https://techblog.wikimedia.org/2022/03/14/toolforge-and-grid-engine/
[1]:
https://wikitech.wikimedia.org/wiki/News/Toolforge_Grid_Engine_deprecation
[2]: https://wikitech.wikimedia.org/wiki/Help:Toolforge/Build_Service
[3]:
https://wikitech.wikimedia.org/wiki/News/Toolforge_Grid_Engine_deprecation#…
[4]: https://phabricator.wikimedia.org/project/profile/6135/
[5]:
https://wikitech.wikimedia.org/wiki/Portal:Toolforge/About_Toolforge#Commun…
--
Seyram Komla Sapaty
Developer Advocate
Wikimedia Cloud Services