[Re-posting with fixed links. Thanks for pointing this out Cormac!]
This is the weekly TechCom board review. Remember that there is no meeting on
Wednesday; any discussion should happen via email. For individual RFCs, please
keep discussion to the Phabricator tickets.
Activity since Monday 2020-10-26 on the following boards:
https://phabricator.wikimedia.org/tag/techcom/
https://phabricator.wikimedia.org/tag/techcom-rfc/
Committee board activity:
* T175745 <https://phabricator.wikimedia.org/T175745> *"overwrite edits when
conflicting with self"* has once again come up while working on EditPage.
There seems to no longer be any reason for this behavior. I think it does
more harm than good. We should just remove it.
RFCs:
Phase progression:
* T266866 <https://phabricator.wikimedia.org/T266866> *"Bump basic supported
browsers (grade C) to require TLS 1.2"*: newly filed, lively discussion.
Phase 1 for now.
* T263841 <https://phabricator.wikimedia.org/T263841> *"Expand API title
generator to support other generated data"*: dropped back to phase 2 because
resourcing is unclear.
* T262946 <https://phabricator.wikimedia.org/T262946> *"Bump Firefox version
in basic support to 3.6 or newer"*: last call ending on Wednesday, November
4. Some comments, no objections.
Other RFC activity:
* T250406 <https://phabricator.wikimedia.org/T250406> *"Hybrid extension
management"*: Asked for clarification of expectations for WMF to publish
extensions to packagist. Resourcing is being discussed in the platform team.
Cheers,
Daniel
I've been contributing to Wikipedia since 2001 and to
Wikimedia Commons since 2005. I'm not new to this.
Wikimedia Commons now has millions of images and thousands
of categories. But it's really hard to find anything.
Today I was looking for photos of buildings that have
an exterior lantern on their corner. This is common
in the countryside, where there are no streetlights
and no other buildings nearby. You put a lantern on
the outside of your house, and you put it on the
corner to cover two sides with one lamp.
So there are categories for buildings and for houses.
And by country and by county. And by building
material. And there is a category for exterior
lanterns, but none of the photos there are really
corner-mounted.
When I arrive at Commons, there is a search box, so
I type "exterior lantern corner", but all it does is
search text content and descriptions. So I get lots of
hits in scanned documents, which is not what I wanted.
I can limit my search to the "File" namespace, but
that includes both JPEG photos and PDF documents.
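The text search can at least be narrowed to actual images: the CirrusSearch backend behind that search box supports a filetype: keyword that drops scanned PDF/DjVu hits. A minimal sketch using the Action API (it only builds the query URL; the parameter names are the real API ones, and the search terms are the ones above):

```python
from urllib.parse import urlencode

API = "https://commons.wikimedia.org/w/api.php"

def build_search_url(terms: str, limit: int = 20) -> str:
    """Build a Commons full-text search query restricted to bitmap files."""
    params = {
        "action": "query",
        "list": "search",
        # CirrusSearch keyword: keep JPEG/PNG hits, drop scanned PDFs/DjVu
        "srsearch": f"{terms} filetype:bitmap",
        "srnamespace": 6,  # the File namespace
        "srlimit": limit,
        "format": "json",
    }
    return f"{API}?{urlencode(params)}"

print(build_search_url("exterior lantern corner"))
```

This still only matches text in descriptions, of course; it just stops returning scanned documents.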
Here are indeed some corner lamps, but not the rural
kind I wanted:
https://commons.wikimedia.org/wiki/File:Tinghuset_(lanterne).jpg
https://commons.wikimedia.org/wiki/File:Fanal_públic_LED.JPG
Here is what I was searching for, but the lamp (in the
upper left) is too small to be useful in this photo:
https://commons.wikimedia.org/wiki/File:Vårdinge_församlingshem.JPG
This is where I want to find more images like these,
not in the same category, just similar photos.
Surely, such a search function could be implemented
in 2020, in addition to our current text search?
Hasn't anybody done this already?
--
Lars Aronsson (lars(a)aronsson.se)
Linköping
The 2021 Community Wishlist Survey[1] is now open!
This survey is the process where communities decide what the Community
Tech[2] team should work on over the next year. We encourage everyone to
submit proposals until the deadline on 30 November, or comment on other
proposals to help make them better. The communities will vote on the
proposals between 8 December and 21 December.
The Community Tech team is focused on tools for experienced Wikimedia
editors. You can write proposals in any language, and we will translate
them for you.
Thank you, and we look forward to seeing your proposals!
P.S. If the pages are not fully translated into your language, visit a
dedicated page[3], be bold, and add the translations!
[1]
https://meta.wikimedia.org/wiki/Special:MyLanguage/Community_Wishlist_Surve…
[2] https://meta.wikimedia.org/wiki/Community_Tech
[3]
https://meta.wikimedia.org/wiki/Community_Wishlist_Survey_2021/Translation_…
Kind regards,
Szymon Grabarczuk (he/him)
Community Relations Specialist
Wikimedia Foundation <https://wikimediafoundation.org/>
This was brought up in a previous thread (link here
<https://lists.wikimedia.org/pipermail/wikitech-l/2020-October/093935.html>),
but the aggregated hourly view dumps haven't been published since
2020-09-24 (see here
<https://dumps.wikimedia.org/other/pagecounts-ez/merged/2020/>, also
mirrored here
<http://ftp.acc.umu.se/mirror/wikimedia.org/other/pagecounts-ez/merged/2020/>).
The response to the previous thread by Dan suggested that the new data
would be available in a week, but it's already a month past that expected
deadline. Are there any updates on the status of that new dump, or any new
estimates of when it will become available? I would also suggest posting
information about the pending change and new system to the information page
(at https://dumps.wikimedia.org/other/pagecounts-ez/) -- from reading that
page, there is no indication that data delivery has stopped or that a new
pipeline will be available shortly.
Thanks for any information,
Michael
--
*Michael Tartre*
Senior Machine Learning Engineer
michael(a)predata.com
t: +1 415 857 0967
1 Liberty Plaza
New York, NY 10006
Hi All,
Over the past months I've been looking at how we make technical decisions.
Wikimedia has significantly grown as a technical movement over the years
and similar to how we scaled our technical systems we need to scale our
human systems. I've proposed an evolution of the existing TechCom process.
This is based on the past couple years that I've been involved in the
facilitation of TechCom and research into decision making frameworks. I'm
looking for your feedback over the next three weeks (until 5 November) to
help me improve the proposed process.
The key objectives of this evolved process are:
- Make the process more inclusive by shifting to representation by
teams/groups instead of individuals
- Have clear timelines for when a decision will be made
- Be clear upfront about which stakeholders will be engaged
- Develop a clear lifecycle for a decision
This plan also introduces:
- A Technical Decision Forum, which will be composed of representatives
of teams from Product and Tech, WMDE, and independent +2 contributors.
- Templates for decision statement overviews and decision records
Please see the draft process here: <
https://www.mediawiki.org/wiki/Proposal_for_a_Technical_Decision_Making_Pro…
>
If you have questions or further feedback please use the talk page: <
https://www.mediawiki.org/wiki/Talk:Proposal_for_a_Technical_Decision_Makin…
>
Thanks,
-Kate
--
Kate Chapman (she/her/hers)
Director of Architecture, Architecture Team
Wikimedia Foundation
kchapman(a)wikimedia.org
Hello, Puneet here.
I am interested in contributing to the organization. Honestly, I am new
to open source contribution, but I have good knowledge of HTML,
CSS, JavaScript, and PHP. Please help me figure out how I can
contribute to the organization.
Hello,
A new feature called Reference Previews [1] was introduced as a beta
feature last year. After more than a year of testing, bug fixing and other
improvements it will soon leave the beta state in the first wikis, enabling
it by default for all users including readers. New features like this are
deployed in multiple batches, to ensure everything runs smoothly.
Therefore, we are currently looking for a few small or medium-sized wikis
that are interested in being among the first to gain the new feature.
If your community is interested, please reach out to me on my discussion
page or via email.
The feature shows you references in a small popup when you hover over the
reference number in square brackets. This way you can look up a reference
without jumping down to the bottom of the page. The feature is part of the
MediaWiki Popups extension [2], which is also used for the Page Previews
feature [3]. Both features can be turned on and off together by the user.
Several wikis have a gadget called “RefTooltips” to provide the same
functionality [4]. The new extension will disable itself if RefTooltips is
used. This way, it is up to the users and communities to decide if they
want to disable the gadget in favor of the new feature.
Reference Previews originated from a wish from the German-speaking wiki
community and was developed collaboratively by Wikimedia Deutschland and the
Wikimedia Foundation. If you'd like to know more about this feature, please
visit the project page.[1]
For the Technical Wishes team,
Michael Schönitzer
1: https://meta.wikimedia.org/wiki/WMDE_Technical_Wishes/ReferencePreviews
2: https://www.mediawiki.org/wiki/Special:MyLanguage/Extension:Popups
3: https://www.mediawiki.org/wiki/Page_Previews
4: https://phabricator.wikimedia.org/T234204#6244041
--
*M. F. Schönitzer*
Community Communication
Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
https://wikimedia.de
Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us to achieve our vision!
https://spenden.wikimedia.de
Wikimedia Deutschland – Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as a charitable
organization by the Finanzamt für Körperschaften I Berlin, tax number
27/029/42207.
Hello,
This email contains updates for November 4, 2020
<https://www.mediawiki.org/wiki/Scrum_of_scrums/2020-11-04>.
Cheers,
Deb
Callouts
- RelEng
- Train this week on a delayed schedule:
- Wed, 04 Nov 2020 noon PST: 1.36.0-wmf.16 Group0
- Thu, 05 Nov 2020 noon PST: 1.36.0-wmf.16 Group1
- Mon, 09 Nov 2020 noon PST: 1.36.0-wmf.16 Group2
- Train disruptions for the remainder of the year
<https://wikitech.wikimedia.org/wiki/Deployments#Upcoming_Release_Train_disr…>
Technology / Engineering Productivity / Release Engineering
- Updates:
- [All] Deployments/Covid-19
<https://wikitech.wikimedia.org/wiki/Deployments/Covid-19>
- Train Health
- Last week: No Train
- This week: 1.36.0-wmf.16 - phab:T263182
<https://phabricator.wikimedia.org/T263182>
- Wed, 04 Nov 2020 noon PST: 1.36.0-wmf.16 Group0
- Thu, 05 Nov 2020 noon PST: 1.36.0-wmf.16 Group1
- Mon, 09 Nov 2020 noon PST: 1.36.0-wmf.16 Group2
- Next week: No Train
- Rest of the year:
https://wikitech.wikimedia.org/wiki/Deployments#Upcoming_Release_Train_disr…
Site Reliability Engineering
- Updates:
- Had some issues with wikifeeds and iOS, but they are being worked out
by the team; many thanks for being on top of it.
- DC switchover went very smoothly
- CBC ciphers are now gone from public TLS termination, safer
connecting for everyone.
--
deb tankersley (she/her)
sr program manager, engineering
Wikimedia Foundation
Hi all!
Since the new Stable Interface Policy[1] has come into effect, there has been
some confusion about when and how the deprecation process can be accelerated or
bypassed. I started a discussion about this issue on the talk page[2], and now
I'm writing this email in the hope of gathering more perspectives.
tl;dr: the key question is:
Can we shorten or even entirely skip the deprecation process,
if we have removed all usages of the obsolete code from public
extensions?
If you are affected by the answer to this question, or you otherwise have
opinions about it, please read on (ok ok, this mail is massive - at least read
the proposed new wording of the policy). I'm especially interested in the
opinions of extension developers.
So, let's dive in. On the one hand, the new (and old) policy states:
Code MUST emit hard deprecation notices for at least one major
MediaWiki version before being removed. It is RECOMMENDED to emit
hard deprecation notices for at least two major MediaWiki
versions. EXCEPTIONS to this are listed in the section "Removal
without deprecation" below.
This means that code that starts to emit a deprecation warning in version N can
only be removed in version N+1, or better yet N+2. This effectively recommends
that obsolete code be kept around for at least half a year, with a preference
for a full year or more. However, we now have this exception in place:
The deprecation process may be bypassed for code that is unused
within the MediaWiki ecosystem. The ecosystem is defined to
consist of all actively maintained code residing in repositories
owned by the Wikimedia Foundation, and can be searched using the
code search tool.
When TechCom added this section[3][4], we were thinking of the case where a
method becomes obsolete, but is unused. In that case, why go through all the
hassle of deprecation, if nobody uses it anyway?
However, what does this mean for obsolete code that *is* used? Can we just go
ahead and remove the usages, and then remove the code without deprecation? That
seems to be the logical consequence.
The result is a much tighter timeline from soft deprecation to removal, reducing
the amount of deprecated code we have to drag along and keep functional. This
would be helpful particularly when code was refactored to remove undesirable
dependencies, since the dependency will not actually go away until the
deprecated code has been removed.
So, if we put in the work to remove usages, can we skip the deprecation process?
After all, if the code is truly unused, this would not do any harm, right? And
being able to make breaking changes without the need to wait a year for them to
become effective would greatly improve the speed at which we can modernize the
code base.
However, even skipping soft deprecation and going directly to hard deprecation
of the construction of the Revision class raised concerns, see for instance
<https://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg92871.html>.
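For readers less familiar with the distinction: soft deprecation only documents a method as @deprecated, while hard deprecation makes it emit a runtime notice (wfDeprecated() in MediaWiki's PHP). A language-neutral Python analogue, with a hypothetical method name:

```python
import warnings

def get_revision_text(rev_id):
    """Deprecated since 1.35; use the replacement service instead."""
    # Hard deprecation: emit a runtime warning so callers notice before
    # the method is removed. Soft deprecation would be the docstring
    # note alone, with no warning emitted.
    warnings.warn(
        "get_revision_text() is deprecated since 1.35",
        DeprecationWarning,
        stacklevel=2,
    )
    return f"revision {rev_id}"  # stand-in for the old behavior
```

The point of the hard stage is exactly that callers who missed the release notes still get a signal in their logs before removal.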
The key concern is that we can only know about usages in repositories in our
"ecosystem", a concept introduced into the policy by the section quoted above. I
will go into the implications of this further below. But first, let me propose a
change to the policy, to clarify when deprecation is or is not needed.
I propose that the policy should read:
Obsolete code MAY be removed without deprecation if it is unused (or
appropriately gated) by any code in the MediaWiki ecosystem. Such
removal must be recorded in the release notes as a breaking change
without deprecation, and must be announced on the appropriate
mailing lists.
Obsolete code that is still used within the ecosystem MAY be
removed if it has been emitting deprecation warnings in AT LEAST
one major version release, and a best effort has been made to
remove any remaining usages in the MediaWiki ecosystem. Obsolete
code SHOULD be removed when it has been emitting deprecation
warnings for two releases, even if it is still used.
And further:
The person, team, or organization that deprecates code SHOULD
drive the removal of usages in a timely manner. For code not under
the control of this person, team, or organization, appropriate
changes SHOULD be proposed to the maintainers, and guidance SHOULD
be provided when needed.
Compared to the old process, this puts more focus on removing usages of obsolete
code. Previously, we'd often just wait and hope that usages of deprecated
methods would vanish eventually. That can take a long time; we still have code
in MediaWiki that was deprecated in 1.24. Of course, every now and then someone
fixes a bunch of usages of deprecated code, but this is a sporadic occurrence,
not designed into the process.
With the change I am proposing, whoever deprecates a function also commits to
removing usages of it asap. For extension developers, this means that they will
get patches and support, but they may see their code broken if they do not
follow up.
Now, my proposal hinges on the idea that we somehow know all relevant code that
needs fixing. How can that work?
When TechCom introduced the idea of the "MediaWiki ecosystem" into the policy,
our reasoning was that we want to support primarily extension developers who
contribute their extensions back to the ecosystem, by making them available to
the public. We found it fair to say that if people develop extensions solely for
their own use, it is up to them to read the release notes. We do not need to go
out of our way to protect them from changes to the code base.
Effectively, with the proposed change to the policy, maintainers of public
extensions will get more support keeping their extensions compatible, while
maintainers of private extensions will receive less consideration.
It seems desirable and fair to me to allow for "fast track" removal of obsolete
code, but only if we create a clear process for making an extension "official".
How exactly would an extension developer make sure that we know their extension,
and consider it part of the ecosystem? In practice, "known code" is code
accessible via codesearch[5]. But how does one get an extension into the
codesearch index? There is currently no clear process for this.
Ideally, it would be sufficient to:
* create a page on mediawiki.org using the {{Extension}} infobox,
* set the status to "stable" (and maybe "beta"),
* and link to a public git repository.
It should be simple enough to create a script that feeds these repos into
codesearch. A quick look at the Category:Extensions_by_status category tells me
that there are about a thousand such extensions.
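The first step of such a script could be quite small. A hypothetical sketch: build the paginated Action API requests for the members of a status category on mediawiki.org (list=categorymembers and its cm* parameters are the real API; the repo-extraction and codesearch-feeding steps are omitted):

```python
from urllib.parse import urlencode

API = "https://www.mediawiki.org/w/api.php"

def category_members_url(category, cont=None):
    """Build one paginated request for the pages in a category."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": f"Category:{category}",
        "cmlimit": "max",  # up to 500 results per request
        "format": "json",
    }
    if cont:
        # Continuation token from the previous response's "continue" block
        params["cmcontinue"] = cont
    return f"{API}?{urlencode(params)}"

print(category_members_url("Stable_extensions"))
```

A caller would loop, following the cmcontinue token, then read each extension page's infobox for the repository URL.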
So, my question to you is: do you support the change I am proposing to the
policy? If not, why not? And if you do, why do you think it's helpful?
-- daniel
PS: This proposal has not yet been vetted with TechCom, it's just my personal
take. It will become an RFC if needed. This is intended to start a conversation.
[1] https://www.mediawiki.org/wiki/Stable_interface_policy
[2] https://www.mediawiki.org/wiki/Topic:Vrwr9aloe6y1bi2v
[3] https://phabricator.wikimedia.org/T193613
[4] https://phabricator.wikimedia.org/T255803
[5] https://codesearch.wmcloud.org/search/
--
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation