Hi,
On Tue, Mar 1, 2016 at 3:36 PM, David Strine <dstrine(a)wikimedia.org> wrote:
> We will be holding this brownbag in 25 minutes. The Bluejeans link has
> changed:
>
> https://bluejeans.com/396234560
I'm not familiar with BlueJeans and may have missed a transition
because I wasn't paying enough attention. Is this some kind of
experiment? Have all meetings transitioned to this service?
Anyway, my immediate question at the moment is: how do you join
without sharing your microphone and camera?
Am I correct in thinking that this is an entirely proprietary stack
that's neither gratis nor libre and has no on-premise (not cloud)
hosting option? Are we paying for this?
-Jeremy
Hello,
Could someone update the list at https://phabricator.wikimedia.org/P10500,
which contains repositories that don't use mediawiki/mediawiki-codesniffer?
I found that many repositories in the list are empty, and some aren't
available on Gerrit at all.
So, could someone please update this list of repositories (in
mediawiki/extensions) that don't use mediawiki/mediawiki-codesniffer but
contain at least one PHP file? Or provide me with a command I can run to
regenerate the list myself, so I don't need to request it every time.
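Something like the following rough sketch is what I have in mind (a
minimal Python sketch, assuming a local checkout of all extension
repositories under extensions/; the path and layout are assumptions):

#!/usr/bin/env python3
"""List extension repos that contain at least one PHP file but do not
require mediawiki/mediawiki-codesniffer in composer.json.

Sketch only: assumes all extension repositories are checked out
locally under EXTENSIONS_DIR (hypothetical path).
"""
import json
from pathlib import Path

EXTENSIONS_DIR = Path("extensions")  # hypothetical local checkout

for repo in sorted(p for p in EXTENSIONS_DIR.iterdir() if p.is_dir()):
    # Skip repositories with no PHP code at all.
    if not any(repo.rglob("*.php")):
        continue
    requires_sniffer = False
    composer = repo / "composer.json"
    if composer.exists():
        try:
            data = json.loads(composer.read_text())
        except ValueError:
            data = {}
        # codesniffer is normally a dev dependency, so check both.
        deps = {**data.get("require", {}), **data.get("require-dev", {})}
        requires_sniffer = "mediawiki/mediawiki-codesniffer" in deps
    if not requires_sniffer:
        print(repo.name)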
Best regards,
Zoran.
P. S.: Happy weekend! :)
Every year or so the Cloud Services team tries to identify and clean up
unused projects and VMs. We do this via an opt-in process: anyone can
mark a project as 'in use,' and that project will be preserved for
another year.
I've created a wiki page that lists all existing projects, here:
https://wikitech.wikimedia.org/wiki/News/Cloud_VPS_2020_Purge
If you are a VPS user, please visit that page and mark any projects that
you use as {{Used}}. Note that it's not necessary for you to be a
project admin to mark something -- if you know that you're currently
using a resource and want to keep using it, go ahead and mark it
accordingly. If you /are/ a project admin, please take a moment to mark
which VMs are or aren't used in your projects.
When December arrives, I will begin shutting down unused projects and
reclaiming their resources.
If you think you use a VPS project but aren't sure which, I encourage
you to poke around on https://tools.wmflabs.org/openstack-browser/ to
see what looks familiar. Worst case, just email
cloud(a)lists.wikimedia.org with a description of your use case and we'll
sort it out there.
If you exclusively use Toolforge, you are free to ignore this task.
Thank you!
-Andrew and WMCS team
Hi All,
Over the past months I've been looking at how we make technical decisions.
Wikimedia has grown significantly as a technical movement over the years,
and just as we scaled our technical systems, we need to scale our
human systems. I've proposed an evolution of the existing TechCom process.
This is based on the past couple years that I've been involved in the
facilitation of TechCom and research into decision making frameworks. I'm
looking for your feedback over the next three weeks (until 5 November) to
help me improve the proposed process.
The key objectives of this evolved process are:
- Make the process more inclusive by shifting to representation by
teams/groups instead of individuals
- Set clear timelines for when a decision will be made
- Be clear upfront about which stakeholders will be engaged
- Develop a clear lifecycle for each decision
This plan also introduces:
- A Technical Decision Forum, composed of representatives of
teams from Product and Tech, WMDE, and independent +2 contributors
- Templates for decision statement overviews and decision records
Please see the draft process here: <
https://www.mediawiki.org/wiki/Proposal_for_a_Technical_Decision_Making_Pro…
>
If you have questions or further feedback please use the talk page: <
https://www.mediawiki.org/wiki/Talk:Proposal_for_a_Technical_Decision_Makin…
>
Thanks,
-Kate
--
Kate Chapman (she/her/hers)
Director of Architecture, Architecture Team
Wikimedia Foundation
kchapman(a)wikimedia.org
Hi all!
Since the new Stable Interface Policy[1] has come into effect, there has been
some confusion about when and how the deprecation process can be accelerated or
bypassed. I started a discussion about this issue on the talk page[2], and now
I'm writing this email in the hope of gathering more perspectives.
tl;dr: the key question is:
Can we shorten or even entirely skip the deprecation process,
if we have removed all usages of the obsolete code from public
extensions?
If you are affected by the answer to this question, or you otherwise have
opinions about it, please read on (ok ok, this mail is massive - at least read
the proposed new wording of the policy). I'm especially interested in the
opinions of extension developers.
So, let's dive in. On the one hand, the new (and old) policy states:
Code MUST emit hard deprecation notices for at least one major
MediaWiki version before being removed. It is RECOMMENDED to emit
hard deprecation notices for at least two major MediaWiki
versions. EXCEPTIONS to this are listed in the section "Removal
without deprecation" below.
This means that code that starts to emit a deprecation warning in version N can
only be removed in version N+1, or better yet N+2. This effectively recommends
that obsolete code be kept around for at least half a year, with a preference
for a full year and more. However, we now have this exception in place:
The deprecation process may be bypassed for code that is unused
within the MediaWiki ecosystem. The ecosystem is defined to
consist of all actively maintained code residing in repositories
owned by the Wikimedia foundation, and can be searched using the
code search tool.
When TechCom added this section[3][4], we were thinking of the case where a
method becomes obsolete, but is unused. In that case, why go through all the
hassle of deprecation, if nobody uses it anyway?
However, what does this mean for obsolete code that *is* used? Can we just go
ahead and remove the usages, and then remove the code without deprecation? That
seems to be the logical consequence.
The result is a much tighter timeline from soft deprecation to removal, reducing
the amount of deprecated code we have to drag along and keep functional. This
would be particularly helpful when code was refactored to remove undesirable
dependencies, since the dependency will not actually go away until the
deprecated code has been removed.
So, if we put in the work to remove usages, can we skip the deprecation process?
After all, if the code is truly unused, this would not do any harm, right? And
being able to make breaking changes without the need to wait a year for them to
become effective would greatly improve the speed at which we can modernize the
code base.
However, even skipping soft deprecation and going directly to hard deprecation
of the construction of the Revision class raised concerns, see for instance
<https://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg92871.html>.
The key concern is that we can only know about usages in repositories in our
"ecosystem", a concept introduced into the policy by the section quoted above. I
will go into the implications of this further below. But first, let me propose a
change to the policy, to clarify when deprecation is or is not needed.
I propose that the policy should read:
Obsolete code MAY be removed without deprecation if it is unused (or
appropriately gated) by any code in the MediaWiki ecosystem. Such
removal must be recorded in the release notes as a breaking change
without deprecation, and must be announced on the appropriate
mailing lists.
Obsolete code that is still used within the ecosystem MAY be
removed if it has been emitting deprecation warnings in AT LEAST
one major version release, and a best effort has been made to
remove any remaining usages in the MediaWiki ecosystem. Obsolete
code SHOULD be removed when it has been emitting deprecation
warnings for two releases, even if it is still used.
And further:
The person, team, or organization that deprecates code SHOULD
drive the removal of usages in a timely manner. For code not under
the control of this person, team, or organization, appropriate
changes SHOULD be proposed to the maintainers, and guidance SHOULD
be provided when needed.
Compared to the old process, this puts more focus on removing usages of obsolete
code. Previously, we'd often just wait and hope that usages of deprecated
methods would vanish eventually. That can take a long time: we still have code
in MediaWiki that was deprecated in 1.24. Of course, every now and then someone
fixes a bunch of usages of deprecated code, but this is a sporadic occurrence,
not designed into the process.
With the change I am proposing, whoever deprecates a function also commits to
removing usages of it asap. For extension developers, this means that they will
get patches and support, but they may see their code broken if they do not
follow up.
Now, my proposal hinges on the idea that we somehow know all relevant code that
needs fixing. How can that work?
When TechCom introduced the idea of the "MediaWiki ecosystem" into the policy,
our reasoning was that we want to support primarily extension developers who
contribute their extensions back to the ecosystem, by making them available to
the public. We found it fair to say that if people develop extensions solely for
their own use, it is up to them to read the release notes. We do not need to go
out of our way to protect them from changes to the code base.
Effectively, with the proposed change to the policy, maintainers of public
extensions will get more support keeping their extensions compatible, while
maintainers of private extensions will receive less consideration.
It seems desirable and fair to me to allow for "fast track" removal of obsolete
code, but only if we create a clear process for making an extension "official".
How exactly would an extension developer make sure that we know their extension,
and consider it part of the ecosystem? In practice, "known code" is code
accessible via codesearch[5]. But how does one get an extension into the
codesearch index? There is currently no clear process for this.
Ideally, it would be sufficient to:
* create a page on mediawiki.org using the {{Extension}} infobox,
* set the status to "stable" (or maybe "beta"),
* and link to a public git repository.
It should be simple enough to create a script that feeds these repos into
codesearch. A quick look at Category:Extensions_by_status tells me that
there are about a thousand such extensions.
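To illustrate, here is a minimal Python sketch of such a script. The
category name and the URL pattern below are assumptions for the sake of
the example, not a vetted implementation:

#!/usr/bin/env python3
"""Collect public extension repositories from mediawiki.org, as a
candidate feed for the codesearch index.

Sketch only: assumes stable extensions are listed under
Category:Stable extensions and that repo URLs appear in the page
wikitext; both are assumptions.
"""
import re
import requests

API = "https://www.mediawiki.org/w/api.php"
session = requests.Session()

def category_members(category):
    """Yield page titles in the category, following API continuation."""
    params = {
        "action": "query", "format": "json", "list": "categorymembers",
        "cmtitle": category, "cmlimit": "max", "cmtype": "page",
    }
    while True:
        data = session.get(API, params=params).json()
        for member in data["query"]["categorymembers"]:
            yield member["title"]
        if "continue" not in data:
            break
        params.update(data["continue"])

def repo_urls(title):
    """Extract Wikimedia git repository URLs from the page wikitext."""
    params = {
        "action": "parse", "format": "json",
        "page": title, "prop": "wikitext",
    }
    data = session.get(API, params=params).json()
    text = data["parse"]["wikitext"]["*"]
    return re.findall(r"https://gerrit\.wikimedia\.org/\S+", text)

for title in category_members("Category:Stable extensions"):
    for url in repo_urls(title):
        print(url)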
So, my question to you is: do you support the change I am proposing to the
policy? If not, why not? And if you do, why do you think it's helpful?
-- daniel
PS: This proposal has not yet been vetted with TechCom, it's just my personal
take. It will become an RFC if needed. This is intended to start a conversation.
[1] https://www.mediawiki.org/wiki/Stable_interface_policy
[2] https://www.mediawiki.org/wiki/Topic:Vrwr9aloe6y1bi2v
[3] https://phabricator.wikimedia.org/T193613
[4] https://phabricator.wikimedia.org/T255803
[5] https://codesearch.wmcloud.org/search/
--
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation
The Search Platform Team
<https://www.mediawiki.org/wiki/Wikimedia_Search_Platform> usually holds
office hours the first Wednesday of each month. Come talk to us about
anything related to Wikimedia search, Wikidata Query Service, Wikimedia
Commons Query Service, etc.!
Feel free to add your items to the Etherpad Agenda for the next meeting.
Details for our next meeting:
Date: Wednesday, November 4th, 2020
Time: 16:00-17:00 GMT / 08:00-09:00 PST / 11:00-12:00 EST / 17:00-18:00 CET
Etherpad: https://etherpad.wikimedia.org/p/Search_Platform_Office_Hours
Google Meet link: https://meet.google.com/vyc-jvgq-dww
Join by phone in the US: +1 786-701-6904 PIN: 262 122 849#
Hope to talk to you in a week!
—Trey
Trey Jones
Sr. Computational Linguist, Search Platform
Wikimedia Foundation
UTC-4 / EDT
I maintain spi-tools.js <https://en.wikipedia.org/wiki/User:RoySmith/spi-tools.js>. The source is on GitHub. At the moment, my "release process" (if you can call it that) is to edit
User:RoySmith/spi-tools.js and copy-paste the new version. This works, but it's clunky. Is there some pre-existing tool for this?
I could build a little tool to do this, but if something already exists, no need to reinvent the wheel.
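For what it's worth, if nothing turns up, a minimal Pywikibot sketch
along these lines is roughly what I'd build (it assumes a configured
user-config.py with credentials allowed to edit my own .js pages; the
file name and edit summary are just my setup):

#!/usr/bin/env python3
"""Push the local working copy of a gadget to its on-wiki page.

Sketch only: assumes Pywikibot is installed and configured with an
account that may edit User:RoySmith/spi-tools.js.
"""
from pathlib import Path

import pywikibot

LOCAL_FILE = Path("spi-tools.js")          # local git working copy
PAGE_TITLE = "User:RoySmith/spi-tools.js"  # on-wiki target

site = pywikibot.Site("en", "wikipedia")
site.login()

page = pywikibot.Page(site, PAGE_TITLE)
new_text = LOCAL_FILE.read_text()

# Only save when the wiki copy differs from the local copy.
if page.text == new_text:
    print("No changes to deploy.")
else:
    page.text = new_text
    page.save(summary="Deploy spi-tools.js from git")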
Hi,
I'm looking at solving the following console warning on ro.wp:
"JQMIGRATE: jQuery.parseJSON is deprecated; use JSON.parse" which
appears due to outdated Twinkle code. Just making the replacement does
not work, since JSON is not defined. As a matter of fact, I cannot
find it anywhere else in the code that loads on a normal Romanian
Wikipedia page.
Alas, the generic name of that object makes searching on mw.org or
Google rather useless. I can see some similar changes in Phabricator,
and those seem to work.
So, what is JSON and how can I use it in my code?
Thanks,
Strainu
P.S. Please don't suggest updating Twinkle...
Hi all,
We've entered that special time of year where the weekly deployment train
has a few disruptions coming up.
I've added a list of train disruptions through the end of the year to the
Deployments page on Wikitech[0].
Next week's train will happen, but the schedule will be shifted to allow
for a Tuesday holiday:
* Wed, 04 Nov 2020 noon PST: 1.36.0-wmf.16 Group0
* Thu, 05 Nov 2020 noon PST: 1.36.0-wmf.16 Group1
* Mon, 09 Nov 2020 noon PST: 1.36.0-wmf.16 Group2
Train will resume the week of 2020-11-17 with 1.36.0-wmf.18.
As always, the Deployment Calendar on Wikitech[0] is the best source for
this information.
Thanks all
-- Tyler
[0]: <
https://wikitech.wikimedia.org/wiki/Deployments#Upcoming_Release_Train_disr…
>