Hi all!
Since the new Stable Interface Policy[1] has come into effect, there has been some confusion about when and how the deprecation process can be accelerated or bypassed. I started a discussion about this issue on the talk page[2], and now I'm writing this email in the hope of gathering more perspectives.
tl;dr: the key question is:
Can we shorten or even entirely skip the deprecation process, if we have removed all usages of the obsolete code from public extensions?
If you are affected by the answer to this question, or you otherwise have opinions about it, please read on (ok ok, this mail is massive - at least read the proposed new wording of the policy). I'm especially interested in the opinions of extension developers.
So, let's dive in. On the one hand, the new (and old) policy states:
Code MUST emit hard deprecation notices for at least one major MediaWiki version before being removed. It is RECOMMENDED to emit hard deprecation notices for at least two major MediaWiki versions. EXCEPTIONS to this are listed in the section "Removal without deprecation" below.
This means that code that starts to emit a deprecation warning in version N can only be removed in version N+1, or better yet N+2. This effectively recommends that obsolete code be kept around for at least half a year, with a preference for a full year or more. However, we now have this exception in place:
The deprecation process may be bypassed for code that is unused within the MediaWiki ecosystem. The ecosystem is defined to consist of all actively maintained code residing in repositories owned by the Wikimedia foundation, and can be searched using the code search tool.
When TechCom added this section[3][4], we were thinking of the case where a method becomes obsolete, but is unused. In that case, why go through all the hassle of deprecation, if nobody uses it anyway?
However, what does this mean for obsolete code that *is* used? Can we just go ahead and remove the usages, and then remove the code without deprecation? That seems to be the logical consequence.
The result is a much tighter timeline from soft deprecation to removal, reducing the amount of deprecated code we have to drag along and keep functional. This would be particularly helpful when code was refactored to remove undesirable dependencies, since the dependency does not actually go away until the deprecated code has been removed.
So, if we put in the work to remove usages, can we skip the deprecation process? After all, if the code is truly unused, this would not do any harm, right? And being able to make breaking changes without the need to wait a year for them to become effective would greatly improve the speed at which we can modernize the code base.
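To make the timelines above concrete, the version arithmetic can be sketched as a tiny helper (a sketch for illustration only, not part of any MediaWiki API; it assumes version strings like "1.35" and the usual cadence of roughly two minor releases per year):

```python
# Sketch of the deprecation timeline described above. Not MediaWiki
# code; the version-string format is assumed for illustration.

def removal_versions(deprecated_in: str) -> tuple[str, str]:
    """Given the version in which hard deprecation notices first appear,
    return (earliest allowed removal, recommended removal)."""
    major, minor = (int(part) for part in deprecated_in.split("."))
    return (f"{major}.{minor + 1}", f"{major}.{minor + 2}")

# Code hard-deprecated in 1.35 may be removed in 1.36, better 1.37.
print(removal_versions("1.35"))  # -> ('1.36', '1.37')
```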
However, even skipping soft deprecation and going directly to hard deprecation of the construction of the Revision class raised concerns; see for instance https://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg92871.html.
The key concern is that we can only know about usages in repositories in our "ecosystem", a concept introduced into the policy by the section quoted above. I will go into the implications of this further below. But first, let me propose a change to the policy, to clarify when deprecation is or is not needed.
I propose that the policy should read:
Obsolete code MAY be removed without deprecation if it is unused (or appropriately gated) by any code in the MediaWiki ecosystem. Such removal must be recorded in the release notes as a breaking change without deprecation, and must be announced on the appropriate mailing lists.
Obsolete code that is still used within the ecosystem MAY be removed if it has been emitting deprecation warnings in AT LEAST one major version release, and a best effort has been made to remove any remaining usages in the MediaWiki ecosystem. Obsolete code SHOULD be removed when it has been emitting deprecation warnings for two releases, even if it is still used.
And further:
The person, team, or organization that deprecates code SHOULD drive the removal of usages in a timely manner. For code not under the control of this person, team, or organization, appropriate changes SHOULD be proposed to the maintainers, and guidance SHOULD be provided when needed.
Compared to the old process, this puts more focus on removing usages of obsolete code. Previously, we'd often just wait and hope that usages of deprecated methods would vanish eventually. That can take a long time; we still have code in MediaWiki that was deprecated in 1.24. Of course, every now and then someone fixes a bunch of usages of deprecated code, but this is a sporadic occurrence, not designed into the process.
With the change I am proposing, whoever deprecates a function also commits to removing usages of it asap. For extension developers, this means that they will get patches and support, but they may see their code broken if they do not follow up.
Now, my proposal hinges on the idea that we somehow know all relevant code that needs fixing. How can that work?
When TechCom introduced the idea of the "MediaWiki ecosystem" into the policy, our reasoning was that we want to support primarily extension developers who contribute their extensions back to the ecosystem, by making them available to the public. We found it fair to say that if people develop extensions solely for their own use, it is up to them to read the release notes. We do not need to go out of our way to protect them from changes to the code base.
Effectively, with the proposed change to the policy, maintainers of public extensions will get more support keeping their extensions compatible, while maintainers of private extensions will receive less consideration.
It seems desirable and fair to me to allow for "fast track" removal of obsolete code, but only if we create a clear process for making an extension "official". How exactly would an extension developer make sure that we know their extension, and consider it part of the ecosystem? In practice, "known code" is code accessible via codesearch[5]. But how does one get an extension into the codesearch index? There is currently no clear process for this.
Ideally, it would be sufficient to:
- create a page on mediawiki.org using the {{Extension}} infobox,
- set the status to "stable" (and maybe "beta"),
- and link to a public git repository.
It should be simple enough to create a script that feeds these repos into codesearch. A quick look at the Category:Extensions_by_status category tells me that there are about a thousand such extensions.
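The core of such a feeding script might look like the following sketch. Everything here is hypothetical: the record format, field names, and example URLs are invented for illustration; a real script would pull this data from the mediawiki.org API and talk to the codesearch configuration.

```python
# Hypothetical sketch of the feeding-script idea: given extension
# infobox data scraped from mediawiki.org (record format assumed for
# illustration), select the git repositories codesearch should index.

def repos_to_index(extensions, statuses=("stable", "beta")):
    """Keep extensions whose {{Extension}} infobox declares an accepted
    status and links a public git repository."""
    return sorted(
        ext["repo"]
        for ext in extensions
        if ext.get("status") in statuses and ext.get("repo")
    )

# Invented example records, mirroring the criteria listed above:
extensions = [
    {"name": "FooBar", "status": "stable", "repo": "https://example.org/foobar.git"},
    {"name": "Experimental", "status": "experimental", "repo": "https://example.org/exp.git"},
    {"name": "NoRepo", "status": "stable", "repo": None},
]
print(repos_to_index(extensions))  # -> ['https://example.org/foobar.git']
```

Only the stable extension with a linked repository survives the filter; the experimental one and the one without a public repository are skipped.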
So, my question to you is: do you support the change I am proposing to the policy? If not, why not? And if you do, why do you think it's helpful?
-- daniel
PS: This proposal has not yet been vetted with TechCom, it's just my personal take. It will become an RFC if needed. This is intended to start a conversation.
[1] https://www.mediawiki.org/wiki/Stable_interface_policy [2] https://www.mediawiki.org/wiki/Topic:Vrwr9aloe6y1bi2v [3] https://phabricator.wikimedia.org/T193613 [4] https://phabricator.wikimedia.org/T255803 [5] https://codesearch.wmcloud.org/search/
On 8/28/20 11:18 AM, Daniel Kinzler wrote:
Can we shorten or even entirely skip the deprecation process, if we have removed all usages of the obsolete code from public extensions?
I would support this, if only with the schadenfreude that MediaWiki will become harder for intelligence agencies and other closed-source shops to administer.
It seems totally reasonable that our "service level" is to guarantee upgradeability of our public components, with the only requirement that it must be performed one major version at a time. By definition, we can't guarantee anything about the non-public ecosystem.
Slightly off-topic: I don't see any reason to keep the suggestion that it is "even better" to wait two major revisions before removing interfaces. A non-binding suggestion doesn't seem useful, and I don't understand the use case that would be improved by waiting this extra time.
Regards, Adam
I'd like to see third party users, even those not on the mailing list, get advance notice in one release (say in the release notes) so that when the next release shows up with the deprecated code removed, they have had time to patch up any internal extensions and code they may have.
I don't want to penalize third parties who may not publish their extensions because they think the code is not good enough for public consumption or because it is very specific to their company or workflow.
I also don't want to encourage delays in updating, or the common practice of running very outdated versions of MediaWiki. Of course some folks will remain on LTS; that's what it's there for. But once a new release is out, we should want parties to be in a position to update to it immediately, at least as far as our processes go.
A delay of two releases is nice but not necessary; honestly, I'd just skip that altogether.
Just my .02 €,
Ariel
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Would it be feasible to put the deprecation notices in an early release candidate, then encourage third party extension creators to try the release candidate with deprecation notices so they'll see where there are problems in their code, and what they have to do to be ready for the final release where deprecated features are removed?
Arthur
On 8/28/20 5:51 PM, Arthur Smith wrote:
Would it be feasible to put the deprecation notices in an early release candidate, then encourage third party extension creators to try the release candidate with deprecation notices so they'll see where there are problems in their code, and what they have to do to be ready for the final release where deprecated features are removed?
What you are suggesting sounds like an interesting option to consider - please let me know if I understand your idea correctly:
When code has become obsolete, and we have removed all known usages, we should not remove the old code immediately, but we can tag it for removal *before* the next release (rather than after, per the current policy). The obsolete functionality would remain intact (but emitting warnings) in some kind of alpha-release (even before the "release candidates").
Is that what you have in mind?
What I am wondering is: when people try the alpha release, how would they even notice the deprecation warnings? These warnings are disabled by default, because they would flood the log files on a production site. So the alpha release would have to be tested in a separate environment, with development warnings enabled, and someone actually looking at the log. Typically, people only look at logs after things break.
But if the pre-release is tested in a development environment, what's the advantage of a deprecation warning over a hard error? The only difference I see is the reported log level and type of exception. I'm not sure that's worth the effort.
The same question also arises for the existing long deprecation period. My impression is that the people who should benefit from the long deprecation either notice right away and quickly fix their code (so they don't need the long deprecation), or they don't notice until things break (so they don't need the long deprecation either).
So the alpha release would have to be tested in a separate environment, with development warnings enabled, and someone actually looking at the log. Typically, people only look at logs after things break.
Is that true? I thought deprecation warnings appeared directly when viewing a page that used the deprecated code - that was my recent experience of this with the WikiPage/Revision stuff that is deprecated in 1.35 - I was experimenting with an extension (in development mode) that hadn't fixed that issue, and the warnings appeared right there on every page.
Arthur
On 8/31/20 6:52 PM, Arthur Smith wrote:
So the alpha release would have to be tested in a separate environment, with development warnings enabled, and someone actually looking at the log. Typically, people only look at logs after things break.
Is that true? I thought deprecation warnings appeared directly when viewing a page that used the deprecated code - that was my recent experience of this with the WikiPage/Revision stuff that is deprecated in 1.35 - I was experimenting with an extension (in development mode) that hadn't fixed that issue, and the warnings appeared right there on every page.
Yes, in development mode ($wgDevelopmentWarnings = true), deprecation warnings are visible.
But very commonly, people don't actively work on this "hidden code" any more. They wrote it once, it's working, and they will not look at it again until it breaks. I'm not blaming them, that's what I do for "one off" code.
If they are actively developing features, then sure. But then they are likely to read release notes, or test against master.
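As a rough analogue of how that visibility gate works (a Python sketch for illustration; the flag mirrors the MediaWiki setting, but nothing here is MediaWiki code):

```python
# Sketch of a gated deprecation notice, analogous to MediaWiki
# silencing deprecation warnings unless development warnings are
# enabled. Pure illustration, not MediaWiki code.
import warnings

development_warnings = False  # analogue of $wgDevelopmentWarnings

def wf_deprecated(func_name: str, since: str) -> None:
    """Emit a deprecation warning only when development warnings are on."""
    if development_warnings:
        warnings.warn(
            f"Use of {func_name} was deprecated in MediaWiki {since}.",
            DeprecationWarning,
            stacklevel=2,
        )

# With the flag off (the default), callers see nothing:
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    wf_deprecated("Revision::getId", "1.35")
print(len(caught))  # -> 0

# With the flag on, the warning surfaces:
development_warnings = True
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    wf_deprecated("Revision::getId", "1.35")
print(caught[0].message)
```

This is why a site tested with default settings sails through an upgrade test silently, while the same site with the flag enabled reports every deprecated call.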
Hmm, maybe we're talking past one another here? I'm assuming a developer of an extension who is interested in testing a new release - if we have a version that has things deprecated vs completely removed, that allows a quick check to see if the deprecated code affects them without going back into their own code (which may have been developed partly by somebody else so just reading release notes wouldn't clue them in that there might be a problem).
Arthur
Hi Arthur!
We were indeed thinking of different scenarios. I was thinking of someone who runs a wiki with a couple of one-off private extensions running, and now wants to update. They may well test that everything is still working with the new version of MediaWiki, but I think they would be unlikely to test with development settings enabled. The upgrade guide doesn't mention this, and even if it did, I doubt many people would remember to enable it. So they won't notice deprecations until the code is removed.
I understand your scenario to refer to an extension developer explicitly testing whether their extension works with the next release, trying it in their development environment. They would see the deprecation warnings, and address them. But in that scenario, would it be so much worse to see fatal errors instead of deprecation warnings?
This is not meant to be a loaded question. I'm trying to understand what the practical consequences would be. Fatal errors are of course less nice, but in a testing environment, not a real problem, right? I suppose deprecation warnings can provide better information than fatal errors would, but one can also find this information in the release notes, once it is clear what to look for.
Also note that this would only affect private extensions. Public extensions would receive support up front, and early removal of the obsolete code would be blocked until all known extensions are fixed.
Thank you for your thoughts! -- daniel
Honestly, if you want a deprecation policy, warnings need to be emitted for at least one 1.x version. Anything less than that is pointless from an end-user perspective. We tend to wait for final releases to limit bug exposure. If something breaks, and it's not clear exactly what the cause is, using incremental updates to figure out the breakage is the solution normally applied. One of the key reasons to have a deprecation policy is to limit the breakage that can happen. If the MediaWiki culture is shifting to a "screw non-published extensions" policy, we might as well not have a deprecation policy at all. However, if the historical spirit of MW is maintained, having such a policy is critical. MediaWiki is re-used by a lot of different groups, not all of whom are able or willing to publish extension code, for a number of reasons. Taking the "not my problem" approach leaves a sour taste in my mouth. Honestly, if you cannot maintain compatibility for at least one release cycle, how much damage are you going to create?
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
I like the idea of streamlining deprecation and avoiding the cost of maintaining obsolete code. I also **want** to publish my code on Gerrit.
As a 3rd-party extension developer who doesn't write a lot of code, one of the biggest complaints that I have is that it's "hard" to publish your work in Gerrit (and benefit from the visibility of being in the MediaWiki ecosystem). It's very easy to create a new repo at GitHub. It would be wonderful if there were some facility to "import" GitHub repos into Gerrit. (Aside: I just read the Gerrit TL;DR [1] and like how it's concise and to the point.) The same goes for Gerrit code review. It's easy to create or review pull requests on GitHub, but unfamiliar/awkward in Gerrit. I understand that a transition to GitLab is underway, and welcome the usability improvements that I anticipate from that. Of course I don't expect WMF to be able to duplicate the GitHub experience, but the more that can be done to improve the contribution workflows, browser-based interaction, and collaboration or integration with GitHub, the better.
This may all seem tangential to the topic of the Stable Interface policy, but I believe there is a substantial "invisible" ecosystem of extensions developed by third-party developers. Most wikis I've encountered have developed in-house extensions of some kind. The harder it is for them to maintain their code to conform with MW development and deprecations, the more 3rd-party wikis will move to alternatives like Confluence or Microsoft SharePoint.
[1] https://www.mediawiki.org/wiki/Gerrit/Tutorial/tl;dr
Greg Rundlett https://eQuality-Tech.com https://freephile.org
On Fri, Aug 28, 2020 at 7:41 AM Ariel Glenn WMF ariel@wikimedia.org wrote:
I'd like to see third party users, even those not on the mailing list, get advance notice in one release (say in the release notes) so that when the next release shows up with the deprecated code removed, they have had time to patch up any internal extensions and code they may have.
I don't want to penalize third parties who may not publish their extensions because they think the code is not good enough for public consumption or because it is very specific to their company or workflow.
I also don't want to encourage delays in updating, or the common practice of running very outdated versions of MediaWiki. Of course some folks will remain on LTS; that's what it's there for. But once a new release is out, we should want parties to be in a position to update to it immediately, at least as far as our processes go.
A delay of two releases is nice but not necessary and honestly I'd just skip that altogether.
Just my .02 €,
Ariel
On Fri, Aug 28, 2020 at 12:19 PM Daniel Kinzler dkinzler@wikimedia.org wrote:
Hi all!
Since the new Stable Interface Policy[1] has come into effect, there has been some confusion about when and how the deprecation process can be accelerated or bypassed. I started a discussion about this issue on the talk page[2], and now I'm writing this email in the hope of gathering more perspectives.
tl;dr: the key question is:
Can we shorten or even entirely skip the deprecation process, if we have removed all usages of the obsolete code from public extensions?
If you are affected by the answer to this question, or you otherwise have opinions about it, please read on (ok ok, this mail is massive - at least read the proposed new wording of the policy). I'm especially interested in the opinions of extension developers.
So, let's dive in. On the one hand, the new (and old) policy states:
Code MUST emit hard deprecation notices for at least one major MediaWiki version before being removed. It is RECOMMENDED to emit hard deprecation notices for at least two major MediaWiki versions. EXCEPTIONS to this are listed in the section "Removal without deprecation" below.
This means that code that starts to emit a deprecation warning in version N can only be removed in version N+1, better even N+2. This effectively recommends that obsolete code be kept around for at least half a year, with a preference for a full year and more. However, we now have this exception in place:
The deprecation process may be bypassed for code that is unused within the MediaWiki ecosystem. The ecosystem is defined to consist of all actively maintained code residing in repositories owned by the Wikimedia foundation, and can be searched using the code search tool.
When TechCom added this section[3][4], we were thinking of the case where a method becomes obsolete, but is unused. In that case, why go through all the hassle of deprecation, if nobody uses it anyway?
However, what does this mean for obsolete code that *is* used? Can we just go ahead and remove the usages, and then remove the code without deprecation? That seems to be the logical consequence.
The result is a much tighter timeline from soft deprecation to removal, reducing the amount of deprecated code we have to drag along and keep functional. This would be helpful particularly when code was refactored to remove undesirable dependencies, since the dependency will not actually go away until the deprecated code has been removed.
So, if we put in the work to remove usages, can we skip the deprecation process? After all, if the code is truly unused, this would not do any harm, right? And being able to make breaking changes without the need to wait a year for them to become effective would greatly improve the speed at which we can modernize the code base.
However, even skipping soft deprecation and going directly to hard deprecation of the construction of the Revision class raised concerns, see for instance <https://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg92871.html>.
The key concern is that we can only know about usages in repositories in our "ecosystem", a concept introduced into the policy by the section quoted above. I will go into the implications of this further below. But first, let me propose a change to the policy, to clarify when deprecation is or is not needed.
I propose that the policy should read:
Obsolete code MAY be removed without deprecation if it is unused (or appropriately gated) by any code in the MediaWiki ecosystem. Such removal MUST be recorded in the release notes as a breaking change without deprecation, and MUST be announced on the appropriate mailing lists. Obsolete code that is still used within the ecosystem MAY be removed if it has been emitting deprecation warnings for AT LEAST one major version release, and a best effort has been made to remove any remaining usages in the MediaWiki ecosystem. Obsolete code SHOULD be removed when it has been emitting deprecation warnings for two releases, even if it is still used.
And further:
The person, team, or organization that deprecates code SHOULD drive the removal of usages in a timely manner. For code not under the control of this person, team, or organization, appropriate changes SHOULD be proposed to the maintainers, and guidance SHOULD be provided when needed.
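For readers less familiar with the mechanics the proposal refers to: in MediaWiki core, soft deprecation is an annotation only, while hard deprecation means the code actually emits a runtime warning via wfDeprecated(). A sketch (class, method, and version numbers are invented for illustration):

```php
class PageRenamer {
	/**
	 * @deprecated since 1.35, use renameTo() instead.
	 * Soft deprecation: annotation only, no runtime warning yet.
	 */
	public function move( string $newName ) {
		// Hard deprecation: uncommenting this line makes every caller
		// emit a deprecation warning (visible with $wgDevelopmentWarnings),
		// starting the clock on the removal timeline discussed above.
		// wfDeprecated( __METHOD__, '1.36' );
		return $this->renameTo( $newName );
	}

	public function renameTo( string $newName ) {
		// ... new implementation ...
	}
}
```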
Compared to the old process, this puts more focus on removing usages of obsolete code. Previously, we'd often just wait and hope that usages of deprecated methods would vanish eventually. That may take a long time: we still have code in MediaWiki that was deprecated in 1.24. Of course, every now and then someone fixes a bunch of usages of deprecated code, but this is a sporadic occurrence, not designed into the process.
With the change I am proposing, whoever deprecates a function also commits to removing usages of it asap. For extension developers, this means that they will get patches and support, but they may see their code broken if they do not follow up.
Now, my proposal hinges on the idea that we somehow know all relevant code that needs fixing. How can that work?
When TechCom introduced the idea of the "MediaWiki ecosystem" into the policy, our reasoning was that we want to support primarily extension developers who contribute their extensions back to the ecosystem, by making them available to the public. We found it fair to say that if people develop extensions solely for their own use, it is up to them to read the release notes. We do not need to go out of our way to protect them from changes to the code base.
Effectively, with the proposed change to the policy, maintainers of public extensions will get more support keeping their extensions compatible, while maintainers of private extensions will receive less consideration.
It seems desirable and fair to me to allow for "fast track" removal of obsolete code, but only if we create a clear process for making an extension "official". How exactly would an extension developer make sure that we know their extension, and consider it part of the ecosystem? In practice, "known code" is code accessible via codesearch[5]. But how does one get an extension into the codesearch index? There is currently no clear process for this.
Ideally, it would be sufficient to:
- create a page on mediawiki.org using the {{Extension}} infobox,
- set the status to "stable" (and maybe "beta"),
- and link to a public git repository.
It should be simple enough to create a script that feeds these repos into codesearch. A quick look at the Category:Extensions_by_status category tells me that there are about a thousand such extensions.
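Such a script could, for example, pull extension pages from the mediawiki.org API and then cross-reference their infobox repo links. A minimal sketch (the category name, the idea of emitting a plain repo list for codesearch, and the helper below are all assumptions on my part, not an existing tool):

```php
// Sketch: collect candidate extension pages from mediawiki.org.
// The parsing logic is a pure function so it can be tested without
// network access; it reads an action=query&list=categorymembers result.
function extractMemberTitles( array $apiResult ): array {
	$titles = [];
	foreach ( $apiResult['query']['categorymembers'] ?? [] as $member ) {
		if ( isset( $member['title'] ) ) {
			$titles[] = $member['title'];
		}
	}
	return $titles;
}

// Usage against the live API would look roughly like (hypothetical):
// $url = 'https://www.mediawiki.org/w/api.php?action=query'
//     . '&list=categorymembers&cmtitle=Category:Stable_extensions'
//     . '&cmlimit=500&format=json';
// $titles = extractMemberTitles(
//     json_decode( file_get_contents( $url ), true ) );
```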
So, my question to you is: do you support the change I am proposing to the policy? If not, why not? And if you do, why do you think it's helpful?
-- daniel
PS: This proposal has not yet been vetted with TechCom, it's just my personal take. It will become an RFC if needed. This is intended to start a conversation.
[1] https://www.mediawiki.org/wiki/Stable_interface_policy [2] https://www.mediawiki.org/wiki/Topic:Vrwr9aloe6y1bi2v [3] https://phabricator.wikimedia.org/T193613 [4] https://phabricator.wikimedia.org/T255803 [5] https://codesearch.wmcloud.org/search/
-- Daniel Kinzler Principal Software Engineer, Core Platform Wikimedia Foundation
Hi Greg, thanks for your reply!
On 28.08.20 at 18:26, Greg Rundlett (freephile) wrote:
I like the idea of streamlining deprecation and avoiding the cost of maintaining obsolete code. I also **want** to publish my code on Gerrit.
Just a quick clarification: while the current policy only considers code to be part of the "ecosystem" if it's on Gerrit, what I proposed in my mail would mean that the extension could be hosted anywhere, as long as it is public and has a page on mediawiki.org.
Hi Daniel,
I support your proposal.
Re: Ariel, I appreciate your argument; however, I think the deprecation policy will be used in good faith. Fast deprecations are really helpful for code that is not being used. If one expects that a feature is used in hidden code, people will probably not deprecate it too fast, especially if there is a lot of visible code to refactor.
Best Moritz (physikerwelt)
http://moritzschubotz.de | +49 1578 047 1397
On 28.08.20 at 21:47, Physikerwelt wrote:
I appreciate your argument; however, I think the deprecation policy will be used in good faith. Fast deprecations are really helpful for code that is not being used. If one expects that a feature is used in hidden code, people will probably not deprecate it too fast, especially if there is a lot of visible code to refactor.
Hi Moritz!
I think you are touching on the core of the issue: we need to figure out how much we care about "hidden usages" in code that is not shared back to the community.
For a long time, the answer has been "very much", so we worked as if MediaWiki was a framework developed for other people's use, providing a maximum of backwards compatibility. However, this comes with a very real cost in terms of development speed and code complexity.
The other extreme would be saying "not at all". Then we wouldn't need release notes. Maybe we wouldn't even need releases. That would be rather harsh.
Perhaps it would be helpful to be more specific about what code we are talking about. I think that for code we release as standalone libraries, we should ensure compliance with the principles of semantic versioning, and avoid inconvenience for 3rd party users.
However, for MediaWiki core, I have come around to thinking that we should not allow ourselves to be held back too much by the needs of "hidden usages". We really need to modernize the codebase, and that means breaking changes. Dragging along a compatibility layer means we cannot benefit from the changes we have made until we can drop that layer. So I'd rather that be months, not years, as it has been in the past.
So, for core, I think we should only care about usage in non-public code "a little". In exchange, we should better support updating 3rd-party code that is public.
On 28/08/2020 18:26, Greg Rundlett (freephile) wrote:
I like the idea of streamlining deprecation and avoiding the cost of maintaining obsolete code. I also **want** to publish my code on Gerrit.
As a 3rd-party extension developer who doesn't write a lot of code, one of the biggest complaints that I have is that it's "hard" to publish your work in Gerrit (and benefit from the visibility of being in the MediaWiki ecosystem). It's very easy to create a new repo at GitHub. It would be wonderful if there was some facility to "import" GitHub repos into Gerrit.
<snip>
Hello,
Indeed, repository creation is restricted to a handful of people, since the repositories are shared among all users, unlike GitHub, where you have your own user/org namespace in which you can create any repositories you want.
A request for a new repository can be made on the wiki page: https://www.mediawiki.org/wiki/Gerrit/New_repositories/Requests
And it can be asked to import an existing Github repository, which is merely about:
git clone --mirror <github url>
git push --mirror <gerrit url>
https://www.mediawiki.org/wiki/Git/Creating_new_repositories#Importing_from_...
From there you will eventually get the benefit of our homegrown CI system and routine maintenance by various bots (localization updates, deprecation cleanup, library updates, latest code styling, etc.).
The Gerrit and GitHub models are not that different. In the GitHub model one does:
1) fork the repository
2) push to a user branch
3) open a pull request
4) amend OR add commits and push until the change is merged
In Gerrit that is:
1) clone the repository
2) push to the special refs/for/<target branch>
3) **amend** and push until the change is merged
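In concrete commands, the Gerrit flow sketched above looks roughly like this (a sketch; it assumes the remote is named origin and the target branch is master, and the git-review tool wraps the same push):

```shell
# One-time setup: install the commit-msg hook that adds a Change-Id
# footer (Gerrit uses it to group amended revisions of one change).
# scp -p -P 29418 <user>@gerrit.wikimedia.org:hooks/commit-msg .git/hooks/

git clone https://gerrit.wikimedia.org/r/mediawiki/core
cd core
git checkout -b my-fix
# ... edit files, then commit ...
git commit -a

# Push the change for review against the master branch:
git push origin HEAD:refs/for/master

# Address review comments by amending the same commit and re-pushing:
git commit -a --amend
git push origin HEAD:refs/for/master
```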
I am eliding the maintenance of a series of commits, which is arguably easier in the GitHub model since each pull request covers a whole branch.
There are some basics at:
https://www.mediawiki.org/wiki/Gerrit/Tutorial/tl;dr
Still, yes, repository creation is a bit annoying, and maybe that can be automated.
It seems desirable and fair to me to allow for "fast track" removal of obsolete code, but only if we create a clear process for making an extension "official". How exactly would an extension developer make sure that we know their extension, and consider it part of the ecosystem? In practice, "known code" is code accessible via codesearch[5]. But how does one get an extension into the codesearch index? There is currently no clear process for this.
Ideally, it would be sufficient to:
- create a page on mediawiki.org using the {{Extension}} infobox,
- setting the status to "stable" (and maybe "beta"),
- and linking to a public git repository.
It should be simple enough to create a script that feeds these repos into codesearch. A quick look at the Category:Extensions_by_status category tells me that there are about a thousand such extensions.
A clear and straightforward policy for getting things "in" sounds great. However, this might encourage the addition of extensions that are ultimately abandoned and which themselves become a code maintenance burden. We should also consider our policy for getting things "out". This is often a more difficult issue.
Bill Pirkle Software Engineer www.wikimediafoundation.org
On 31/08/2020 18:58, Bill Pirkle wrote:
A clear and straightforward policy for getting things "in" sounds great. However, this might encourage the addition of extensions that are ultimately abandoned and which themselves become a code maintenance burden. We should also consider our policy for getting things "out". This is often a more difficult issue.
Hello,
Abandoned extensions are definitely a burden. The good news is that we do archive them eventually. Several people take care of investigating public usage, interest by past authors, or whether an extension got superseded by another. The bulk of the work is done in Phabricator under the #cleanup project:
https://phabricator.wikimedia.org/tag/projects-cleanup/
(The 'Fill an archive request' link prepopulates the task form with a checklist of actions to complete in order to have a repository fully archived.)
Hi,
On 2020-08-28 02:18, Daniel Kinzler wrote:
tl;dr: the key question is:
Can we shorten or even entirely skip the deprecation process, if we have removed all usages of the obsolete code from public extensions?
I think going down this road would be a mistake, mostly because it hides the real problem of providing stable, long-term APIs. Core is the platform that extensions are built on top of. If core is constantly changing, it becomes really hard to make stable extensions. I think the mentality should be "how do we make this more stable for developers without adding too much burden on core devs?" rather than figuring out how core devs can break stuff as fast as possible.
If MediaWiki, plus the extensions someone wants to use, isn't stable, sysadmins are either going to stay on an older version, or stop using MediaWiki. We don't want either of those.
A few years ago it was fully possible to write a basic parser tag/hook extension that would've worked fine with no changes from ~1.17 (post-ResourceLoader) to ~1.31 (pre-MediaWikiServices-ification). Is that still possible? If not, can we make it possible? Most extension developers aren't full time MediaWiki devs, and we shouldn't make that a requirement, even implicitly.
When we change an API, why is it impossible to keep the old functions around for a year? In some complicated cases sure, it's not fully possible. But in most cases I think it is. We have to write the back-compat code anyways for deployment purposes, so keeping it around isn't really extra work.
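To make that concrete: keeping the old function around is often just a thin delegating wrapper plus a deprecation notice. A self-contained sketch in plain PHP (all names invented; MediaWiki core would use wfDeprecated() rather than trigger_error()):

```php
// Sketch of a back-compat shim: the old entry point survives for a
// release cycle, warns its callers, and delegates to the new code.

// New API: explicit arguments instead of an options array.
function renderSnippet( string $text, bool $stripTags ): string {
	return $stripTags ? strip_tags( $text ) : $text;
}

// Old API, kept as a deprecated wrapper that forwards to the new one.
function formatSnippet( string $text, array $options = [] ): string {
	// In MediaWiki core this would be wfDeprecated( __FUNCTION__, '1.36' ).
	trigger_error(
		'formatSnippet() is deprecated; use renderSnippet()',
		E_USER_DEPRECATED
	);
	return renderSnippet( $text, !empty( $options['stripTags'] ) );
}
```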
Broadly I think it's important to remember that every single deprecation has a cost larger than the actual rote code change: developers have to update their mental model to use the new code, which is probably the most expensive cost we have to pay.
I also think before getting set to speed up the deprecation policy, we should be seeing if the current one works well. The impression I'm getting is that more and more people are sticking to LTS releases because extensions, skins, etc. are breaking every upgrade and it's just less work to fix it every 2-3 years rather than every 6 months.
So, let's dive in. On the one hand, the new (and old) policy states:
Code MUST emit hard deprecation notices for at least one major MediaWiki version before being removed. It is RECOMMENDED to emit hard deprecation notices for at least two major MediaWiki versions. EXCEPTIONS to this are listed in the section "Removal without deprecation" below.
This means that code that starts to emit a deprecation warning in version N can only be removed in version N+1, better even N+2. This effectively recommends that obsolete code be kept around for at least half a year, with a preference for a full year and more. However, we now have this exception in place:
The deprecation process may be bypassed for code that is unused within the MediaWiki ecosystem. The ecosystem is defined to consist of all actively maintained code residing in repositories owned by the Wikimedia foundation, and can be searched using the code search tool.
When TechCom added this section[3][4], we were thinking of the case where a method becomes obsolete, but is unused.
I would note that the original deprecation policy I wrote had this clause in it: "Extension developers are encouraged to mirror their code into Wikimedia's Gerrit/Phabricator/Github to make it easier for core developers to identify usage patterns. Extensions that are open source will be given more consideration than those that core developers cannot see." That explicitly said open source extensions are favored, but didn't leave private ones out entirely.
I wasn't involved in the drafting of this new exception, but it has several flaws that immediately jump out to me:
1) "actively maintained" is a trap: code can be passively maintained because it's small enough to be stable. But that doesn't mean it should be excluded.
2) I would hope there are no repositories that are solely "owned" by the Wikimedia Foundation. Defining the "MediaWiki ecosystem" as code owned by the WMF is mind-boggling.
However, what does this mean for obsolete code that *is* used? Can we just go ahead and remove the usages, and then remove the code without deprecation? That seems to be the logical consequence.
The result is a much tighter timeline from soft deprecation to removal, reducing the amount of deprecated code we have to drag along and keep functional. This would be helpful particularly when code was refactored to remove undesirable dependencies, since the dependency will not actually go away until the deprecated code has been removed.
So, if we put in the work to remove usages, can we skip the deprecation process? After all, if the code is truly unused, this would not do any harm, right? And being able to make breaking changes without the need to wait a year for them to become effective would greatly improve the speed at which we can modernize the code base.
However, even skipping soft deprecation and going directly to hard deprecation of the construction of the Revision class raised concerns, see for instance https://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg92871.html.
The Revision class is (was) probably the second or third most used class after Title and User. That seems like a bad target to use as a case study for speeding up deprecation. Those cases should go slower rather than faster.
The key concern is that we can only know about usages in repositories in our "ecosystem", a concept introduced into the policy by the section quoted above. I will go into the implications of this further below. But first, let me propose a change to the policy, to clarify when deprecation is or is not needed.
I propose that the policy should read:
Obsolete code MAY be removed without deprecation if it is unused (or appropriately gated) by any code in the MediaWiki ecosystem. Such removal MUST be recorded in the release notes as a breaking change without deprecation, and MUST be announced on the appropriate mailing lists. Obsolete code that is still used within the ecosystem MAY be removed if it has been emitting deprecation warnings for AT LEAST one major version release, and a best effort has been made to remove any remaining usages in the MediaWiki ecosystem. Obsolete code SHOULD be removed when it has been emitting deprecation warnings for two releases, even if it is still used.
And further:
The person, team, or organization that deprecates code SHOULD drive the removal of usages in a timely manner. For code not under the control of this person, team, or organization, appropriate changes SHOULD be proposed to the maintainers, and guidance SHOULD be provided when needed.
Compared to the old process, this puts more focus on removing usages of obsolete code. Previously, we'd often just wait and hope that usages of deprecated methods would vanish eventually. That may take a long time: we still have code in MediaWiki that was deprecated in 1.24. Of course, every now and then someone fixes a bunch of usages of deprecated code, but this is a sporadic occurrence, not designed into the process.
In the past we were able to take advantage of Google Code-In students that usually made a ton of progress in this area, but that's not really a possibility anymore :(
With the change I am proposing, whoever deprecates a function also commits to removing usages of it asap. For extension developers, this means that they will get patches and support, but they may see their code broken if they do not follow up.
This used to be common practice before the 1.29 Deprecation policy came into effect, but it was removed because people (including myself) thought it was too burdensome. You want to make a change to core to fix something, and now you're on the hook for updating 50 extensions too? No thanks, we'll just hack around it in our extension and move on. I don't think we should go back to that model.
Now, my proposal hinges on the idea that we somehow know all relevant code that needs fixing. How can that work?
When TechCom introduced the idea of the "MediaWiki ecosystem" into the policy, our reasoning was that we want to support primarily extension developers who contribute their extensions back to the ecosystem, by making them available to the public. We found it fair to say that if people develop extensions solely for their own use, it is up to them to read the release notes. We do not need to go out of our way to protect them from changes to the code base.
Effectively, with the proposed change to the policy, maintainers of public extensions will get more support keeping their extensions compatible, while maintainers of private extensions will receive less consideration.
As written, maintainers of private extensions get *no* consideration.
And when we say private, it's not always intentionally private. Consider LocalSettings.php hacks or someone making live server hacks and forgetting to commit.
It seems desirable and fair to me to allow for "fast track" removal of obsolete code, but only if we create a clear process for making an extension "official". How exactly would an extension developer make sure that we know their extension, and consider it part of the ecosystem? In practice, "known code" is code accessible via codesearch[5]. But how does one get an extension into the codesearch index? There is currently no clear process for this.
On https://www.mediawiki.org/wiki/Codesearch#Included_repositories: "Additional repositories can be added upon request". Please let me know (or edit the page) if something isn't clear about that.
Ideally, it would be sufficient to:
- create a page on mediawiki.org using the {{Extension}} infobox,
- set the status to "stable" (and maybe "beta"),
- and link to a public git repository.
It should be simple enough to create a script that feeds these repos into codesearch. A quick look at the Category:Extensions_by_status category tells me that there are about a thousand such extensions.
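To make this concrete, here is a minimal sketch of what such a harvesting script could look like. This is my own illustration, not an existing tool: the category name "Category:Stable extensions", the idea that repository URLs can be scraped straight out of the {{Extension}} infobox wikitext, and the assumption that codesearch could consume a plain list of repo URLs are all guesses that would need checking against the actual template and indexer.

```python
"""Hypothetical sketch: harvest public repo URLs from mediawiki.org
extension pages, to feed them into the codesearch index. The category
name and the infobox scraping approach are assumptions, not facts."""

import json
import re
import urllib.parse
import urllib.request

API = "https://www.mediawiki.org/w/api.php"

# Match anything that looks like a public git repository URL: either an
# explicit *.git URL, or a URL on one of a few well-known hosting sites.
REPO_RE = re.compile(
    r"https?://[^\s|\]}]+\.git"
    r"|https?://(?:github\.com|gitlab\.com|gerrit\.wikimedia\.org)[^\s|\]}]*"
)


def extract_repo_urls(wikitext):
    """Pull candidate repository URLs out of an extension page's wikitext."""
    return REPO_RE.findall(wikitext)


def category_members(category, limit=500):
    """List page titles in a mediawiki.org category via the Action API."""
    params = urllib.parse.urlencode({
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,
        "cmlimit": limit,
        "format": "json",
    })
    with urllib.request.urlopen(API + "?" + params) as resp:
        data = json.load(resp)
    return [m["title"] for m in data["query"]["categorymembers"]]


if __name__ == "__main__":
    # Walk every page in the (assumed) category; fetching each page's
    # wikitext and running extract_repo_urls() on it is left out here
    # to keep the sketch short.
    for title in category_members("Category:Stable extensions"):
        print(title)
```

A real version would also need to handle API pagination, deduplicate repos that are already in gerrit, and deal with the staleness problem mentioned below.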
And how do we determine if the code is still being maintained or used? That's already starting to become a problem: https://phabricator.wikimedia.org/T241320. I suspect if you wrote said script, the quality and usability of search results would drop drastically.
This is also a big step towards centralization (you only get support if your extension is listed on mediawiki.org), which I think most of us are opposed to philosophically.
-- Legoktm
Hi!
For the BlueSpice distribution ...
* we have got ~90 active repos hosted on WMF Gerrit and another ~10 in our internal Gitlab
* we want to develop as much as possible on the public infrastructure of the WMF, so the remaining internal repos will (hopefully) be published in the future
* we have hundreds of installations and a lot of them have a custom extension/skin maintained by us, that can not be published
* we mainly target MediaWiki core LTS branches with our code, so deprecations and breaking changes introduced in a non-LTS release are usually not a problem. But ...
* we try to keep up with the changes in the core `master` branch mainly for two reasons:
1) CI tests of our code's `master` branch will fail whenever we cherry-pick a change from the LTS branch
2) There is less work left for when a new LTS is about to be released (as is just now happening with 1.35)
I believe the proposed process change will not affect us much. We will benefit from a cleaner, modernized MediaWiki core codebase more than we will suffer from breaking changes, especially as a lot of our code will be in publicly available locations and therefore visible to the WMF anyway. As long as we have a way to know about such breaking changes (mailing list, mw.org pages, release notes, ...) - and how to handle them - we should be good.
--
Robert Vogel
On Thu, Sep 3, 2020 at 7:26 AM Robert Vogel vogel@hallowelt.com wrote:
For the BlueSpice distribution ...
- we have got ~90 active repos hosted on WMF Gerrit and another ~10 in our
internal Gitlab
- we want to develop as much as possible on the public infrastructure of
the WMF, so the remaining internal repos will (hopefully) be published in the future
Note that any public Git repo would suffice for the purposes of this proposal. The Codesearch tool, which is mainly used for this in practice, already indexes various third-party hosted Git repositories, including some hosted on GitHub.com. https://codesearch.wmcloud.org/search/
You can file a task for additional repos to be indexed: https://phabricator.wikimedia.org/tag/vps-project-codesearch/
-- Timo
On Fri, 28 Aug 2020 at 11:19, Daniel Kinzler dkinzler@wikimedia.org wrote:
Hi all!
Since the new Stable Interface Policy[1] has come into effect, there has been some confusion about when and how the deprecation process can be accelerated or bypassed.
The SIP is very well written, great work! Recently I found that it could also include some guidance on how to deprecate the overriding of methods. This is a corner case of method deprecation that has to be handled quite differently. The lack of instructions for this case led to confusion and the unintended removal of a step in stylesheet loading, breaking about 30 non-core skins. Sidenote: the skins are still broken, and this has to be fixed before 1.36-beta (see T266735 https://phabricator.wikimedia.org/T266735).
I'd like to fill this gap, so I've prepared a discussion of how to formalize this in T267085 https://phabricator.wikimedia.org/T267085. The information that needs to be included in the SIP is compiled in the "Documentation" section. I assume the writers of the SIP will have a clear vision of how to include it, and I'd like to ask you to share it on the ticket.
The technical solution (T267080 https://phabricator.wikimedia.org/T267080) uses a helper trait per Danny's suggestion. Feedback and reviews are welcome.
Code MUST emit hard deprecation notices for at least one major
MediaWiki version before being removed. It is RECOMMENDED to emit hard deprecation notices for at least two major MediaWiki versions.
The "Deprecation https://www.mediawiki.org/wiki/Stable_interface_policy#Deprecation" section of the SIP is very well polished IMO; the steps reflect what common sense and best practices dictate. The steps sound obvious to me, therefore I don't understand how we ended up breaking this many skins without a warning, a release note, an email to wikitech, or any clue as to why those skins turned into a webpage from 1995.
Even more concerning is that after I reported the issue nothing happened for 3 months. At that time I submitted a patch which was claimed to be "unnecessary". Having a background in startups and generally very result-oriented projects, I feel like this dynamic is very unfamiliar to me and I find it hard to understand.
It seems to me this won't be the last such case. How should we prevent issues that break the SIP, and how should we handle and report them if they happen despite our best efforts? One idea: there could be a project tag in Phabricator for "Stable Interface". I'm interested in your ideas.
— Demian. I solve problems.
wikitech-l@lists.wikimedia.org