Cross-posted to http://techblog.wikimedia.org/2010/07/mediawiki-version-statistics/
Some kind people at Qualys have surveyed versions of open source web apps present on the web, including MediaWiki. Here is the relevant page from their presentation:
For the original see:
https://community.qualys.com/docs/DOC-1401
And the press release:
http://www.qualys.com/company/newsroom/newsreleases/usa/view/2010-07-28/
They make the point that 95% of MediaWiki installations have a "serious vulnerability", whereas only 4% of WordPress installations do. While WordPress's web-based upgrade utility certainly has a positive impact on security, I feel I should point out that what WordPress counts as a serious vulnerability does not align with MediaWiki's definition of the same term.
For instance, if a web-based user could execute arbitrary PHP code on the server, compromising all data and user accounts, we would count that as the most serious sort of vulnerability, and we would do an immediate release to fix it. We're proud of the fact that we haven't had any such vulnerability in a stable release since 1.5.3 (December 2005).
In WordPress, however, this counts as a feature, and all administrators can do it. Similarly, WordPress avoids the difficult problem of sanitising HTML and CSS while preserving a rich feature set by simply allowing all authors to post raw HTML.
If you are running MediaWiki in a CMS-like mode, with editing and account creation restricted to a whitelist, then I think it's fair to say that in terms of security, you're better off with MediaWiki 1.14.1 or later than you are with the latest version of WordPress.
However, the statistics presented by Qualys show that an alarming number of people are running versions of MediaWiki older than 1.14.1, which was the most recent fix for an XSS vulnerability exploitable without special privileges. There is certainly room for us to do better.
We have a new installer project in development, which we hope to release in 1.17. It includes a feature which encourages users to sign up for our release announcements mailing list. But maybe we need to do more. Should we take a leaf from WordPress's book, and nag administrators with a prominent notice when they are not using the latest version? Such a feature would require MediaWiki to "dial home", which is controversial in our developer community.
-- Tim Starling
If MediaWiki dials home, it should be configurable in such a way that it can be turned off. There are instances in use in places that would prefer not to have their presence known. Enterprise use in general fits this category.
Respectfully,
Ryan Lane
Hey,
As many of you probably already know, my Google Summer of Code project [0] aims at providing this exact "dial home" functionality, for both MediaWiki core and extensions. (The project's goal is wider than this, but this is included as one of the main features.)
If MediaWiki dials home, it should be configurable in such a way that it can be turned off. There are instances in use in places that would prefer not to have their presence known. Enterprise use in general fits this category.
I totally agree here with Ryan. The idea is to have the "repository" where the version data is fetched from be configurable, so it's possible to have distributors other than the WMF, and to turn off the feature entirely.
I'm currently looking into the repository and package fetching parts to allow for such "dialling home". MediaWiki.org seems the obvious choice to host the main repository on. There are many ways to then provide the needed data. Personally I think the best approach would be to install Semantic MediaWiki (yes, I used the s-word!) so data from the extension pages can be queried and shown in a distribution metadata format. That might require a small extension for some new special pages, and some scripts to collect other existing version data and put it into the wiki.
Is it possible to get SMW onto MW.org? This would also finally be a proof of concept of SMW on a WMF wiki, on which a lot of people have been waiting a long time now.
With only a little over 3 weeks left in GSoC, I doubt this project will be finished in time, so any help in any form is definitely welcome.
[0] https://secure.wikimedia.org/wikipedia/mediawiki/wiki/Deployment
Cheers
-- Jeroen De Dauw * http://blog.bn2vs.com * http://wiki.bn2vs.com Don't panic. Don't be evil. 50 72 6F 67 72 61 6D 6D 69 6E 67 20 34 20 6C 69 66 65! --
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
On 30.07.2010, 17:44 Jeroen wrote:
I'm currently looking into the repository and package fetching parts to allow for such "dialling home". MediaWiki.org seems the obvious choice to host the main repository on. There are many ways to then provide the needed data. Personally I think the best approach would be to install Semantic MediaWiki (yes, I used the s-word!) so data from the extension pages can be queried and shown in a distribution metadata format. That might require a small extension for some new special pages, and some scripts to collect other existing version data and put it into the wiki.
There's already http://www.mediawiki.org/wiki/Extension:MWReleases that does the server part of version checks for core; it could be tweaked to supply version information for extensions, too.
Is it possible to get SMW onto MW.org? This would also finally be a proof of concept of SMW on a WMF wiki, on which a lot of people have been waiting a long time now.
Last time I heard about it, it had huge problems with security and code quality. Did anything change positively in that area over the last several months? If s***c developers believe that all Tim's concerns have been addressed, they should resubmit it for review.
On 30.07.2010, 18:20 /me wrote:
There's already http://www.mediawiki.org/wiki/Extension:MWReleases that does the server part of version checks for core
And we forgot to update it when 1.16 was released, wheee! Added to release checklist now.
Hey,
There's already http://www.mediawiki.org/wiki/Extension:MWReleases that does the server part of version checks for core; it could be tweaked to supply version information for extensions, too.
Although that suffices for determining if your version is up to date or not, it does not allow for actual update fetching and all the related stuff such as dependency resolution and simply browsing through available extensions in the repository, as you have with WordPress.
When the system is to phone home for updates to the software, both core and extensions, it makes sense to integrate the LocalisationUpdate functionality and make it a more complete package.
Yes, that makes a lot of sense. I was not aware this functionality existed, so I'm definitely going to have a look at it now.
Cheers
-- Jeroen De Dauw * http://blog.bn2vs.com * http://wiki.bn2vs.com Don't panic. Don't be evil. 50 72 6F 67 72 61 6D 6D 69 6E 67 20 34 20 6C 69 66 65! --
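Jeroen mentions dependency resolution as one of the pieces an update fetcher needs beyond version checks. A minimal sketch of that piece, assuming extensions declare dependencies on other extensions by name (the declaration format is hypothetical; MediaWiki had no such API at the time):

```python
def install_order(deps: dict) -> list:
    """Topologically sort extensions so each one appears after its
    dependencies; raises on circular dependencies."""
    order, state = [], {}  # state: 1 = visiting, 2 = done

    def visit(name):
        if state.get(name) == 2:
            return
        if state.get(name) == 1:
            raise ValueError("circular dependency involving " + name)
        state[name] = 1
        for dep in deps.get(name, ()):
            visit(dep)
        state[name] = 2
        order.append(name)

    for name in deps:
        visit(name)
    return order
```

A real resolver would also have to match version constraints against the repository, but a topological sort over declared dependencies is the core of it.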
On Sat, Jul 31, 2010 at 12:28 AM, Jeroen De Dauw jeroendedauw@gmail.com wrote:
Hey,
There's already http://www.mediawiki.org/wiki/Extension:MWReleases that does the server part of version checks for core; it could be tweaked to supply version information for extensions, too.
Although that suffices for determining if your version is up to date or not, it does not allow for actual update fetching and all the related stuff such as dependency resolution and simply browsing through available extensions in the repository, as you have with WordPress.
When the system is to phone home for updates to the software, both core and extensions, it makes sense to integrate the LocalisationUpdate functionality and make it a more complete package.
Yes, that makes a lot of sense. I was not aware this functionality existed, so I'm definitely going to have a look at it now.
I would highly recommend against having the update feature in there. We already highly recommend against running as a DB user with certain admin rights, amongst other things; this feature will probably end up breaking more installs than it updates (and yes, I know WordPress has it, and I know how many times I've had to fix their botched updates), and not all installs would have the required modules that it needs (cURL/wget comes to mind on IIS setups, which some people use). Nor should we be assigning the update right or giving messages to the admin group by default, since most people that are admins are non-technical and will just click any bright button that has messages along the lines of "omg update me now" without thinking if it will break something. (Perhaps we should un-deprecate the developer usergroup for this.)
-Peachey
On Fri, Jul 30, 2010 at 10:28 PM, K. Peachey p858snake@yahoo.com.au wrote:
If I'm interpreting this right, you're saying that upgrades can break stuff, so people should stick to versions with known security flaws. This is a defensible position in practice, but it doesn't justify making upgrades unnecessarily hard. It would be a good thing if typical admins could easily upgrade, without needing FTP access and so forth. If they choose not to, that's their choice, but if they want to upgrade, they should be able to do so easily.
On Fri, Jul 30, 2010 at 10:55 PM, K. Peachey p858snake@yahoo.com.au wrote:
You would also need to be vigilant and make sure people don't vandalize the information; for example, if a spam version change got entered, it could break someone's install.
Any kind of auto-update mechanism should be hardcoded to retrieve only from a specific Wikimedia URL and only over HTTPS, and the contents of that URL should only be changeable by sysadmins. Or at least the checksum should be retrieved that way.
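Aryeh's scheme, a checksum published over HTTPS and verified before anything is installed, might look roughly like this (the repository URL layout is invented for illustration):

```python
import hashlib
import urllib.request

# Hypothetical repository base -- for illustration only, not a real
# Wikimedia URL. In Aryeh's scheme this would be hardcoded and HTTPS-only.
REPO_BASE = "https://releases.example.org/"

def verify_payload(payload: bytes, expected_sha256: str) -> bool:
    """True only when the payload hashes to the published digest."""
    return hashlib.sha256(payload).hexdigest() == expected_sha256

def fetch_verified(name: str, expected_sha256: str) -> bytes:
    """Fetch a package over HTTPS from the hardcoded base URL, and
    refuse to return it unless its checksum matches."""
    payload = urllib.request.urlopen(REPO_BASE + name).read()
    if not verify_payload(payload, expected_sha256):
        raise ValueError("checksum mismatch -- refusing to install")
    return payload
```

The point of fetching at least the checksum from a sysadmin-controlled HTTPS endpoint is that a tampered mirror or a vandalized wiki page can then not slip a modified package past the client.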
On 1 August 2010 19:14, Aryeh Gregor Simetrical+wikilist@gmail.com wrote:
If I'm interpreting this right, you're saying that upgrades can break stuff, so people should stick to versions with known security flaws. This is a defensible position in practice, but it doesn't justify making upgrades unnecessarily hard.
I assume this is based on WordPress, where this happening was a bit of a problem for a while. These days it works pretty flawlessly. (3.0.1 is just out, I should probably install it.)
It would be a good thing if typical admins could easily upgrade, without needing FTP access and so forth. If they choose not to, that's their choice, but if they want to upgrade, they should be able to do so easily.
WordPress asks for the account's SFTP password, which doesn't feel like an undue imposition to me.
Basically, such a mechanism will be compared to WordPress every step of the way :-)
- d.
On Mon, Aug 2, 2010 at 4:14 AM, Aryeh Gregor Simetrical+wikilist@gmail.com wrote:
No, I'm saying not to use an automated update mechanism within an extension, which for example has been shown to break things in other web-based packages (WordPress has apparently fixed it since the horrible times when I last attempted it). What about the maintenance scripts people have to run, such as the updater? A lot of people on shared hosting can't run those as it is without re-running the installer, since they aren't allowed SSH access, and ours aren't designed to be run from within the browser window.
Any kind of auto-update mechanism should be hardcoded to retrieve only from a specific Wikimedia URL and only over HTTPS, and the contents of that URL should only be changeable by sysadmins. Or at least the checksum should be retrieved that way.
So every time someone who creates or modifies an extension wants to update its version number, a sysadmin has to be involved? Which is why it was recommended to go wiki-based, but that as well has its flaws.
On Sun, Aug 1, 2010 at 6:27 PM, K. Peachey p858snake@yahoo.com.au wrote:
No, I'm saying not to use an automated update mechanism within an extension, which for example has been shown to break things in other web-based packages (WordPress has apparently fixed it since the horrible times when I last attempted it).
I don't follow you.
What about the maintenance scripts people have to run, such as the updater? A lot of people on shared hosting can't run those as it is without re-running the installer, since they aren't allowed SSH access, and ours aren't designed to be run from within the browser window.
Obviously, that would be changed.
So every time someone who creates or modifies an extension wants to update its version number, a sysadmin has to be involved? Which is why it was recommended to go wiki-based, but that as well has its flaws.
I really don't think it would be a good idea to allow unvetted code to be downloaded and installed automatically. That's too easy for an attacker to abuse. But it's probably a reasonable tradeoff for some people. I don't know, I'm probably not going to be working on this anytime soon, so I don't make the decisions.
On 2 August 2010 00:29, Aryeh Gregor Simetrical+wikilist@gmail.com wrote:
On Sun, Aug 1, 2010 at 6:27 PM, K. Peachey p858snake@yahoo.com.au wrote:
So every time someone who creates or modifies an extension wants to update its version number, a sysadmin has to be involved? Which is why it was recommended to go wiki-based, but that as well has its flaws.
I really don't think it would be a good idea to allow unvetted code to be downloaded and installed automatically. That's too easy for an attacker to abuse. But it's probably a reasonable tradeoff for some people. I don't know, I'm probably not going to be working on this anytime soon, so I don't make the decisions.
That's how WordPress does it - pretty much everyone runs WP with a metric buttload of extensions, so they're in the phoning home and one-click update cycle too. MediaWiki tends to get festooned with extensions as well, so users would probably like this in there.
A quick glance at the WP site docs didn't answer the question of how (or if) they secure this process. Asking would probably be good (whoever's doing the updater work).
- d.
A quick glance at the WP site docs didn't answer the question of how (or if) they secure this process. Asking would probably be good (whoever's doing the updater work).
I've been looking at how WP works here and concluded this is basically not documented at all (from a developer's perspective). On the WP IRC nobody seems to know anything about it, and my looking through the code itself has gotten me few insights into how updates and installation are secured. If anyone knows more about this (especially what's up with the WP deployment repository), please contact me; this would be of great help for my GSoC project.
-- Jeroen De Dauw * http://blog.bn2vs.com * http://wiki.bn2vs.com Don't panic. Don't be evil. 50 72 6F 67 72 61 6D 6D 69 6E 67 20 34 20 6C 69 66 65! --
Hoi, The big problem with upgrading MediaWiki is the upgrading of extensions. It is not documented anywhere whether extensions will work with a specific release of MediaWiki. Being able to install extensions is a great thing when you know upfront that the extensions you are interested in will actually work and not crash your installation. Thanks, GerardM
2010/8/2 Gerard Meijssen gerard.meijssen@gmail.com:
Hoi, The big problem with upgrading MediaWiki is the upgrading of extensions. It is not documented anywhere if extensions will work with a specific release of MediaWiki. Being able to install extensions is a great thing when you know upfront that the extensions you are interested in will actually work and not crash your installation.
We branch and tag extensions along with versions of MediaWiki, so the code found at http://svn.wikimedia.org/svnroot/mediawiki/tags/REL1_15_5/extensions/ is supposed to work with MW 1.15.5. However, this assumes that the trunk version of each extension worked with the trunk version of MW at the time of the branch point, which is not always the case and is the responsibility of the extension maintainer. Versioned releases can also be downloaded through ExtensionDistributor at mediawiki.org, although ED breaks all the time and we've been getting lots of questions in #mediawiki from people whose wiki broke because they installed the trunk version of ParserFunctions (downloaded through ED) with MediaWiki 1.15.
Roan Kattouw (Catrope)
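The tag layout Roan describes maps a MediaWiki release to an extensions URL mechanically; a sketch of that mapping, with the path layout inferred from the single URL in his message:

```python
def extension_tag_url(mw_version: str, extension: str) -> str:
    """Build the SVN URL for the extension snapshot tagged against a
    given MediaWiki release, following the REL1_15_5-style tag naming
    in the URL above. The directory layout is an inference from that
    one example, not a documented guarantee."""
    tag = "REL" + mw_version.replace(".", "_")
    return ("http://svn.wikimedia.org/svnroot/mediawiki/tags/"
            + tag + "/extensions/" + extension + "/")
```

So an updater that knows only its own MediaWiki version could at least locate the matching extension snapshots, with the caveat Roan gives: the tagged code is only as compatible as it was at the branch point.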
Though branches, tags, and the extension distributor exist, in reality none of them come even close to solving the problem.
The issue is, none of these things allows an extension author to properly match a version of an extension to a version of MediaWiki. The approach I take is to always keep my extensions usable with the trunk version, and backwards compatible with as many versions of MediaWiki as possible.
This is a problem we need to solve. This is one of the things that makes upgrading MediaWiki a crap shoot.
V/r,
Ryan Lane
Hoi, Pointing out that the maintainer of an extension is responsible for the proper functioning of an extension tells me who to blame. For a tool intended to upgrade a MediaWiki environment this is hardly relevant; particularly for the extensions that the WMF does not run itself, there is no way of knowing if an upgrade will work well.
From a development point of view, it may be a particular developer who "is to blame", but in reality it is the person maintaining a wiki who is responsible, has to cope with the fallout, and will be blamed.
When a tool is provided that helps with the upkeep of MediaWiki installations, it is irresponsible to ignore this. The scenario that may work is to define "profiles" of MediaWiki software specific to a particular role. Such a profile is tested before it is enabled for upgrade.
Profiles that come to mind: Semantic MediaWiki, Wikipedia, WikiEducator ... Thanks, GerardM
I've been looking at how WP works here and concluded this is basically not documented at all (from a developer's perspective). On the WP IRC nobody seems to know anything about it, and my looking through the code itself has gotten me few insights into how updates and installation are secured. If anyone knows more about this (especially what's up with the WP deployment repository), please contact me; this would be of great help for my GSoC project.
Would it not be enough to hash all extensions on the distributor side, and to check the hash sum on the client side using https for the connection?
Respectfully,
Ryan Lane
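Ryan's suggestion, hashing on the distributor side so clients can check downloads over HTTPS, could be sketched as a sha256sum-style manifest builder (the directory layout and package naming here are assumptions):

```python
import hashlib
from pathlib import Path

def build_manifest(dist_dir: str) -> str:
    """Distributor side: emit one 'sha256  filename' line per package,
    in the style of a sha256sum listing. Clients would fetch this file
    over HTTPS and check each download against it before installing."""
    lines = []
    for path in sorted(Path(dist_dir).glob("*.tar.gz")):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        lines.append(digest + "  " + path.name)
    return "\n".join(lines)
```

The manifest itself then becomes the only artifact that must be served from a trusted HTTPS endpoint; the packages can live anywhere.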
Would it not be enough to hash all extensions on the distributor side, and to check the hash sum on the client side using https for the connection?
I guess this would suffice for ensuring integrity, but what about the other distribution metadata? Where to get it from, how to manipulate it, and how to format it? Since WP and PEAR have systems that now do well at that, it makes a lot more sense to just copy what they do instead of trying to reinvent the wheel and make all the same mistakes they did.
-- Jeroen De Dauw * http://blog.bn2vs.com * http://wiki.bn2vs.com Don't panic. Don't be evil. 50 72 6F 67 72 61 6D 6D 69 6E 67 20 34 20 6C 69 66 65! --
Jeroen De Dauw wrote:
Would it not be enough to hash all extensions on the distributor side, and to check the hash sum on the client side using https for the connection?
I guess this would suffice for ensuring integrity, but what about the other distribution metadata? Where to get it from, how to manipulate it, and how to format it? Since WP and PEAR have systems that now do well at that, it makes a lot more sense to just copy what they do instead of trying to reinvent the wheel and make all the same mistakes they did.
I would make the system based on our repository, so the MediaWiki update subsystem would have a base url like http://svn.wikimedia.org/svnroot/mediawiki/branches/REL1_14/
Then it would fetch a file called http://svn.wikimedia.org/svnroot/mediawiki/branches/REL1_14/VERSIONS containing all the extensions, their versions and an optional url. If a version is newer than the installed one, that extension needs updating.
The presence of a special tag in the file for a url would send it to a new major version: http://svn.wikimedia.org/svnroot/mediawiki/branches/REL1_15/VERSIONS which would be recursed up to the latest MediaWiki branch, so it can offer the really latest one for download (but note that it shouldn't use extensions from a newer branch without updating MediaWiki).
If an extension doesn't provide a url, treat it as extensions/<extension name>, with a fallback to e.g. extensions/archive/extensions/<extension name>.php (so we can safely redirect old extensions to new ones, etc.)
The VERSIONS file can be easily generated by a script reading their $wgExtensionCredits. Releasing a new extension would mean increasing its version and recreating the VERSIONS file, whereas if the version isn't changed, new uploads would still get the improvements (yes, this scheme would also work for trunk).
To get an extension there, third-party extension creators could request subversion access and place their extension there (the best option imho, since then they can be notified when changes in core affect them), get a developer to list their external url, or instruct the users to add their own repository to the repo list.
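The client side of this VERSIONS scheme could be sketched along these lines; the exact file format is an assumption, since the message only says each entry carries an extension name, a version, and an optional url (here: one whitespace-separated line per extension):

```python
def parse_versions(text: str) -> dict:
    """Map extension name -> (version tuple, optional url), from a
    hypothetical VERSIONS file with one whitespace-separated line per
    extension: '<name> <version> [<url>]'."""
    entries = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) < 2:
            continue
        name, version = parts[0], parts[1]
        url = parts[2] if len(parts) > 2 else None
        entries[name] = (tuple(int(p) for p in version.split(".")), url)
    return entries

def needs_update(installed: str, available: tuple) -> bool:
    """An extension needs updating when the repository version is
    newer than the locally installed one."""
    return available > tuple(int(p) for p in installed.split("."))
```

Comparing versions as integer tuples rather than strings avoids the classic "1.10" < "1.9" string-comparison trap.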
On Fri, Jul 30, 2010 at 7:20 AM, Max Semenik maxsem.wiki@gmail.com wrote:
There's already http://www.mediawiki.org/wiki/Extension:MWReleases that does the server part of version checks for core; it could be tweaked to supply version information for extensions, too.
It's being rewritten, FYI.
-Chad
Hey,
Can you provide some more information about that? Rewritten how?
Cheers
-- Jeroen De Dauw * http://blog.bn2vs.com * http://wiki.bn2vs.com Don't panic. Don't be evil. 50 72 6F 67 72 61 6D 6D 69 6E 67 20 34 20 6C 69 66 65! --
/me wrote:
Last time I heard about it, it had huge problems with security and code quality. Did anything change positively in that area over the last several months? If s***c developers believe that all Tim's concerns have been addressed, they should resubmit it for review.
Sorry, as Jeroen noted, only SemanticForms had these problems. My bad.
On Sat, Jul 31, 2010 at 1:52 AM, Max Semenik maxsem.wiki@gmail.com wrote:
Sorry, as Jeroen noted, only SemanticForms had these problems. My bad.
-- Max Semenik ([[User:MaxSem]])
That was the only one checked, wasn't it? Everything would need to be security-checked again before it could be deployed.
On Fri, Jul 30, 2010 at 11:44 PM, Jeroen De Dauw jeroendedauw@gmail.com wrote:
..snip.. I totally agree here with Ryan. The idea is to have the "repository" where the version data is fetched from be configurable, so it's possible to have distributors other than the WMF, and to turn off the feature entirely.
I'm currently looking into the repository and package fetching parts to allow for such "dialling home". MediaWiki.org seems the obvious choice to host the main repository on. There are many ways to then provide the needed data. Personally I think the best approach would be to install Semantic MediaWiki (yes, I used the s-word!) so data from the extension pages can be queried and shown in a distribution metadata format. That might require a small extension for some new special pages, and some scripts to collect other existing version data and put it into the wiki.
Is it possible to get SMW onto MW.org? This would also finally be a proof of concept of SMW on a WMF wiki, on which a lot of people have been waiting a long time now.
With only a little over 3 weeks left in GSoC, I have little doubt this project will not be finished, so any help in any form is definitely welcome.
[0] https://secure.wikimedia.org/wikipedia/mediawiki/wiki/Deployment
I don't think on-wiki would be the best way to do this, especially for the extensions within our SVN, because you would have to list the revision each one needs against the version number and the version of the extension. And for the ones we don't have in our SVN, you would need to store their download format (http/git/svn etc.) and location as well.
You would also need to be vigilant and make sure people don't vandalize the information. For example, if a spam version change got entered, it could break someone's install.
-Peachey
Hello Tim,
I'd like to contribute a somewhat different (although I suppose common) perspective to this discussion. I help run a free-for-the-community shared webhosting service, and one of the services we have is "automatic installation" of common web applications for people who don't know very much about setting up or deploying applications. Wordpress and MediaWiki are among our most popular installations.
Since it's not reasonable to assume someone who can click a button to setup an application has the know-how to upgrade it manually, any installation that we autoinstall also comes with an upgrade promise: when new versions of the application come out, we reserve the right to automatically upgrade the application for you. (Since we allow users to patch their installs, there are some, ah, technical difficulties associated with this.)
We've noticed several things:
- When Wordpress 3.0 came out, we received several support tickets asking us when we would be pushing an upgrade, and asking us if anything bad would happen if they went ahead and upgraded their install themselves. We have /never/ had this happen for MediaWiki.
- Our spread of versions is quite interesting:
wordpress  649 installs
  2.0.2   *    5 +
  2.0.4        7 +
  2.0.11       4 +
  2.1.3        1 +
  2.3          2 +
  2.3.2   *    1 +
  2.3.3   *   29 ++
  2.5.1   *   17 ++
  2.6          1 +
  2.6.2        2 +
  2.6.3        2 +
  2.7          2 +
  2.7.1   *   15 +
  2.8     *    8 +
  2.8.1        1 +
  2.8.2   *    2 +
  2.8.4   |    6 +
  2.8.5   |    3 +
  2.9     |    2 +
  2.9.1   |    4 +
  2.9.2   |   74 +++++
  3.0     |  461 ++++++++++++++++++++++++++++++
mediawiki  1017 installs
  1.5.8   *  118 ++++++
  1.11.0  *  125 ++++++
  1.14.0  *    6 +
  1.15.0  *    6 +
  1.15.1  |   65 +++
  1.15.2  |   15 +
  1.15.3  |   18 +
  1.15.4  |  664 ++++++++++++++++++++++++++++++
Applications that are on older versions we attempted to upgrade, but had to bail out because there were nontrivial merge conflicts (that is, the user had edited some core files and the upgrade would have obliterated those changes)--there are some exceptions but that is the primary mode by which upgrades failed.
The star (*) means that we offered installation of that version. Our upgrade process was spotty until about a year and a half ago, when we started really making sure we tracked upstream versions closely.
There are certainly some conclusions to be made here, including "When people patch MediaWiki, they patch it in a way that's really hard to upgrade" and "People don't upgrade MediaWiki by themselves" (note that Wordpress has a spread of versions all over the place, whereas every MediaWiki was from a version we supported).
Let me know if you have any questions; I'd be happy to run other queries on our setup.
Cheers, Edward
Edward Z. Yang wrote:
We've noticed several things:
- When Wordpress 3.0 came out, we received several support tickets asking us when we would be pushing an upgrade, and asked us if anything bad would happen if they went ahead and upgraded their install themselves. We have /never/ had this happen for MediaWiki.
I'm not sure that's comparable. If WordPress complains about being an old version, unsavvy users will want it to be upgraded for them. Whereas if they watched the relevant mailing list, they probably have the required skills to update manually. (Since they chose a 'managed' MediaWiki, it's not as though they would be required to do it anyway.)
mediawiki  1017 installs
  1.5.8   *  118 ++++++
  1.11.0  *  125 ++++++
  1.14.0  *    6 +
  1.15.0  *    6 +
  1.15.1  |   65 +++
  1.15.2  |   15 +
  1.15.3  |   18 +
  1.15.4  |  664 ++++++++++++++++++++++++++++++
Applications that are on older versions we attempted to upgrade, but had to bail out because there were nontrivial merge conflicts (that is, the user had edited some core files and the upgrade would have obliterated those changes)--there are some exceptions but that is the primary mode by which upgrades failed. The Star means that we offered installation of that version. Our upgrade process was spotty until about a year and a half ago, when we started really making sure we tracked upstream versions closely.
Does that mean that they chose that version despite being outdated? I wonder if all those 1.5.8 installs are due to thinking 1.5 is greater than 1.15
There are certainly some conclusions to be made here, including "When people patch MediaWiki, they patch it in a way that's really hard to upgrade" and "People don't upgrade MediaWiki by themselves" (note that Wordpress has a spread of versions all over the place, whereas every MediaWiki was from a version we supported."
There's probably some interesting knowledge to be gained from looking at how they patched it, but I don't know how to easily extract it.
Excerpts from Platonides's message of Sun Aug 01 17:39:19 -0400 2010:
I'm not sure that's comparable. If WordPress complains for being an old version, unsavy users will want it to be upgraded for them. Whereas if they watched the relevant mailing list they probably have the required skills to manually update. (Since they chose a 'managed' mediawiki, it's not that they would be required to do it anyway)
Right, so it's an interesting combination of factors.
* Does the application tell the user that their version is out of date?
* Does the application let an unsavvy user upgrade the application with a single click?
Our experience suggests that if the answers to these questions are yes, unsavvy users will definitely exercise the feature.
Does that mean that they chose that version despite being outdated? I wonder if all those 1.5.8 installs are due to thinking 1.5 is greater than 1.15
No, that just means they installed 1.5.8 back when it was still the default, and then we didn't manage to upgrade them.
There's probably some interesting knowledge on looking how they patched it, but I don't know how to easily extract it.
A good starting point would probably be "most edited files".
Cheers, Edward
Edward Z. Yang wrote:
There's probably some interesting knowledge on looking how they patched it, but I don't know how to easily extract it.
A good starting point would probably be "most edited files".
Cheers, Edward
I'm open to any data :) My guess is that the most edited files are the skins, then perhaps message files (instead of using the MediaWiki namespace), and maybe the Linker/Parser. It's likely our fault, too. We should review the advice at www.mediawiki.org and archive the outdated pages which require patching.
On Fri, Jul 30, 2010 at 06:35, Tim Starling tstarling@wikimedia.org wrote:
However, the statistics presented by Qualys show that an alarming number of people are running versions of MediaWiki older than 1.14.1, which was the most recent fix for an XSS vulnerability exploitable without special privileges. There is certainly room for us to do better.
I haven't read all the documents, but have these researchers taken into account backported fixes?
My gut feeling is that the "preference" for 1.12 is simply due to its inclusion in Debian stable [1]. The maintainer seems to be actively backporting security fixes [2], so while I agree that these versions may enjoy less community support, they should not be considered broken on the basis of the version number alone.
This, of course, unless it is certain that some vulnerabilities are still present in the Debian version. If you are aware of the existence of such a problem, I would recommend you contact security@debian.org. Otherwise, the situation might not be as dangerous as it seems.
On the topic of facilitating upgrades: perhaps we should emphasize the option to install and upgrade using SVN, which is probably very convenient for users that are comfortable with the command line. Moodle has this in the official documentation and I find it very useful [3]. SVN could also be handy as the backend for a user-friendly upgrade procedure, as it already deals with local modifications and such.
[1] http://packages.debian.org/search?keywords=mediawiki [2] http://packages.debian.org/changelogs/pool/main/m/mediawiki/mediawiki_1.12.0... [3] http://docs.moodle.org/en/Upgrading#Using_CVS
On 02.08.2010, 18:01 Jacopo wrote:
My gut feeling is that the "preference" for 1.12 is simply due to its inclusion in Debian stable [1]. The maintainer seems to be actively backporting security fixes [2], so while I agree that these versions may enjoy less community support, they should not be considered broken on the basis of the version number alone.
This, of course, unless it is certain that some vulnerabilities are still present in the Debian version. If you are aware of the existence of such a problem, I would recommend you contact security@debian.org. Otherwise, the situation might not be as dangerous as it seems.
They haven't backported the security fixes from 1.15.4 and 1.15.5 yet, which are several months old (OMG disclosure!). And who knows what other problems (including security flaws) may still be there, as "stable" versions usually get much less attention and testing.
I haven't read all the documents, but have these researchers taken into account backported fixes?
My gut feeling is that the "preference" for 1.12 is simply due to its inclusion in Debian stable [1]. The maintainer seems to be actively backporting security fixes [2], so while I agree that these versions may enjoy less community support, they should not be considered broken on the basis of the version number alone.
This, of course, unless it is certain that some vulnerabilities are still present in the Debian version. If you are aware of the existence of such a problem, I would recommend you contact security@debian.org. Otherwise, the situation might not be as dangerous as it seems.
On the topic of facilitating upgrades: perhaps we should emphasize the option to install and upgrade using SVN, which is probably very convenient for users that are comfortable with the command line. Moodle has this in the official documentation and I find it very useful [3]. SVN could also be handy as the backend for a user-friendly upgrade procedure, as it already deals with local modifications and such.
As someone who has had their code patched by the debian team, I'd like to take the time to bitch about this.
Firstly, their patches are often incorrect. Secondly, though they've patched my LDAP extension a number of times, I have *never* received a bug report or a patch from them for something they've fixed. It is extremely annoying to see that a fix has been around for months before someone reports a problem to me. Beyond anything else this bothers me the most; they really need to be better community members in this regard.
Lastly, packaging and maintaining such an old version of MediaWiki does a disservice to us and to their users. We don't support versions of MediaWiki that old. I understand that Debian backports security fixes for MediaWiki, but they don't backport new features, and don't backport all bug fixes. Additionally, Debian doesn't backport security fixes for all extensions. Not all extension developers bother maintaining backwards compatibility, and sometimes the only possible way to get security fixes is to upgrade both MediaWiki and the extension.
Please Debian, keep your version of MediaWiki up to date at least to the oldest stable release, and please send your fixes upstream when you find unfixed bugs.
Respectfully,
Ryan Lane
On Mon, Aug 2, 2010 at 10:16 AM, Lane, Ryan Ryan.Lane@ocean.navo.navy.mil wrote:
Please Debian, keep your version of MediaWiki up to date at least to the oldest stable release, and please send your fixes upstream when you find unfixed bugs.
I am not a Debian developer, and I agree that sending fixes upstream is good. But surely you're aware that the whole point of "Debian stable" is that it does ***not*** change to newer versions of programs after release, apart from security fixes? Debian is well known for taking the word "stable" seriously (e.g. [1]) and it's a reason people choose them.
- Carl
[1]: http://www.debian.org/doc/manuals/debian-faq/ch-getting.en.html#s-updatestab...
Excerpts from Carl (CBM)'s message of Mon Aug 02 19:06:42 -0400 2010:
I am not a Debian developer, and I agree that sending fixes upstream is good. But surely you're aware that the whole point of "Debian stable" is that it does ***not*** change to newer versions of programs after release, apart from security fixes? Debian is well known for taking the word "stable" seriously (e.g. [1]) and it's a reason people choose them.
Ryan's complaint is:
1. Distributors frequently get backported patches wrong (usually due to a lack of expertise or manpower), and
2. Distributors roll patches without telling upstream developers who would happily accept them into the mainline.
However, upstream developers are often guilty of ignoring a distribution's needs, so it goes both ways.
Cheers, Edward
On 3 August 2010 00:17, Edward Z. Yang ezyang@mit.edu wrote:
2. Distributors roll patches without telling upstream developers who would happily accept them into the mainline.
Has anyone reported the following as Debian bugs?
* Package maintainer not sending patches back upstream
* Package maintainer not visible and active in MediaWiki development
* Package maintainer not visible and active in MediaWiki community support, leaving supporting his packages to the upstream
- d.
David Gerard wrote:
On 3 August 2010 00:17, Edward Z. Yang ezyang@mit.edu wrote:
- Distributors roll patches without telling upstream developers who would happily accept them into the mainline.
Has anyone reported the following as Debian bugs?
- Package maintainer not sending patches back upstream
- Package maintainer not visible and active in MediaWiki development
- Package maintainer not visible and active in MediaWiki community
support, leaving supporting his packages to the upstream
- d.
In fact, one of their patches was sent upstream a couple of months ago and we didn't react to it. https://bugzilla.wikimedia.org/show_bug.cgi?id=24132
It's a documentation patch, fine as it is, although I'd prefer to fix that by moving dumpBackup to the new Maintenance style.
Other patches are not so fine...

  wfSuppressWarnings();
- session_start();
+ @session_start();
  wfRestoreWarnings();
Sure, it was for the FusionForge package, but still, what error would session_start produce that is not trapped? (I can only make an E_NOTICE or E_WARNING...)
On Fri, Aug 6, 2010 at 1:08 AM, Platonides Platonides@gmail.com wrote:
Other patches are not so fine...

  wfSuppressWarnings();
- session_start();
+ @session_start();
  wfRestoreWarnings();
Sure, it was for FusionForge package, but still what error would session_start produce that is not trapped? (I can only make an E_NOTICE or E_WARNING...)
It's enough to break a later header() call with "Can't send header information - headers already sent by foo".
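A minimal sketch of that failure mode, assuming display_errors is on (the redirect target is illustrative):

```php
<?php
// If session_start() emits a notice (here provoked by calling it twice),
// PHP prints the notice immediately, so output has started...
session_start();
session_start();              // E_NOTICE: a session had already been started
header( 'Location: /wiki/' ); // ...and this now fails: headers already sent
```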
Marco
Marco Schuster wrote:
On Fri, Aug 6, 2010 at 1:08 AM, Platonides Platonides@gmail.com wrote:
Other patches are not so fine...

  wfSuppressWarnings();
- session_start();
+ @session_start();
  wfRestoreWarnings();
Sure, it was for FusionForge package, but still what error would session_start produce that is not trapped? (I can only make an E_NOTICE or E_WARNING...)
It's enough to break later header(); with "Can't send header information - headers already sent by in foo".
Marco
The point is, both of them are suppressed by wfSuppressWarnings... So what can the @ suppress that wfSuppressWarnings may have missed?
Unless the decision to add an @ there predates our own warning suppression.
On Fri, Aug 6, 2010 at 5:16 PM, Platonides Platonides@gmail.com wrote:
The point is, both of them are suppressed by wfSuppressWarnings... So what can the @ be suppressed that wfSuppressWarnings may have missed?
Unless the decision to add an @ there predates out own warning supppresion.
@ must not be used, ever. Any uses you see are wrong and should be refactored. The vast majority of offenders are lazy array index accessing:
BAD PROGRAMMER: $foo = @$bar['key'];
GOOD PROGRAMMER: $foo = isset( $bar['key'] ) ? $bar['key'] : null;
It's a few extra keystrokes, but you're saved from using the @ to suppress *all* errors that could be occurring there--not just the possible index-does-not-exist. Better example:
$foo = @$this->obj->bar['key'];
If you use @, you're also suppressing that $obj might not be an object, or that $bar doesn't have a key 'key'. Another common pitfall might be:
@$obj->method();
What happens if method() changes and nobody checked the callers? You're now possibly suppressing errors you never meant to suppress.
What sort of errors are ok to suppress, and how do I do it? The most common (and annoying) errors that need suppression are file operation things (fopen, file_get_contents, etc), usually due to invalid permissions. If you want to hide errors here, it's ok to do something like:
wfSuppressWarnings();
$t = file_get_contents( 'somefile' );
wfRestoreWarnings();

This will properly suppress and restore the warnings for you. Handy trick: if you're making directories, wfMkdirParents() handles all of this for you, so you can just check the boolean return without having to worry about possible errors.
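For context, a stack-based save/restore of the error_reporting level is one way such a suppress/restore pair can work; this is an illustrative sketch only, not MediaWiki's actual implementation (the real wfSuppressWarnings()/wfRestoreWarnings() live in GlobalFunctions.php and differ in detail):

```php
<?php
// Illustrative suppression helpers in the spirit of MediaWiki's
// wfSuppressWarnings()/wfRestoreWarnings(); names here are hypothetical.
$suppressStack = array();

function suppressWarnings() {
    global $suppressStack;
    // error_reporting() returns the previous level; push it, then
    // mask out notices and warnings for the protected region.
    $suppressStack[] = error_reporting(
        error_reporting() & ~( E_NOTICE | E_WARNING | E_USER_NOTICE | E_USER_WARNING )
    );
}

function restoreWarnings() {
    global $suppressStack;
    if ( $suppressStack ) {
        error_reporting( array_pop( $suppressStack ) );
    }
}
```

Unlike @, this is scoped to an explicit region and nestable, which is why the extra @ inside such a block is redundant.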
-Chad
I am not a Debian developer, and I agree that sending fixes upstream is good. But surely you're aware that the whole point of "Debian stable" is that it does ***not*** change to newer versions of programs after release, apart from security fixes? Debian is well known for taking the word "stable" seriously (e.g. [1]) and it's a reason people choose them.
Are they also backporting security fixes for all extensions as well? If not, then they are doing a serious disservice to their users. Some extensions have had some *really* serious vulnerabilities. We generally mark these as such when we find them, but the warnings go away when the vulnerabilities are fixed. Unfortunately for those using old versions of MediaWiki, they may never know the extension was vulnerable for the version they are downloading. Maybe we should be more vigilant about how we mark things, but it is difficult to manage this for all extensions, especially since they aren't all code reviewed.
If Debian doesn't feel they should keep supported versions in their repos, maybe they shouldn't distribute MediaWiki.
Respectfully,
Ryan Lane
On Mon, Aug 2, 2010 at 7:35 PM, Ryan Lane rlane32@gmail.com wrote:
Are they also backporting security fixes for all extensions as well?
I would assume that Debian, ideally, applies security patches for extensions they distribute themselves. Programs a user has installed outside the Debian system are always going to be the responsibility of the user.
Of course, if Debian upgraded their "stable" version of Mediawiki to the newest version, and my production server was running a custom extension that only worked with the previous version of Mediawiki that Debian called "stable", I'd be pissed.
If Debian doesn't feel they should keep supported versions in their repos, maybe they shouldn't distribute MediaWiki.
That is, seriously, an absurd attitude for a Mediawiki Developer to have. It reflects a fundamental misunderstanding of the meaning of Debian's "stable version" system.
Note that Debian stood up to Mozilla corp. when Mozilla attempted to stop Debian uploading security patches to stable versions [1]. Surely Mediawiki would have much less persuasive power telling them to stop.
I'm exiting this discussion at this point. I've made the point I wanted to make, compelling or not, and I'm afraid I'm drifting off topic.
- Carl
[1]: http://en.wikipedia.org/wiki/Mozilla_Corporation_software_rebranded_by_the_D...
If Debian doesn't feel they should keep supported versions in their repos, maybe they shouldn't distribute MediaWiki.
That is, seriously, an absurd attitude for a Mediawiki Developer to have. It reflects a fundamental misunderstanding of the meaning of Debian's "stable version" system.
Note that Debian stood up to Mozilla corp. when Mozilla attempted to stop Debian uploading security patches to stable versions [1]. Surely Mediawiki would have much less persuasive power telling them to stop.
I'm exiting of this discussion at this point. I've made the point I wanted to make, compelling or not, and I'm afraid I'm drifting off topic.
I'm not saying we should tell them to stop. They can distribute whatever they want. I'm simply saying their "stable" version is likely doing more harm than good, and that just because they can distribute something in their somewhat insane way, doesn't mean they should.
Respectfully,
Ryan Lane
On Mon, Aug 2, 2010 at 7:06 PM, Carl (CBM) cbm.wikipedia@gmail.com wrote:
I am not a Debian developer, and I agree that sending fixes upstream is good. But surely you're aware that the whole point of "Debian stable" is that it does ***not*** change to newer versions of programs after release, apart from security fixes?
Which means it doesn't get all security fixes either, because nobody announces vulnerabilities or publishes patches for unsupported MediaWiki versions. If a bug occurred only in an old version, it won't be announced. Distributions that try to pretend they can support software for years past the time the vendor stopped supporting it are probably crazy, but then, they're no more crazy than the users who ask for that behavior, and I don't think we're likely to change them.
From #wikimedia-tech a couple years ago:
080511 15:35:42 <Simetrical> mark, why Ubuntu?
080511 15:37:03 <mark> becuase that's what we use for all new servers? :)
080511 15:39:18 <Simetrical> mark, well, yes. What made you decide on Ubuntu?
080511 15:39:28 <mark> it's debian but with predictable release cycles
As for not upstreaming patches, probably the best bet there is for us to give up and just watch the major distro bug trackers ourselves, because I doubt we're going to get the distributors ever reporting anything to us consistently.
On Mon, Aug 2, 2010 at 7:17 PM, Edward Z. Yang ezyang@mit.edu wrote:
However, upstream developers are often guilty of ignoring a distribution's needs, so it goes both ways.
I spoke with the Fedora maintainer of MediaWiki some time ago pretty extensively about his hacks to MediaWiki, particularly the way he moved all files around without understanding what he was doing and completely broke the software. (Reportedly to the point that styles and scripts didn't work because he moved them out of the web root. Really. The Fedora wiki didn't use the Fedora MediaWiki package because it was so broken.) I suggested in some detail a better way to fix things, and offered to review any patches he wanted to submit upstream. He never submitted any. Oh well.
I'm thankful that the Debian MediaWiki package at least *works*. Not that the same can be said of all their packages either (OpenSSL, anyone?). Maybe if we provided .debs and RPMs, people would be less prone to use the distro packages.
On 3 August 2010 16:14, Aryeh Gregor Simetrical+wikilist@gmail.com wrote:
I'm thankful that the Debian MediaWiki package at least *works*. Not that the same can be said of all their packages either (OpenSSL, anyone?). Maybe if we provided .debs and RPMs, people would be less prone to use the distro packages.
_<
IT'S A TARBALL!
A TARBALL OF SOURCE CODE!
THAT YOU INTERPRET!
_<
- d.
On 3 August 2010 18:14, Aryeh Gregor Simetrical+wikilist@gmail.com wrote:
I'm thankful that the Debian MediaWiki package at least *works*. Not that the same can be said of all their packages either (OpenSSL, anyone?). Maybe if we provided .debs and RPMs, people would be less prone to use the distro packages.
That just creates more problems:
* bad quality distro packages
* bad quality our own packages (while we know MediaWiki, we are not experts in packaging)
* lots of confusion
I don't see any other way out but to reach out to the packagers and get their packages fixed. What we can do is to communicate this to our users and try to communicate more with the packagers. We already do the first in our IRC channel (telling users we can't support distro packages, and that they should just download the tarball), but there are lots of places where we don't do that yet.
In short: education and communication, not trying to do their job.
-Niklas
On Tue, Aug 3, 2010 at 12:45 PM, Niklas Laxström niklas.laxstrom@gmail.com wrote:
I don't see any other way out but to reach to the packagers and get their packages fixed. What we can do is to communicate this to our users and try to communicate more with the packagers.
I tried that with Fedora. You can read about it here:
https://bugzilla.redhat.com/show_bug.cgi?id=484855 https://fedorahosted.org/fesco/ticket/225
Result: nothing.
On 3 August 2010 18:14, Aryeh Gregor Simetrical+wikilist@gmail.com wrote:
I'm thankful that the Debian MediaWiki package at least
*works*. Not
that the same can be said of all their packages either (OpenSSL, anyone?). Maybe if we provided .debs and RPMs, people would be less prone to use the distro packages.
That just creates more problems:
- bad quality distro packages
- bad quality our own packages (while we know MediaWiki, we are not
experts in packaging)
- lots of confusion
I've packaged hundreds of RPMs. It isn't difficult, and you don't need to be an expert. It is easy enough to package the MediaWiki software. The real problem comes with upgrades. How does the package handle this? Do we ignore the maintenance/update.php portion? Do we run it? How do we handle extensions? Package them too? Do we make a repo for all of this? How are the extensions handled on upgrade?
Having MediaWiki in a package really doesn't make much sense, unless we put a lot of effort into making it work this way.
I don't see any other way out but to reach to the packagers and get their packages fixed. What we can do is to communicate this to our users and try to communicate more with the packagers. We already do the first in our IRC channel (telling users we can't support distro packages, and that they should just download the tarball), but there are lots of place where we don't do that yet.
In short: education and communication, not trying to do their job.
I think we should be doing education, but not for the package maintainers. We should try harder to inform our users that they shouldn't use distro-maintained packages, and we should explain why.
Respectfully,
Ryan Lane
Lane, Ryan wrote:
On 3 August 2010 18:14, Aryeh Gregor Simetrical+wikilist@gmail.com wrote:
I'm thankful that the Debian MediaWiki package at least
*works*. Not
that the same can be said of all their packages either (OpenSSL, anyone?). Maybe if we provided .debs and RPMs, people would be less prone to use the distro packages.
That just creates more problems:
- bad quality distro packages
- bad quality our own packages (while we know MediaWiki, we are not
experts in packaging)
- lots of confusion
I've packaged hundreds of RPMs. It isn't difficult, and you don't need to be an expert. It is easy enough to package the MediaWiki software. The real problem comes with upgrades. How does the package handle this? Do we ignore the actual maintanence/update.php portion? Do we run it? How do we handle extensions? Package them too? Do we make a repo for all of this? How are the extensions handled on upgrade?
I had some plans for adding the needed hooks to the installer so that it could be run by a package manager in a more FHS way, store LocalSettings inside /etc, automatically run upgrade.php, etc.
Then, since the new installer changed how LocalSettings is provided to the user, I didn't think further about how to adapt it.
On Tue, Aug 3, 2010 at 1:38 PM, Lane, Ryan Ryan.Lane@ocean.navo.navy.mil wrote:
I think we should be doing education, but not for the package maintainers. We should try harder to inform our users that they shouldn't used distro maintained packages, and we should explain why.
I'm not sure I buy this. Why is MediaWiki so special that it can't exist inside of a package? Is MediaWiki such a special piece of software that it's impossible to build a good package?
I think user education is going to be even more futile than package maintainer education. The allure of running a system like Debian or Fedora is the ability to have pre-vetted software running in a configuration designed to work as part of a system. I'm not here to start a debate about whether they are successful in achieving that, but it's clearly a popular enough notion that an education effort to counter that probably won't have much of an impact with anyone beyond the Slackware community.
+1 for package maintainer education (as frustrating and unproductive as it might be thus far)
Rob
On Wed, Aug 4, 2010 at 11:03 AM, Rob Lanphier robla@robla.net wrote:
On Tue, Aug 3, 2010 at 1:38 PM, Lane, Ryan Ryan.Lane@ocean.navo.navy.mil wrote:
I think we should be doing education, but not for the package maintainers. We should try harder to inform our users that they shouldn't used distro maintained packages, and we should explain why.
I'm not sure I buy this. Why is MediaWiki so special that it can't exist inside of a package? Is MediaWiki such a special piece of software that it's impossible to build a good package?
It can; we just want a working package. And until the providers provide this, there will be recommendations against using the packages. Someone who, for example, installs a package that has broken skins right from the get-go is going to get a bad impression of MediaWiki; it won't be until they go digging that they find out they need to manually set an alias for their webserver, and it won't be until someone mentions otherwise that they learn it's actually the package's fault it's broken.
I'm not sure I buy this. Why is MediaWiki so special that it can't exist inside of a package? Is MediaWiki such a special piece of software that it's impossible to build a good package?
It's "special". It isn't necessarily the fault of the distro or the package maintainer for the quality of the packages. It is our fault. Upgrading is unreliable for a number of reasons. It is definitely unreliable enough that I wouldn't trust a package to do it for me, and I can't reasonably recommend it for anyone else either.
I think user education is going to be even more futile than package maintainer education. The allure of running a system like Debian or Fedora is the ability to have pre-vetted software running in a configuration designed to work as part of a system. I'm not here to start a debate about whether they are successful in achieving that, but it's clearly a popular enough notion that an education effort to counter that probably won't have much of an impact with anyone beyond the Slackware community.
+1 for package maintainer education (as frustrating and unproductive as it might be thusfar)
I think it would be better if we provided the packages. If we fix our upgrade issues, I'll be more than happy to write rpms and debs.
Respectfully,
Ryan Lane
On Wed, Aug 4, 2010 at 5:48 AM, Lane, Ryan Ryan.Lane@ocean.navo.navy.mil wrote:
Is MediaWiki such a special piece of software that it's impossible to build a good package?
It's "special". It isn't necessarily the fault of the distro or the package maintainer for the quality of the packages. It is our fault. Upgrading is unreliable for a number of reasons. It is definitely unreliable enough that I wouldn't trust a package to do it for me, and I can't reasonably recommend it for anyone else either.
Fair enough. Is it safe to assume that the new installer work should make proper packaging more viable?
I think it would be better if we provided the packages. If we fix our upgrade issues, I'll be more than happy to write rpms and debs.
I think this would be fantastic, so defining what bar we need to clear seems like a worthwhile exercise. Even if distro makers still go off and create their own derivatives, having a good reference implementation would be wonderful.
I think you'd be hard pressed to find a project that doesn't complain about what packagers do (assuming they don't do their own packaging, and even then...)
Rob
On Tue, Aug 3, 2010 at 9:03 PM, Rob Lanphier robla@robla.net wrote:
+1 for package maintainer education (as frustrating and unproductive as it might be thus far)
I think "education" isn't a good term for what needs to happen here. More like "doing the work for them". Package maintainers might maintain lots of packages, and certainly don't know much about any of them. Some MW developer needs to look at the popular distros, read up on their packaging standards, and make a MediaWiki package that a) meets the standards, but also b) actually works and is supported upstream. Keep any packaging tools in our own SVN where that makes sense, so the distributor can ship software with absolutely no changes if they like. And give them some contacts they can forward any patches to, so that hopefully they don't feel the need to accept patches that haven't been reviewed upstream.
As I remarked on IRC, having packages as an official installation mechanism has nice benefits for people who don't get their code from distros, too. We could set up our own official repository. This would handle updates automatically, but it would do more than that too. Our current installer is crippled in all sorts of ways because it has to run as the web user. An installer that runs as root could do all sorts of handy things, particularly where permissions are an issue:
* Enable uploads by default
* Hide deleted images properly
* Enable $wgCacheDirectory by default
* Enable math by default
* Enable clamav by default (maybe :) )
* Enable DjVu and SVG support by default
* Enable ImageMagick by default
* Set up a cron job to run jobs by default instead of hacky running on page view
We'd likely want to provide packages for all the extensions in SVN too, somehow. This is complicated by the fact that almost none of the extensions are actually released independently. Maybe that should change somehow.
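Several of the root-only defaults listed above could be staged from a package's post-install script. The sketch below is only illustrative: the paths, the `www-data` user, and the cron schedule are assumptions, not an existing MediaWiki package.

```shell
#!/bin/sh
# Sketch of root-only setup steps a postinst could perform.
# PREFIX defaults to a scratch directory; a real package would use "/".
set -e
PREFIX="${PREFIX:-$(mktemp -d)}"

# Enable $wgCacheDirectory by default: give the wiki a writable cache dir.
install -d -m 0750 "$PREFIX/var/cache/mediawiki"

# Hide deleted images properly: keep the deletion archive out of the web root.
install -d -m 0750 "$PREFIX/var/lib/mediawiki/deleted"

# Run the job queue from cron instead of hackily on page view.
install -d "$PREFIX/etc/cron.d"
cat > "$PREFIX/etc/cron.d/mediawiki" <<'EOF'
*/5 * * * * www-data php /usr/share/mediawiki/maintenance/runJobs.php --maxjobs 50
EOF

echo "Staged defaults under $PREFIX"
```

None of this is possible from the current web installer, precisely because it runs as the web user rather than root.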
On Wed, Aug 4, 2010 at 8:48 AM, Lane, Ryan Ryan.Lane@ocean.navo.navy.mil wrote:
It's "special". It isn't necessarily the fault of the distro or the package maintainer for the quality of the packages. It is our fault. Upgrading is unreliable for a number of reasons. It is definitely unreliable enough that I wouldn't trust a package to do it for me, and I can't reasonably recommend it for anyone else either.
Upgrading is perfectly reliable in my experience, as long as all your extensions are reliable, and you upgrade them too. If people do file edits, or they install weird extensions, then of course upgrades might break stuff. But if you're using only well-supported extensions, there should be no major problems in most cases. If there are, well, that's what distributions have testing for!
On Wed, Aug 4, 2010 at 7:04 PM, Aryeh Gregor Simetrical+wikilist@gmail.com wrote:
We'd likely want to provide packages for all the extensions in SVN too, somehow. This is complicated by the fact that almost none of the extensions are actually released independently. Maybe that should change somehow.
Combine this with a WM-operated deb/rpm repo and make mediawiki-foobar packages for all the extensions. Using version requirements in the packages you can even ensure that only compatible stuff gets installed.
Marco
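The version pinning Marco describes could look like the stanza below, written here from a shell sketch so nothing needs to exist on disk beforehand. The package name `mediawiki-foobar` comes from his own example; the version range and install path are hypothetical.

```shell
#!/bin/sh
# Sketch: write a debian/control stanza for a hypothetical extension
# package pinned to a compatible range of MediaWiki core versions.
set -e
OUT="${OUT:-$(mktemp -d)/control}"
cat > "$OUT" <<'EOF'
Package: mediawiki-foobar
Architecture: all
Depends: mediawiki (>= 1.16), mediawiki (<< 1.18)
Description: FooBar extension for MediaWiki
 The dependency range keeps apt from installing this extension
 alongside an incompatible core version.
EOF
echo "Wrote $OUT"
```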
Hoi, In addition to all that, it makes sense to have LocalisationUpdate installed and configured. It ensures that people who opt for a language other than English have the latest available localisations for the messages on their wiki. Thanks, GerardM
On 4 August 2010 19:04, Aryeh Gregor Simetrical+wikilist@gmail.com wrote:
On Tue, Aug 3, 2010 at 9:03 PM, Rob Lanphier robla@robla.net wrote:
+1 for package maintainer education (as frustrating and unproductive as it might be thus far)
I think "education" isn't a good term for what needs to happen here. More like "doing the work for them". Package maintainers might maintain lots of packages, and certainly don't know much about any of them. Some MW developer needs to look at the popular distros, read up on their packaging standards, and make a MediaWiki package that a) meets the standards, but also b) actually works and is supported upstream. Keep any packaging tools in our own SVN where that makes sense, so the distributor can ship software with absolutely no changes if they like. And give them some contacts they can forward any patches to, so that hopefully they don't feel the need to accept patches that haven't been reviewed upstream.
As I remarked on IRC, having packages as an official installation mechanism has nice benefits for people who don't get their code from distros, too. We could set up our own official repository. This would handle updates automatically, but it would do more than that too. Our current installer is crippled in all sorts of ways because it has to run as the web user. An installer that runs as root could do all sorts of handy things, particularly where permissions are an issue:
- Enable uploads by default
- Hide deleted images properly
- Enable $wgCacheDirectory by default
- Enable math by default
- Enable clamav by default (maybe :) )
- Enable Djvu and SVG support by default
- Enable ImageMagick by default
- Set up cron job to run jobs by default instead of hacky running on page view
We'd likely want to provide packages for all the extensions in SVN too, somehow. This is complicated by the fact that almost none of the extensions are actually released independently. Maybe that should change somehow.
On Wed, Aug 4, 2010 at 8:48 AM, Lane, Ryan Ryan.Lane@ocean.navo.navy.mil wrote:
It's "special". It isn't necessarily the fault of the distro or the package maintainer for the quality of the packages. It is our fault. Upgrading is unreliable for a number of reasons. It is definitely unreliable enough that I wouldn't trust a package to do it for me, and I can't reasonably recommend it for anyone else either.
Upgrading is perfectly reliable in my experience, as long as all your extensions are reliable, and you upgrade them too. If people do file edits, or they install weird extensions, then of course upgrades might break stuff. But if you're using only well-supported extensions, there should be no major problems in most cases. If there are, well, that's what distributions have testing for!
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
The 2005 releases were the best; I wouldn't use a newer one unless I had a good reason to do so.
On Mon, Aug 2, 2010 at 4:16 PM, Lane, Ryan Ryan.Lane@ocean.navo.navy.mil wrote:
I haven't read all the documents, but have these researchers taken into account backported fixes?
My gut feeling is that the "preference" for 1.12 is simply due to its inclusion in Debian stable [1]. The maintainer seems to be actively backporting security fixes [2], so while I agree that these versions may enjoy less community support, they should not be considered broken on the basis of the version number alone.
This, of course, unless it is certain that some vulnerabilities are still present in the Debian version. If you are aware of the existence of such a problem, I would recommend you contact security@debian.org. Otherwise, the situation might not be as dangerous as it seems.
On the topic of facilitating upgrades: perhaps we should emphasize the option to install and upgrade using SVN, which is probably very convenient for users that are comfortable with the command line. Moodle has this in the official documentation and I find it very useful [3]. SVN could also be handy as the backend for a user-friendly upgrade procedure, as it already deals with local modifications and such.
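For reference, the SVN workflow might look like the sketch below. The repository URL and branch name are assumptions from memory (verify them against the current documentation before use), and the script defaults to a dry run that only prints the commands it would execute.

```shell
#!/bin/sh
# Sketch of an SVN-based install/upgrade. DRY_RUN is on by default;
# set DRY_RUN= (empty) to execute the commands for real.
set -e
DRY_RUN="${DRY_RUN-1}"
run() { if [ -n "$DRY_RUN" ]; then echo "+ $*"; else "$@"; fi; }

WIKI_DIR="${WIKI_DIR:-/var/www/wiki}"
# Assumed repository layout; check the release notes for the stable branch.
REPO="http://svn.wikimedia.org/svnroot/mediawiki/branches/REL1_16/phase3"

if [ -d "$WIKI_DIR/.svn" ]; then
    run svn update "$WIKI_DIR"                      # merges upstream with local edits
else
    run svn checkout "$REPO" "$WIKI_DIR"
fi
run php "$WIKI_DIR/maintenance/update.php" --quick  # apply any schema changes
```

The appeal is exactly what Jacopo notes: `svn update` already handles merging local modifications, which a tarball upgrade does not.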
As someone who has had their code patched by the debian team, I'd like to take the time to bitch about this.
Firstly, their patches are often incorrect. Secondly, though they've patched my LDAP extension a number of times, I have *never* received a bug report or a patch from them for something they've fixed. It is extremely annoying to see a fix has been around that I could have used months before someone reports a problem to me. Beyond anything else this bothers me the most. They really need to be better community members in regards to this. Lastly, packaging and maintaining such an old version of MediaWiki does a disservice to us, and their users. We don't support versions of MediaWiki that old. I understand that Debian backports security fixes for MediaWiki, but they don't backport new features, and don't backport all bug fixes. Additionally, Debian doesn't backport security fixes for all extensions. Not all extension developers bother maintaining backwards compatibility, and the only possible way to get security fixes is to upgrade MediaWiki and the extension.
Please Debian, keep your version of MediaWiki up to date at least to the oldest stable release, and please send your fixes upstream when you find unfixed bugs.
Respectfully,
Ryan Lane
On 03/08/10 00:16, Lane, Ryan wrote:
Please Debian, keep your version of MediaWiki up to date at least to the oldest stable release, and please send your fixes upstream when you find unfixed bugs.
Debian Stable is stable in the sense that it doesn't change very often; it's not stable in the sense of having fewer bugs. If there was a way to fix this, it would have been done a long time ago. Debian is a weird, bureaucratic, conservative community, somewhat inscrutable to outsiders. It reminds me of Wikipedia.
On 04/08/10 02:45, Niklas Laxström wrote:
On 3 August 2010 18:14, Aryeh Gregor Simetrical+wikilist@gmail.com wrote:
I'm thankful that the Debian MediaWiki package at least *works*. Not that the same can be said of all their packages either (OpenSSL, anyone?). Maybe if we provided .debs and RPMs, people would be less prone to use the distro packages.
That just creates more problems:
- bad quality distro packages
- bad quality packages of our own (while we know MediaWiki, we are not experts in packaging)
- lots of confusion
Last time I looked at our Debian package, it was pretty bad. The custom patches were mostly unnecessary, or could be made unnecessary with a one-line hook, incorporated upstream. However, the worst thing about it was the fact that after you installed it, you then had to run the web-based installer, typing some very specific things into the database fields, in order to make it work.
Installing the package only installs the files, and upgrading the package only upgrades the files, neither operation will touch the database.
I decided that to fix the Debian package, there were two basic things that needed to happen:
1) Write a new installer that makes it possible for dpkg to trigger DB installs and upgrades.
2) Build a relationship with the Debian maintainer and, in time, perhaps take over their job.
Item 1 was my motivation to start the new-installer branch, but I didn't really get close to finishing it. Luckily some other people have picked up the ball and we might see it in 1.17, although the dpkg interface will probably have to wait until later.
Item 2 would be a procedure along the lines of:
* Write a new package that uses the features of the new installer.
* Ask the maintainer to upload this version, explaining how awesome it is.
* Integrate Debian package generation into the make-release script.
* After each minor release, nag the maintainer to apply the automatically generated patches.
* When they get sick of that, ask them to sponsor your request for Debian Developer status.
* Upload new packages to Debian on each new release.
The main targets would be Unstable, Testing and Ubuntu Universe. I think Stable is mostly unfixable and not worth bothering with.
I've written a few dpkg packages for Wikimedia's custom repository. It's tedious and there's a steep learning curve, but I don't think it's beyond the capabilities of our core dev team.
-- Tim Starling
My idea for an FHS-friendlier setup was based on storing the LocalSettings for all installed wikis inside /etc/mediawiki.d, all of them pulling from a CommonSettings.php where default overrides and extensions affecting all installs would be stored. The update process would just need to iterate over them, running update.php.
Then for the installer, we could ask the user to overwrite a file with the download, overwrite it ourselves from the web installer, or create another installer, this one to be run from the command line by dpkg.
Does anyone see a problem with that approach?
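Assuming that layout, the batch-upgrade step could be a small loop like the sketch below. The `--conf` option (which maintenance scripts use to select a settings file) is real, but the directory layout and paths here are assumptions to be checked against an actual package.

```shell
#!/bin/sh
# Sketch: upgrade every wiki configured under /etc/mediawiki.d by
# running update.php once per settings file. PHP and MW_DIR are
# overridable so the loop can be exercised without a real install.
set -e
upgrade_all() {
    CONF_DIR="${CONF_DIR:-/etc/mediawiki.d}"
    MW_DIR="${MW_DIR:-/usr/share/mediawiki}"
    PHP="${PHP:-php}"
    for conf in "$CONF_DIR"/*.php; do
        [ -e "$conf" ] || continue          # no wikis configured yet
        echo "Upgrading wiki configured by $conf"
        "$PHP" "$MW_DIR/maintenance/update.php" --conf "$conf" --quick
    done
}
upgrade_all
```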
On 06/08/10 09:10, Platonides wrote:
My idea for a FHS-friendlier setup was based in storing the LocalSettings for all installed wikis inside /etc/mediawiki.d, all of them pulling from a CommonSettings.php where default overrides and extensions affecting all installs would be stored.
That's basically what it does already, but it does it by patching the setup code. I'd rather see a distributed LocalSettings.php file which pulls in the necessary sub-config files. That can be done without any changes to our source.
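Such a distributed LocalSettings.php might be generated at package build time along the following lines. The conf-directory path and file layout are illustrative, not Debian's actual scheme; the point is just that the conf.d-style dispatch lives in shipped config, not in patched setup code.

```shell
#!/bin/sh
# Sketch: emit a LocalSettings.php that pulls in per-site sub-config
# files, conf.d-style, with no patch to MediaWiki's own setup code.
set -e
OUT="${OUT:-$(mktemp -d)/LocalSettings.php}"
cat > "$OUT" <<'EOF'
<?php
# Shared defaults first, then any per-site overrides.
require_once '/etc/mediawiki.d/CommonSettings.php';
foreach ( glob( '/etc/mediawiki.d/conf.d/*.php' ) as $subConf ) {
    require_once $subConf;
}
EOF
echo "Wrote $OUT"
```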
The update process would just need to iterate over them, running update.php.
Then for the installer we could ask the user to overwrite a file with the download, overwrite it ourselves from the web installer or create another installer, this one to be run from command line by dpkg.
Does anyone see a problem with that approach?
The web installer should not be a part of installation from a package at all. We should just get the wiki name from debconf, use the system locale as the language, and install it with defaults otherwise. Then the user will have a working wiki after install.
-- Tim Starling
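A postinst along the lines Tim suggests might look like this sketch. The debconf template name `mediawiki/wikiname` is hypothetical, and the script falls back to defaults when it isn't being run under debconf, so an unattended install still produces a working configuration.

```shell
#!/bin/sh
# Sketch: take the wiki name from debconf and derive the language from
# the system locale, so the package can install with working defaults.
set -e

if [ -n "${DEBIAN_HAS_FRONTEND:-}" ] && [ -f /usr/share/debconf/confmodule ]; then
    . /usr/share/debconf/confmodule
    db_get mediawiki/wikiname && WIKINAME="$RET"
fi
WIKINAME="${WIKINAME:-MediaWiki}"

# Language code from the system locale, e.g. "de" from "de_DE.UTF-8".
LANGCODE="$(echo "${LANG:-en}" | cut -d_ -f1 | cut -d. -f1)"

echo "Installing wiki '$WIKINAME' with interface language '$LANGCODE'"
```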
Tim Starling wrote:
Does anyone see a problem with that approach?
The web installer should not be a part of installation from a package at all. We should just get the wiki name from debconf, use the system locale as the language, and install it with defaults otherwise. Then the user will have a working wiki after install.
-- Tim Starling
I disagree. Existing packages do try to install it in /mediawiki or so, leaving it half-configured for the user to finish by running the web installer, but that's not smart. IMHO installing the mediawiki package should provide you with a command like install-mediawiki which would request the location and set up all the server aliases needed.
On 06/08/10 09:10, Platonides wrote:
My idea for a FHS-friendlier setup was based in storing the LocalSettings for all installed wikis inside /etc/mediawiki.d, all of them pulling from a CommonSettings.php where default overrides and extensions affecting all installs would be stored.
That's basically what it does already, but it does it by patching the setup code. I'd rather see a distributed LocalSettings.php file which pulls in the necessary sub-config files. That can be done without any changes to our source.
Oh, you mean distributed as in the package? That's another option. Other than the different instructions needed, debian_specific_config.patch could be done by playing with ENV vars.
On Fri, Aug 6, 2010 at 5:40 PM, Platonides Platonides@gmail.com wrote:
Tim Starling wrote:
Does anyone see a problem with that approach?
The web installer should not be a part of installation from a package at all. We should just get the wiki name from debconf, use the system locale as the language, and install it with defaults otherwise. Then the user will have a working wiki after install.
-- Tim Starling
I disagree. Existing packages do try to install it in /mediawiki or so, leaving it half-configured for the user to finish by running the web installer, but that's not smart. IMHO installing the mediawiki package should provide you with a command like install-mediawiki which would request the location and set up all the server aliases needed.
New installer should fix a lot of this. We'll be including a CLI installer in 1.17, which dpkg or whatever could call to handle the DB setup, etc. Shouldn't need to use the web installer *at all* at that point.
-Chad
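When that CLI installer lands, a maintainer script could drive the whole DB setup non-interactively, something like the sketch below. The option names are assumptions about the in-development installer (check `php maintenance/install.php --help` once 1.17 is out); the wrapper function is shown but not executed here.

```shell
#!/bin/sh
# Sketch: invoke the forthcoming CLI installer from a package script.
# PHP and MW_DIR are overridable so the wrapper can be exercised alone.
set -e
do_install() {
    PHP="${PHP:-php}"
    MW_DIR="${MW_DIR:-/usr/share/mediawiki}"
    "$PHP" "$MW_DIR/maintenance/install.php" \
        --dbtype mysql --dbname wikidb \
        --dbuser wikiuser --dbpass "$DB_PASS" \
        --pass "$ADMIN_PASS" \
        "$WIKI_NAME" "$ADMIN_USER"
}
# A postinst would set these (ideally from debconf) and then call:
#   DB_PASS=... ADMIN_PASS=... WIKI_NAME="My Wiki" ADMIN_USER=admin
#   do_install
```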
On 6 August 2010 22:40, Chad innocentkiller@gmail.com wrote:
New installer should fix a lot of this. We'll be including a CLI installer in 1.17, which dpkg or whatever could call to handle the DB setup, etc. Shouldn't need to use the web installer *at all* at that point.
\o/ THANK YOU, from the bottom of my crusty intranet sysadmin heart.
- d.
On Fri, Aug 6, 2010 at 5:40 PM, Platonides Platonides@gmail.com wrote:
I disagree. Existing packages do try to install it in /mediawiki or so, leaving it half-configured for the user to finish by running the web installer, but that's not smart. IMHO installing the mediawiki package should provide you with a command like install-mediawiki which would request the location and set up all the server aliases needed.
Getting the info from debconf seems like it makes much more sense than having a separate command you have to run.
Aryeh Gregor wrote:
On Fri, Aug 6, 2010 at 5:40 PM, Platonides Platonides@gmail.com wrote:
I disagree. Existing packages do try to install it in /mediawiki or so, leaving it half-configured for the user to finish by running the web installer, but that's not smart. IMHO installing the mediawiki package should provide you with a command like install-mediawiki which would request the location and set up all the server aliases needed.
Getting the info from debconf seems like it makes much more sense than having a separate command you have to run.
Can you install several instances of the same package (several wikis) with debconf?
On 08/08/10 00:22, Platonides wrote:
Can you install several instances of the same package (several wikis) with debconf?
I don't think that would be advisable. That's what maintenance/install.php is for. The package maintainer could add a symlink or wrapper script to /usr/bin if they thought that was useful.
It's much the same situation as with MySQL. The package installs a single data directory, and if you want more of them, you can use mysql_install_db.
The goal of a distro package is to offer easy access to simplified functionality. The full capabilities of the software should still be available to users who are capable of using them, at a level of difficulty more or less equivalent to what you'd get with a source install.
-- Tim Starling
On 03/08/10 00:01, Jacopo Corbetta wrote:
I haven't read all the documents, but have these researchers taken into account backported fixes?
No. Their work mostly revolves around defeating version number obfuscation by correlating various properties of the application with the version number. They scanned the Internet to demonstrate that their method works, and presented the version number distribution in passing. The security conclusions they drew from that distribution were not particularly rigorous.
My gut feeling is that the "preference" for 1.12 is simply due to its inclusion in Debian stable [1].
They mention seeing spikes in popularity for packaged versions.
The maintainer seems to be actively backporting security fixes [2], so while I agree that these versions may enjoy less community support, they should not be considered broken on the basis of the version number alone.
It's true that backports reduce the problem somewhat. But note that the Debian backports have probably not been reviewed to make sure that they fix the bugs they claim to fix. Or indeed, that they don't create new bugs that are even worse (as Kurt Roeckx did with his famous fix for some spurious valgrind warnings in OpenSSL).
-- Tim Starling
On Thu, Aug 5, 2010 at 10:13 AM, Tim Starling tstarling@wikimedia.org wrote:
Or indeed, that they don't create new bugs that are even worse (as Kurt Roeckx did with his famous fix for some spurious valgrind warnings in OpenSSL).
The onus isn't 100% on Debian; partial blame falls on the OpenSSL team for not saying "Hey, that's a stupid idea" when he asked about his 'fix'.
On Thu, Aug 5, 2010 at 11:37 AM, OQ overlordq@gmail.com wrote:
The onus isn't 100% on Debian; partial blame falls on the OpenSSL team for not saying "Hey, that's a stupid idea" when he asked about his 'fix'.
The one applying the patch bears full responsibility for what happens. If they don't understand the code, they shouldn't be patching it at all, they should be directing all patches upstream. The Debian maintainer in that case made no more than a cursory effort to upstream the patch, like a typical maintainer. If they had adopted the common-sense policy of not applying any patches except critical security fixes to anything without upstream review (where there was an active upstream), it would never have happened.
Trying to blame upstream in this case legitimizes the current broken status quo where maintainers of all major distros happily apply unnecessary patches that they don't understand, breaking things all over the place as a result. This was only a particularly breathtaking example of the kind of breakage that happens all the time, in MediaWiki just as well. What the Debian maintainer should have done is said "This is an upstream bug, and it's not critical, so take it upstream -- we'll pick up the fix from upstream if they accept it." Period.
(I'll grant that it's reasonable to make exceptions for the sake of platform integration, like modifying it so it will work with the distro's standard compilation options, changing file locations to match FHS, and otherwise getting it to play nice with the system -- *if* the upstream refuses to accept patches. This is a distro's job, and they do have to do it even if upstream doesn't play along. But that was not the case at all for the patch in question, just as it's not the case for a lot of the patches made downstream to MediaWiki.)