A lot of the stuff sounds cool, so don't take it personally if I'm harsh. I'm presenting logical concerns as I see them.
Gregory Szorc wrote:
> In case anyone hasn't noticed, the number of MediaWiki extensions in existence has soared in the previous year. They are scattered all around the internet and it is a chore to make sure all of your extensions are up to date.
Well, perhaps one should be a bit more discerning about which MediaWiki extensions to pick. If they're pre-beta and don't have an update announce-list, is this something you really want to be running?
> Step 1: Overhaul how MediaWiki deals with extensions. Loading an extension via 'require_once' is silly and has all sorts of limitations (for example, if your extension file which modifies $wgExtensionFunctions is loaded from within a function, $wgExtensionFunctions won't actually get modified unless it is brought into scope of the calling function).
That shouldn't be a problem: the includes should be done unconditionally and the registered extension functions should serve only to initialize the extension.
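To make that concrete, here is the pattern I mean (a minimal sketch; MyExtension and wfMyExtensionSetup are hypothetical names):

  <?php
  # In LocalSettings.php: include the extension unconditionally at
  # file scope, never from inside a function, so the globals the
  # extension file touches really are globals.
  require_once( "$IP/extensions/MyExtension/MyExtension.php" );

  # In MyExtension.php: at top level, just register a setup callback.
  $wgExtensionFunctions[] = 'wfMyExtensionSetup';

  function wfMyExtensionSetup() {
      # Real initialization runs here, once MediaWiki is fully loaded.
  }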
> In addition, there is no easy way to tell if an extension is a special page extension, parser hook extension, combination, etc.
I agree, that is troublesome, and is addressed below.
> In my proposed system, MediaWiki extensions would all be derived from a base 'Extension' class. There would be interfaces that would allow extensions to become a SpecialPage extension, parser extension, hook extension, etc. Furthermore, if extensions were packaged as a class, we could give the base extension class useful variables, such as "sourceURL" which would allow developers to provide a URL to the most up-to-date version of an extension. Of course, the ultimate benefit to turning extensions into classes is that it would make developing extensions easier since OOP gives you a building block for your work, not a clean slate.
I'm not sure the Extension supertype would make much sense, as extensions in different areas of MediaWiki have radically different needs/APIs. As for SpecialPage, the class you describe already exists. sourceURL is already implemented with $wgExtensionCredits['extension-type'][] = array( 'url' => 'URL' ); (this needs to be better documented).
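For the record, a fuller credits entry looks something like this (a sketch; the first key is the extension type shown on Special:Version, and the name/author/url values here are made up):

  <?php
  # 'other' is one of the recognized extension types
  # ('specialpage', 'parserhook', ...); the rest is hypothetical.
  $wgExtensionCredits['other'][] = array(
      'name'        => 'MyExtension',
      'author'      => 'A. Developer',
      'url'         => 'http://www.example.com/MyExtension',
      'description' => 'What the extension does, in one line.',
  );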
Overall, read Tim Starling's recent proposal ( http://permalink.gmane.org/gmane.science.linguistics.wikipedia.technical/250... ) to overhaul large sections of code into modules in order to improve performance. Also, read about one developer's sentiments on breaking backwards compatibility with extensions ( http://permalink.gmane.org/gmane.science.linguistics.wikipedia.technical/250... ). Work will move forward, albeit slowly. Tread carefully.
> Step 2: Write a manager for MediaWiki that allows you to load and upgrade extensions remotely. Want to upgrade an extension? Just go to a special page, hit the button to refresh the list for updates, and click the checkbox next to the extension you want to update.
> This, of course, would be an extension itself.
I would argue that such a feature, though neat, would never see the light of day in svn.wikimedia.org, because there are far too many implications:
* Extension trust - what this essentially means is that you give a group (often one person) the right to write arbitrary code on your server. Sure, we could implement code diffs, but not everyone has the expertise to audit everything that goes on their server. Existing systems like the PEAR installer go through a regulatory process, and appeal mainly to people savvy enough to get PEAR installed in the first place.
* PHP code is writeable - It is often a good idea not to give your PHP scripts write access to the web directory. This prevents someone from exploiting an installed script and then writing something else into the directory. Granted, this only matters to those who keep an eye on security; the rest will just 777 their web directories...
* Running off the web is awkward - There's a reason why MediaWiki's upgrade.php is meant for the command line: upgrading can take a long time, and it's not good for web scripts to just hang like that when you need to upgrade.
* You can already do it via Subversion - `svn up`, anyone? (You do need SSH access, but it's mind-numbingly easy; see the sketch below. I use this method to keep a tip-top development copy of MediaWiki and its extensions on my computer as well as a production installation on a public website (Dreamhost for the win!).)
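To spell that out, the whole upgrade boils down to a couple of shell commands (the paths are hypothetical; assumes the wiki and its extensions are Subversion checkouts):

  $ cd /path/to/mediawiki
  $ svn up
  $ cd extensions/SomeExtension
  $ svn up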
> Critics out there will retort that this will slow things down. Yes, it won't be as fast as explicitly typing require_once in LocalSettings.php. However, the system could also be designed with speed in mind. For example, it would be possible to serialize all the loaded extension objects into a file (or shared memory) which is loaded for every page request. I take this approach with my new Farmer extension ( http://www.mediawiki.org/wiki/User:IndyGreg/Farmer ), which allows you to specify which extensions are loaded via a web interface. The performance hit is negligible.
That's Tim Starling's proposal again! Although: serializing the extension objects into a file doesn't remove the need to require them. That's largely irrelevant, however, because if includes are the issue, you really ought to install a compiler cache (APC, eAccelerator, and the like).
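To show why the require is still needed (a minimal sketch; the cache path and loader function are hypothetical):

  <?php
  $cacheFile = "$IP/cache/extensions.ser";

  if ( file_exists( $cacheFile ) ) {
      # The catch: unserialize() can only rebuild these objects if
      # their class definitions have already been require'd, so the
      # include cost is still paid on every request.
      $extensions = unserialize( file_get_contents( $cacheFile ) );
  } else {
      $extensions = wfLoadAllExtensionObjects(); # hypothetical loader
      file_put_contents( $cacheFile, serialize( $extensions ) );
  }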
> Thoughts?
You asked, I gave 'em. Feel free to retort!