I still believe my questions have not been answered adequately. However,
Why do developers have such privileged access to the source code
So that they can actually improve it. I don't know what alternative you're suggesting.
This question cannot be viewed outside of the context of the rest of the sentence, so this answer is invalid.
and the community such little input?
Because the community can't write code, or won't.
False: see the Extension Matrix.
Features will not get implemented unless someone writes the code. Therefore, anyone who cannot write the code themselves will not get their features implemented unless they happen to convince someone with commit access. Anyone who writes reasonably good code and shows mild commitment to the project is given commit access and becomes a developer, if they so choose.
That the core developers must approve the code with little to no oversight is exactly my point. The development of MediaWiki should not be based on what the core developers believe the flavor of the day is. It leads to monstrosities such as the current parser. If the community funds MediaWiki development, they should have a very strong say in what features get implemented.
Why must the community 'vote' on extensions such as Semantic MediaWiki
The community is not being asked to vote on SMW, as far as I've heard. If they don't show enough interest, however, it might not be worth the time of one of the few possible extension reviewers to try reviewing such a huge extension.
The developers should not be the ones deciding what their time should be spent on, especially if it leads to irrational choices. And I am under the impression that the various-language Wikipedias can enable SMW if they reach a consensus. Is that wrong?
and yet the developers can implement any feature they like, any way they like it?
No developer can implement any feature without review by Brion.
This is a *major* problem with MediaWiki.
If a developer were to commit a feature as large as SMW, it would be disabled or reverted, for the same reason as given above: nobody with time to review. This has happened with a number of features at various points, like category and image redirects/moves, and rev_deleted. They were all committed by developers but haven't been enabled (or have by now, but weren't for a pretty long time).
And yet major features have been implemented in MediaWiki on developer whim. If I were to compile a list of such features, I might go to you for input. I feel like I am preaching to the choir! I'm curious: of all the possible "improvements" to MediaWiki, why do you feel the horrifying "parser" functions were chosen? For the increase in usability? Pfffft.
Why is this tool not being tested on Wikipedia, right now?
http://wiki.ontoprise.com/ontoprisewiki/index.php/Image:Advanced_ontology_br...
Because it hasn't met review for enabling on Wikipedia, and likely won't without major structural changes (for performance reasons). Wikimedia handles 70,000 requests per second peak on 400 servers or so. You cannot do that if you willy-nilly enable code that hasn't been carefully tested in a suitably large production environment. And no non-Wikimedia wiki is anywhere close to the size of Wikipedia.
I do not believe the job of the core developers should be choosing what extensions are enabled. If an extension appears to solve the usability issue, and yet it does not scale, their job is to scale it. And when expert PHP developers write extensions, give talks at Wikimania, provide community support, and do what it takes to develop a thorough understanding of MediaWiki, they should be given a larger voice. Much larger than being ignored altogether.
I would like to make clear that I believe the usability issue has largely been solved, and the community is just waiting for the core developers, who have kept a tight lock and key on the source code, to recognize that.
Can you propose any tenable alternative development model that wouldn't overload the servers or crash the site when people upload code that's buggy or just doesn't scale? We have enough of that checked in with our current procedures. It's only kept at bay because everything is reviewed by one of a few highly trusted people, who have worked on MediaWiki and Wikimedia for several years and are intimately familiar with the details of how it works and what's been done before.
You cannot escape that review barrier. Every open-source project has it, and must have it, to avoid their code becoming a complete mess.
I do not dispute review. I dispute the fact that the core developers only review code that suits their fancy.
On Fri, Jan 9, 2009 at 6:13 PM, Brian Brian.Mingus@colorado.edu wrote:
I am skeptical of the current development process. That is because it has led to the current parser, which is not a proper parser at all, and includes horrifying syntax.
The current parser is inherited from somewhere between 2001 and 2003. It's possibly even inherited from UseModWiki. It was developed before the current development process was in place, and so has nothing to do with it.
This is partly false. There have been several efforts to write a proper parser, and the current parser has undergone major structural changes. I don't believe a computer scientist would have a huge problem writing a proper parser. Are any of the core developers computer scientists?
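To illustrate what "a proper parser" means here, below is a minimal, hypothetical sketch of a grammar-based approach: a tokenizer plus a recursive-descent parser that turns a tiny subset of wiki markup (just '''bold''' and plain text) into an explicit syntax tree. This is purely illustrative and is not the MediaWiki parser or any proposed replacement; a real wikitext parser would need to handle far more constructs and much messier ambiguity.

```python
# Illustrative sketch only: a tiny recursive-descent parser for a
# made-up subset of wiki markup ('''bold''' and plain text).
# Not the MediaWiki parser; all names here are hypothetical.

def tokenize(src):
    """Split source into ("BOLD", "'''") markers and ("TEXT", run) tokens."""
    tokens = []
    i = 0
    while i < len(src):
        if src.startswith("'''", i):
            tokens.append(("BOLD", "'''"))
            i += 3
        else:
            j = i
            while j < len(src) and not src.startswith("'''", j):
                j += 1
            tokens.append(("TEXT", src[i:j]))
            i = j
    return tokens

def parse(tokens):
    """Parse tokens into a list of nodes: ('text', s) or ('bold', children)."""
    pos = 0

    def parse_inline(stop_on_bold):
        # Consume tokens until end of input or, when inside a bold span,
        # until the closing ''' marker.
        nonlocal pos
        nodes = []
        while pos < len(tokens):
            kind, val = tokens[pos]
            pos += 1
            if kind == "TEXT":
                nodes.append(("text", val))
            else:  # BOLD marker
                if stop_on_bold:
                    return nodes  # closing marker of the current span
                nodes.append(("bold", parse_inline(True)))
        return nodes  # end of input (tolerates an unclosed span)

    return parse_inline(False)

# parse(tokenize("a '''b''' c"))
# → [('text', 'a '), ('bold', [('text', 'b')]), ('text', ' c')]
```

The point of the exercise: once the markup is an explicit tree, rendering, validation, and alternative output formats all become straightforward tree walks, instead of the layered regex substitutions the current parser relies on.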
On Fri, Jan 9, 2009 at 6:59 PM, Brian Brian.Mingus@colorado.edu wrote:
I believe it may be the case that the often bizarre idiosyncrasies of MediaWiki were implemented because the developers were spread out around the world, in isolation, communicating only over IRC and sometimes e-mail. I know there are yearly developer sprints at Wikimania, but I do not know about the daily development environment at the offices, and whether development continues in a largely isolated fashion.
The large majority of new code is written by volunteers in their spare time. These volunteers are not going to be willing or able to move to a centralized place to improve communication, and Wikimedia cannot afford to drop them. In any event, communication over IRC, e-mail, and websites is the universal standard in the open source world, and has resulted in a large number of unquestionably high-quality products, like the Linux kernel and Firefox.
I do not believe that is how Firefox is developed. The Linux kernel is another story: it has proper oversight and Torvalds's "network of trust", 15 crack developers whom he knows well and who have written exceptional-quality code for him for many years.
I don't know what you want -- more involvement of the community (which is distributed across the world), or less communication by purely electronic means? You can't have both.
The EP suggestion was only related to how the developers in the SF offices should work, so that funds would not be wasted.