[Foundation-l] Wikimedia and Environment
Domas Mituzas
midom.lists at gmail.com
Sun Dec 13 11:10:39 UTC 2009
Hi!!!
> 1. PHP is very hard to optimize.
No, PHP is much easier to optimize than that (read: performance-oriented refactoring).
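A toy sketch of what I mean by performance-oriented refactoring, in Python since that is the other language in this thread (illustration only; the function names are made up and none of this is MediaWiki code). The usual win is simply not redoing the same expensive work over and over:

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def normalize_title(title):
        # Stand-in for an expensive per-title step (hypothetical, not a MediaWiki API).
        return title.strip().replace(" ", "_").capitalize()

    def render_links_naive(titles):
        # Repeats the expensive step for every occurrence of the same title.
        return ["/wiki/" + t.strip().replace(" ", "_").capitalize() for t in titles]

    def render_links_refactored(titles):
        # Same output, but repeated titles are served from the cache.
        return ["/wiki/" + normalize_title(t) for t in titles]

    sample = ["main page", "main page", "sandbox"] * 10000
    assert render_links_naive(sample) == render_links_refactored(sample)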
> 3. Even Python is easier to optimize than PHP.
Python's main design idea is readability. What is readable is easier to refactor too, right? :)
> 4. The other question is, does it make sense to have such a
> centralized client-server architecture? We have been talking about
> using a distributed VCS for MediaWiki.
Lunatics with no idea of what goes on inside the engine talk about distribution. Let them!
> 5. Well, now even if MediaWiki is fully distributed, it will cost
> CPU, but that will be distributed. Each edit that has to be copied
> will cause work to be done. In a distributed system, even more work in
> total.
Indeed, distribution raises costs.
> 6. Now, I have been wondering anyway who the beneficiary of all
> these millions spent on bandwidth is; where does the money go, anyway? What
> about making a Wikipedia network and having the people who want to
> access it pay, instead of having us pay to give it away? With these
> millions you can buy a lot of routers and cables.
LOL. There's quite some competition in the network department, and it became an economy of scale (or of serving YouTube) long ago.
> 7. Now, back to the optimization. Let's say you were able to optimize
> the program. We would identify the major CPU burners and optimize them
> out. That does not solve the problem, because I would think that the
> PHP program is only a small part of the entire issue. The fact that
> the data is flowing in a certain wasteful way is the cause of the
> waste, not the program itself. Even if it were much more efficient
> at moving around data that is not needed, the data is still not needed.
We can have a new kind of Wikipedia: one where we serve blank pages and people imagine the content. We've done that with moderate success quite often.
> So if you have 10 people collaborating on a topic, only the results of
> that work will be checked into the central server. The decentralized
> communication would be between fewer parties and reduce the resources
> used.
Except that you still need a tracker to handle all of that and resolve conflicts, since there are still no good methods for resolving conflicts among a small number of untrusted entities.
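To make the arbitration point concrete, here is a toy line-level three-way merge in Python (my illustration only, with made-up example data; nothing like this runs in production). Even the trivial case ends with a conflict list that some coordinating party has to resolve:

    def three_way_merge(base, ours, theirs):
        # Assumes all three revisions have the same number of lines.
        merged, conflicts = [], []
        for i, b in enumerate(base):
            o, t = ours[i], theirs[i]
            if o == t:        # both sides agree (or neither touched the line)
                merged.append(o)
            elif o == b:      # only "theirs" changed it
                merged.append(t)
            elif t == b:      # only "ours" changed it
                merged.append(o)
            else:             # both changed it differently: needs arbitration
                merged.append(None)
                conflicts.append(i)
        return merged, conflicts

    base   = ["PHP is slow.", "MediaWiki is centralized."]
    ours   = ["PHP is fine.", "MediaWiki is centralized."]
    theirs = ["PHP is awful.", "MediaWiki is centralized."]
    print(three_way_merge(base, ours, theirs))
    # ([None, 'MediaWiki is centralized.'], [0]) -- line 0 still needs someone trusted to decide.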
> see also:
> http://strategy.wikimedia.org/wiki/Proposal:A_MediaWiki_Parser_in_C
How much would that save?
Domas