On Wed, Jul 10, 2002 at 10:42:35PM +0100, Imran Ghory wrote:
> On 10 Jul 2002, at 4:46, Jimmy Wales wrote:
> > But I know that this cannot continue indefinitely, particularly if the current situation leads to mistrust.
> Maybe a good way to progress would be to alter the software so that wikipedia can be run over multiple machines (at the rate wikipedia is growing, it will need to be distributed/load-balanced at some point anyway), with one machine handling the editing and the others just mirroring the static content/search/stats pages.
There isn't really that much static about wikipedia; almost all of the special pages are dynamic. If there are performance issues at the moment, that is not because the hardware cannot handle the load, but because the programming has, until recently, been, how shall I put this, less performance-oriented.
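To give a flavour of the kind of optimisation I mean, here is a rough sketch (Python purely for illustration; the function and page names are invented, not taken from the wiki code): caching the rendered output of an expensive special page for a short period, instead of recomputing it on every request.

    import time

    _cache = {}  # special page name -> (expiry timestamp, rendered HTML)

    def render_special_page(name, build, ttl=60):
        """Return cached HTML for `name`, calling the expensive
        `build()` at most once every `ttl` seconds."""
        now = time.time()
        hit = _cache.get(name)
        if hit is not None and hit[0] > now:
            return hit[1]          # still fresh, skip the database work
        html = build()             # the expensive dynamic query
        _cache[name] = (now + ttl, html)
        return html

    # e.g. render_special_page("RecentChanges", lambda: "<ul>...</ul>")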
Actually, I wonder if we should not do exactly the opposite: have one code base with one database that serves all wikipedias. I have the feeling that, in terms of software maintenance, the non-English wikipedias are getting a very raw deal indeed: reported bugs are fixed earlier on the English wikipedia, and major software changes always happen there first. I assume this creates a feeling of being left out in the cold. Having one common system would, in my opinion, also strengthen the sense of community.
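To make the idea concrete, here is a minimal sketch of what "one database for all wikipedias" could look like: every page row carries its language, and the request host decides which language to read. (Python with sqlite3 purely for illustration; the table, column, and host names are invented assumptions, not our actual schema.)

    import sqlite3

    # One shared database; every page row carries its language.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE cur (
        lang  TEXT NOT NULL,    -- 'en', 'de', 'nl', ...
        title TEXT NOT NULL,
        text  TEXT NOT NULL,
        PRIMARY KEY (lang, title))""")
    conn.execute("INSERT INTO cur VALUES ('nl', 'Hoofdpagina', 'Welkom!')")

    def fetch_page(host, title):
        """Resolve e.g. 'nl.wikipedia.org' to language 'nl' and look
        the page up in the one shared table."""
        lang = host.split(".", 1)[0]
        row = conn.execute(
            "SELECT text FROM cur WHERE lang = ? AND title = ?",
            (lang, title)).fetchone()
        return row[0] if row else None

    print(fetch_page("nl.wikipedia.org", "Hoofdpagina"))  # -> Welkom!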
Apart from that, we could also more easily tackle some practical issues, such as checking whether an uploaded file is still used anywhere in any of the wikipedias, and having centralized translation tables.
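For example, the upload check becomes a single query once every wiki lives in the same database. Continuing the sketch above (again, names are invented for illustration):

    def file_is_used(conn, filename):
        """Return (lang, title) of some page that still references the
        uploaded file, or None if no wikipedia uses it."""
        return conn.execute(
            "SELECT lang, title FROM cur WHERE text LIKE ?",
            ("%" + filename + "%",)).fetchone()

    # e.g. file_is_used(conn, "Logo.png") before allowing a deletion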
However, this is all far, far away, because we first need to update all wikipedias to the new code base.
--
Jan Hidders