Isam Bayazidi wrote:
Alex Regh wrote:
However, my guess is also that this would require *a lot* of rewriting of the software, so at best this can be a long-term option. But not one I'd dismiss right away.
Do I understand from this that the MediaWiki software has a limitation in that regard?
Technically, we could set up every wiki on a separate server of its own.
However this would be more difficult to administer, and would require
300 servers in its strictest interpretation. ;) Since our resources are
limited, we want to provide as much as we can to all wikis rather than
favoring some, though generally we've put a lot more restrictions on the
English Wikipedia because it's the biggest and takes the most resources.
I don't know any details about Akamai (which is what Alex was
specifically talking about), but I believe it's some sort of distributed
caching system. Our biggest problems are on the database and apache
servers that do the work of dynamic page generation; more front-end
caches would be less helpful.
-- brion vibber (brion @ pobox.com)