Gerard Meijssen wrote:
> Domas describes the status quo, and he describes it well. That does not,
> however, detract one iota from the usefulness of doing this research.
> Mechanisms are being developed that may work at a fraction of our current
> (i.e. WMF) cost; it would be irresponsible for the WMF to be against such
> a research project. It does not matter whether you think the VU will
> succeed or not; what matters is that serious effort is put into this
> endeavour. Just wait and watch what will transpire when it does.
Research projects are great; if this one goes somewhere eventually, that's super, and if it doesn't, that's fine too. :)
Our own resources have to be invested in managing what we know works; we don't benefit from the flamefests on this list every few months when someone hears about SETI@home or BitTorrent, thinks it would be easy to apply the principle to a wiki, and concludes that if we aren't doing it we must be incompetent or wasting money, OMG! ;)
Good distributed hosting for large numbers of small, fast-changing objects like a wiki is not what we might call a "solved problem". If it's feasible at all, that's something we should leave to researchers better versed in the field for now.
In the *foreseeable* future, we expect the primary web site to continue to work much as it does now, with central servers and some limited distribution through centrally-administered proxy caching systems.
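What makes proxy caching workable for fast-changing wiki pages is that the application invalidates the caches itself; MediaWiki sends purge requests to our Squid caches whenever a page changes. A minimal sketch of that pattern in Python, with hypothetical host names and page path:

# Sketch: explicit cache invalidation on edit. Host names and the
# page path here are hypothetical stand-ins.
import http.client

PROXY_CACHES = ["cache1.example.org", "cache2.example.org"]  # hypothetical

def purge_from_caches(page_path):
    """Ask each front-end proxy cache to drop its copy of a page."""
    for host in PROXY_CACHES:
        conn = http.client.HTTPConnection(host, 80, timeout=5)
        try:
            # PURGE is a non-standard HTTP method, but Squid honors
            # it when its ACLs are configured to allow it.
            conn.request("PURGE", page_path,
                         headers={"Host": "wiki.example.org"})
            conn.getresponse().read()
        finally:
            conn.close()

purge_from_caches("/wiki/Main_Page")  # after a successful edit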
We can be more aggressive on other parts of the system, though.
Bulk downloads like data dumps could be done over BitTorrent, but they're not really a significant overall resource drain.
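To illustrate: seeding a dump over BitTorrent is mostly a matter of publishing a .torrent metainfo file alongside it. A rough sketch in Python follows; the tracker URL and dump filename are placeholders, and error handling is omitted.

# Sketch: build a single-file .torrent metainfo for a database dump.
import hashlib
import os

PIECE_LEN = 256 * 1024  # 256 KiB pieces, a common choice

def bencode(value):
    """Minimal bencoder covering the types a metainfo file needs."""
    if isinstance(value, int):
        return b"i%de" % value
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)
    if isinstance(value, str):
        return bencode(value.encode("utf-8"))
    if isinstance(value, list):
        return b"l" + b"".join(bencode(v) for v in value) + b"e"
    if isinstance(value, dict):
        out = b"d"
        for k in sorted(value):  # keys must appear in sorted order
            out += bencode(k) + bencode(value[k])
        return out + b"e"
    raise TypeError(type(value))

def make_torrent(path, tracker):
    pieces = b""
    with open(path, "rb") as f:
        while chunk := f.read(PIECE_LEN):
            pieces += hashlib.sha1(chunk).digest()  # one SHA-1 per piece
    info = {"name": os.path.basename(path),
            "length": os.path.getsize(path),
            "piece length": PIECE_LEN,
            "pieces": pieces}
    return bencode({"announce": tracker, "info": info})

with open("pages_current.xml.bz2.torrent", "wb") as out:
    out.write(make_torrent("pages_current.xml.bz2",
                           "http://tracker.example.org/announce"))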
Media files are half our bandwidth, so that's an area where we can see real gains.
Once media storage has been rearranged for better versioning stability (as already specced out), it can be cached much more aggressively, perhaps through content distribution networks such as Coral in addition to more traditional centrally-administered proxy caches.
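The gain comes from making media URLs version-stable: if a URL names exactly one immutable version of a file, caches can hold it essentially forever, and a network like Coral, which just rewrites the hostname, can serve it with no invalidation traffic at all. A hypothetical sketch in Python, where embedding a content hash in the URL is an illustrative assumption, not the actual spec referred to above:

# Sketch: content-addressed media URLs that are safe to cache
# aggressively. URL layout and header policy are illustrative
# assumptions, not the storage spec mentioned in this mail.
import hashlib

def media_url(data, filename):
    """Derive a version-stable URL from the file's own content."""
    digest = hashlib.sha1(data).hexdigest()
    # Any edit yields new bytes, hence a new hash and a new URL, so
    # an old URL never has to be purged from downstream caches.
    return "/media/%s/%s/%s" % (digest[:2], digest, filename)

def cache_headers():
    # Each URL names exactly one immutable version, so proxies and
    # CDNs may keep it for as long as they like.
    return {"Cache-Control": "public, max-age=31536000"}

data = open("Example.jpg", "rb").read()
print(media_url(data, "Example.jpg"), cache_headers())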
-- brion vibber (brion @ pobox.com / brion @ wikimedia.org)