Hi!
> Wikimedia servers would still have to do a tiny bit of work, basically
> sign stuff and bootstrap the peer lists. It could be built, and for a
> lot less than 1.5 million.
Has anyone here talking about a distributed MediaWiki ever realized what tasks are actually being worked on in our application cluster? Could you please stop speaking bullshit, packaged as "clever stuff", and using the foundation-l audience to support you?
Serving a wiki isn't hosting an .iso file, where of course bandwidth is the main cost and is easy to offload. ISO files don't change, and people don't care how fast an ISO download starts, because the transfer is long enough to forget all startup costs.

Serving a wiki isn't searching for aliens. If someone turns off their computer or their DSL goes down, the aliens won't disappear, but the wiki request will. Nobody really cares about an individual packet of alien data, because it is sent to multiple nodes: some will reply, some won't.

Serving a wiki isn't serving a personal website. It is not a single person editing; there is a great deal of conflict resolution, possible race conditions, versioning, and metadata (a sketch of the basic conflict check follows below).

Serving a wiki isn't serving a conventional media website, because its load patterns evolve far more organically, with accidental surges. Content formats also come bottom-up, requiring agile development of the systems behind them.

Serving a wiki means delivering user-contributed content that thousands collaborated on, within a few tens of milliseconds. We do succeed at this mission, and every time we increased the responsiveness of the site, more users came.
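To make the conflict-resolution point concrete, here is a minimal sketch in Python of the check a wiki has to run on every single save. All names here are hypothetical and this is nowhere near MediaWiki's actual implementation; the point is only that the check has to be atomic against the latest revision.

# A minimal sketch (hypothetical names, not MediaWiki's real code) of
# the optimistic concurrency check a wiki runs on every save: the edit
# carries the revision it was based on, and the save is rejected if
# someone else got there first.

class EditConflict(Exception):
    pass

class Page:
    def __init__(self):
        self.revisions = [""]          # revision 0 is the empty page

    @property
    def head(self):
        return len(self.revisions) - 1

    def save(self, base_rev, new_text):
        # Centralized, this compare-and-append happens inside a single
        # database transaction; spread across volunteer peers, the same
        # check turns into a consensus problem.
        if base_rev != self.head:
            raise EditConflict(
                "page is at rev %d, edit was based on rev %d"
                % (self.head, base_rev))
        self.revisions.append(new_text)
        return self.head

page = Page()
page.save(base_rev=0, new_text="first draft")          # succeeds, rev 1
try:
    page.save(base_rev=0, new_text="concurrent edit")  # stale base
except EditConflict as e:
    print("conflict:", e)

Centralized, that compare-and-append is one cheap transaction. Push it out to thousands of unreliable peers and every save becomes distributed consensus, which is exactly the cost the @home proposals keep ignoring.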
Please, if you ever again suggest building some distributed wikipedia@home, please, please, please do some research first. Sure, you may look cool and trendy, but... that doesn't help with our goals.
There are no big systems even a little bit similar to Wikipedia that are distributed. If you can point me to any, I'd gladly review their ideas. For now, there are none.
BR,