Alex Regh wrote:
> Hello!
> On Sat, 8 May 2004 10:30:43 +0200, Thomas R. Koll wrote:
>> Google is NOT read-only after it crawls the whole
>> world wide web.
A search engine is naturally a whole different and much more
distributable thing (see Grub) than a database-centric wiki.
> There is a company named Akamai out there, and as far as I know, they
> distribute websites like CNN and Microsoft and (part of) LiveJournal ;-),
> which are (especially the first) updated frequently and have some
> feedback features etc.
I can't claim any expertise on this, but as far as I understand it,
Akamai specialises in serving static content. In the case of
LiveJournal, this is mainly the user pictures: if you delete one and
upload a new one, it gets a new URL, but if you edit an entry, it
keeps the same URL, so LiveJournal entries are served from the central
DB and not from Akamai.
On Wikipedia nothing is immutable under a given URL (not even images),
except perhaps small things like the stylesheet, and those are hardly
worth distributing.
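The distinction here (a file that gets a fresh URL on every change can be
cached indefinitely by an edge network, while a mutable URL cannot) can be
sketched with a hypothetical content-addressed URL scheme; the function
name and path layout below are illustrative, not anything Akamai or
LiveJournal actually uses:

```python
import hashlib

def content_url(name: str, data: bytes) -> str:
    """Return a cache-safe URL: the path changes whenever the bytes do."""
    digest = hashlib.sha1(data).hexdigest()[:12]
    return f"/static/{digest}/{name}"

# Replacing a user picture produces a brand-new URL, so an edge cache
# can hold the old URL's content forever without ever serving it stale.
old = content_url("userpic.png", b"old picture bytes")
new = content_url("userpic.png", b"new picture bytes")
assert old != new

# A wiki page, by contrast, keeps one URL (e.g. /wiki/Main_Page) across
# edits, so an edge cache would need active invalidation on every save,
# which is why the page text has to come from the central database.
```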
> So theoretically it might be possible to use distributed servers for
> the WP.
No, I don't think that would be possible.
> And in the long term it might be a good idea, too; all the Wikipedias
> are growing fast, and have users from all over the world.
Fast growth and international users are, by themselves, no reason to
employ distributed servers. They are, however, a good reason to buy new
and better hardware and to improve and optimise the software. :-)
Timwi