As a neophyte, I found your comparisons very interesting. I think journalists commenting on our speed might find it interesting as well ;-) Thanks for this.
Anthere
Neil Harris wrote:
Just to take this a bit further, I thought I'd compare Wikipedia with one of the "well-run sites" that we are supposed to be competing with. Google is a good direct comparison, because of its dynamic content, with cacheable frequent queries.
Looking at the difference in traffic between Google and Wikipedia on Alexa shows that:
- Wikipedia has 300 page views per million
- Google has 16,000 page views per million
Thus, Google serves roughly 53 times as many page views as Wikipedia.
However,
- Wikipedia currently has 39 servers
- Google has an estimated 50,000 - 100,000 servers in its worldwide farm of clusters
Thus, Google has roughly 1250 - 2500 times as many servers as Wikipedia. [Source: http://www.tnl.net/blog/entry/How_many_Google_machines for an estimate for April last year, allowing for more recent expenditure]
Thus, we might regard Wikipedia as being roughly 24 to 48 times more "efficient" in its use of hardware than Google. Given that Google has spent over $250M on hardware, for our developers to compete with Google on anything like equal terms at our current traffic, we would need around 1000 - 2000 high-performance servers, at a cost of several million US$.
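For anyone who wants to check my arithmetic, the whole back-of-envelope calculation can be reproduced in a few lines of Python, using only the figures already quoted above (no new data):

```python
# Figures quoted earlier in this message (Alexa + server-count estimates).
wp_views, g_views = 300, 16_000          # page views per million users
wp_servers = 39
g_servers_lo, g_servers_hi = 50_000, 100_000

traffic_ratio = g_views / wp_views                # Google vs Wikipedia traffic, ~53x
server_ratio_lo = g_servers_lo / wp_servers       # ~1250x
server_ratio_hi = g_servers_hi / wp_servers       # ~2500x

# "Efficiency": how many more servers per unit of traffic Google uses.
eff_lo = server_ratio_lo / traffic_ratio          # ~24x
eff_hi = server_ratio_hi / traffic_ratio          # ~48x

# Parity: Google's fleet scaled down to Wikipedia's traffic share.
parity_lo = g_servers_lo / traffic_ratio          # ~940 servers
parity_hi = g_servers_hi / traffic_ratio          # ~1875 servers

print(f"traffic ratio: ~{traffic_ratio:.0f}x")
print(f"efficiency: ~{eff_lo:.0f}x to ~{eff_hi:.0f}x")
print(f"parity fleet: ~{parity_lo:.0f} to ~{parity_hi:.0f} servers")
```

Rounding gives the "24 to 48 times" and "1000 - 2000 servers" figures above; the exact parity numbers come out closer to 940 - 1875, so the quoted range is on the generous side.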
So, a reasonable answer to critics seems to be:
- the developers are already doing very well indeed coping with the
combination of extremely high demand and very limited resources
- they already know there are big growth and capacity problems, and are
working hard on scalability and reliability
- send money, rather than complaining
-- Neil