Gabriel and I have been testing the squid code; we've got Larousse set up as a squid serving the test wiki at: http://larousse.wikimedia.org/
(Yes, with a "media": wikimedia.org, not wikipedia.org.)
Seems to be more or less working; we may set it up for en.wikipedia.org soon if things go well.
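For reference, an HTTP-accelerator (reverse proxy) setup of this kind on a Squid 2.5-era release looks roughly like the following sketch; the backend hostname is an assumption, not the actual config:

    # squid.conf (sketch): listen on port 80, forward cache misses
    # to the Apache backend serving the wiki
    http_port 80
    httpd_accel_host backend.example.org    # assumed backend address
    httpd_accel_port 80
    httpd_accel_single_host on              # single origin server
    httpd_accel_uses_host_header on         # keep Host: for name-based vhosts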
-- brion vibber (brion @ pobox.com)
On Feb 1, 2004, at 20:15, Brion Vibber wrote:
> Seems to be more or less working; we may set it up for en.wikipedia.org soon if things go well.
We gave it a quick trial run, but squid crashes after a couple of minutes under that load (despite being able to serve 390 pages per second on a simple benchmark).
Still looking into things...
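A simple benchmark of the sort mentioned above can be run with, e.g., ApacheBench; the tool and parameters here are assumptions, not the actual test:

    # 10000 requests, 50 concurrent; ab reports requests/second at the end
    ab -n 10000 -c 50 http://larousse.wikimedia.org/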
-- brion vibber (brion @ pobox.com)
On Feb 1, 2004, at 22:42, Brion Vibber wrote:
> We gave it a quick trial run, but squid crashes after a couple of minutes under that load (despite being able to serve 390 pages per second on a simple benchmark).
> Still looking into things...
We downgraded to an earlier revision that sort of works (doesn't crash!) but isn't caching things it really should, like images and stylesheets.
-- brion vibber (brion @ pobox.com)
On Feb 1, 2004, at 23:57, Brion Vibber wrote:
> We downgraded to an earlier revision that sort of works (doesn't crash!) but isn't caching things it really should, like images and stylesheets.
I recompiled the latest snapshot without ESI support and things seem to work right so far. It's now caching the JavaScript, CSS, and images as well as common page hits. Yay!
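For reference, ESI in the Squid 3 development snapshots is a compile-time option (--enable-esi), so leaving it out of the configure run is enough; the install prefix here is an assumption:

    # Build the snapshot without ESI support (no --enable-esi)
    ./configure --prefix=/usr/local/squid
    make && make install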
Hopefully as the cache grows it'll help take some load off.
Also, having working squids will make it easier to switch datacenters once the new machines are up. While DNS updates propagate, the machines at the old addresses can run squids that forward transparently to the new servers.
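Roughly, a box keeping an old address could stay up as a pure forwarder during the transition, along these lines (the new-datacenter hostname is an assumption):

    # squid.conf (sketch) on an old-address machine: accept requests
    # on the old IP and relay them to the servers in the new datacenter
    http_port 80
    httpd_accel_host new-dc.example.org     # assumed new datacenter address
    httpd_accel_port 80
    httpd_accel_uses_host_header on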
-- brion vibber (brion @ pobox.com)
wikitech-l@lists.wikimedia.org