Right now I'm seeing nice, fast responses, except that every once in a while everything slows to a crawl. If that's due to our script, there must be some really bad, really rare special function somewhere, which I doubt.
Maybe the slowdowns are due to spiders that hit our site and request several pages at once, in parallel, like many of these multithreaded programs do. I read somewhere that everything2.com has, for this very reason, disallowed spiders completely and doesn't even let Google index their site anymore.
Maybe we should search the server logs for bursts of rapid requests from the same IP and try to correlate those with the load averages?
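
Something like the rough sketch below is what I have in mind. It assumes the usual Apache "combined" access log format and a made-up log path, so adjust both to whatever our server actually writes; it just counts, per IP, how often more than a handful of requests arrive within a short window, and those burst counts could then be lined up against the load-average graphs.

    #!/usr/bin/env python
    # Rough sketch: count request bursts per IP from an Apache-style access log.
    # LOG_PATH, WINDOW and THRESHOLD are guesses -- tune them to our setup.
    import re
    import time
    from collections import defaultdict, deque

    LOG_PATH = "/var/log/apache/access.log"   # hypothetical location
    WINDOW = 10        # seconds
    THRESHOLD = 5      # requests within WINDOW that count as a burst

    # "1.2.3.4 - - [25/Mar/2003:14:02:11 +0000] ..." -> IP and timestamp
    line_re = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\]')

    def parse_time(stamp):
        # "25/Mar/2003:14:02:11 +0000" -> seconds since the epoch (ignores TZ)
        return time.mktime(time.strptime(stamp.split()[0], "%d/%b/%Y:%H:%M:%S"))

    recent = defaultdict(deque)   # ip -> timestamps of its recent requests
    bursts = defaultdict(int)     # ip -> number of bursts seen

    with open(LOG_PATH) as log:
        for line in log:
            m = line_re.match(line)
            if not m:
                continue
            ip, stamp = m.group(1), m.group(2)
            t = parse_time(stamp)
            q = recent[ip]
            q.append(t)
            # drop requests that fell out of the sliding window
            while q and t - q[0] > WINDOW:
                q.popleft()
            if len(q) >= THRESHOLD:
                bursts[ip] += 1
                q.clear()   # count each burst only once

    # IPs with the most bursts first; compare their timestamps to the load spikes
    for ip, n in sorted(bursts.items(), key=lambda kv: -kv[1]):
        print(ip, n)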
Axel
wikitech-l@lists.wikimedia.org