The Cunctator wrote:
> Again, if we surmise that spiders are causing slowdowns, we should be
> able to find evidence for that BEFORE we block parts of the site from
> them. And even then we should see if the fault lies in the site's code.
I think this is right, although blocking them from 'edit' links doesn't seem harmful. Certainly, it's good for spiders to hit 'Recent Changes', and often, so that new and updated pages get picked up quickly.
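In robots.txt terms that might look something like the sketch below. The edit-link path here is purely hypothetical and would have to match however the wiki actually forms its URLs, and not every crawler matches query strings in Disallow rules:

    # Hypothetical path -- adjust to the real edit-link URL scheme.
    User-agent: *
    Disallow: /wiki.cgi?action=edit
    # 'Recent Changes' and the articles themselves stay crawlable by default.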
> Spiders simulate high traffic well, and that's something that Wikipedia
> should be able to handle.
Right.
I'll do some research to determine whether spiders are actually causing any problems, but based on my experience running high-traffic sites, I think it's pretty unlikely.
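One quick way to do that research is to count what share of requests in the access log come from spider user-agents. A rough sketch in Python, assuming an Apache-style combined log; the log path and the spider substrings are guesses to adjust:

    import re
    from collections import Counter

    LOG_PATH = "/var/log/apache/access.log"   # hypothetical location
    SPIDER_MARKERS = ("googlebot", "slurp", "ia_archiver", "crawler", "spider")

    # In the combined log format, the user-agent is the last quoted field.
    ua_pattern = re.compile(r'"([^"]*)"\s*$')

    counts = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            match = ua_pattern.search(line)
            agent = match.group(1).lower() if match else "-"
            is_spider = any(marker in agent for marker in SPIDER_MARKERS)
            counts["spider" if is_spider else "other"] += 1

    total = sum(counts.values()) or 1
    for kind, n in counts.most_common():
        print(f"{kind}: {n} requests ({100.0 * n / total:.1f}%)")

If spiders turn out to be a small fraction of total hits, that would support the view that the slowdowns lie elsewhere.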
--Jimbo