----- Original Message -----
From: "Axel Boldt" axel@uni-paderborn.de
To: wikitech-l@nupedia.com
Sent: Friday, May 17, 2002 1:43 AM
Subject: [Wikitech-l] Cause for slowdowns: spiders?
| Right now, I'm seeing nice and fast responses, except every
| once in a while everything slows to a halt. If that's due to our
| script, then there must be some really bad, really rare special
| function somewhere. I doubt that.
|
| Maybe the slowdowns are due to spiders that hit our site and request
| several pages at once, in parallel, like many of these multithreaded
| programs do. I read somewhere that everything2.com for this very
| reason has disallowed spiders completely and doesn't even allow Google
| to index their site anymore.
Robots can be useful, as Google probably contributes a lot to bringing new visitors to the site, and for a community that is surely a useful thing. An issue worth considering is banning robots from special pages (Recent Changes, Talk, User pages, etc.) with an appropriate META tag. IMHO, robots and spiders should only be allowed to go over the articles.
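Concretely, that would mean something like the following in the HTML head of those pages (just a sketch; where exactly the page-rendering code would emit this is up to us):

```html
<!-- Emitted only on Special:, Talk:, User: and Recent Changes pages. -->
<!-- "noindex" keeps the page out of the search index; "nofollow"     -->
<!-- stops the crawler from following its links, e.g. the many diff   -->
<!-- and history links on Recent Changes.                             -->
<meta name="robots" content="noindex,nofollow">
```

Well-behaved crawlers like Google honor this tag, so article pages would stay indexed while the expensive dynamic pages are left alone.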
<snip>
regards, [[user:WojPob]]