[Wikipedia-l] Re: [Wikitech-l] Cause for slowdowns: spiders?
wojtek pobratyn
wojtek.pobratyn at gmx.net
Fri May 17 12:25:24 UTC 2002
----- Original Message -----
From: "Axel Boldt" <axel at uni-paderborn.de>
To: <wikitech-l at nupedia.com>
Sent: Friday, May 17, 2002 1:43 AM
Subject: [Wikitech-l] Cause for slowdowns: spiders?
| Right now, I'm seeing nice and fast responses, except every
| once in a while everything slows to a halt. If that's due to our
| script, then there must be some really bad, really rare special
| function somewhere. I doubt that.
|
| Maybe the slowdowns are due to spiders that hit our site and request
| several pages at once, in parallel, as many of these multithreaded
| programs do. I read somewhere that everything2.com has disallowed
| spiders completely for this very reason and no longer even allows
| Google to index their site.
Robots can be useful: Google probably contributes a lot of new visitors
to the site, and for a community that is surely a good thing. An issue
worth considering is banning robots from the special pages (Recent
Changes, Talk, User, etc.) with an appropriate META tag. IMHO, robots
and spiders should only be allowed to crawl the articles themselves.
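For example, something like this META tag in the HTML head of each
page to be excluded should do it (this is the standard robots META
convention; where exactly the wiki script would emit it is a detail
I'm leaving open):

    <meta name="robots" content="noindex,nofollow">

A robots.txt file at the web root would be another option; the paths
below are only illustrative, not our actual URL layout:

    # keep all robots out of the non-article pages
    User-agent: *
    Disallow: /wiki/Special:
    Disallow: /wiki/Talk:
    Disallow: /wiki/User: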
<snip>
regards,
[[user:WojPob]]