[Wikipedia-l] robots and spiders

General Wesc (LKBM) genwesc at subdimension.com
Fri May 17 20:04:53 UTC 2002


The Cunctator wrote:
> On 5/17/02 3:26 PM, "Chuck Smith" <msochuck at yahoo.com> wrote:
[Snip]
>>
>>I COMPLETELY disagree with this.  Let the robots crawl
>>everything.  It's better that someone finds one of our
>>Talk or User pages and cruises on over to our main
>>site than to simply find a completely different website!
>>
>>Chuck
>>
> 
> I agree with Chuck, strongly.
> 

I have to wonder, though... if a spider goes to Recent Changes and then to 
"Last 5000 changes" (and last 90 days, and last 30 days, and last 2500 
changes, and last 1000 changes, and every such combination), it seems to 
me the server load could get pretty high. Perhaps talk pages should be 
spidered, but not Recent Changes or the page histories (diffs/changes).
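
If we went that route, a few lines in robots.txt should do it. The paths 
below are only placeholders (I don't know offhand which URLs the scripts 
actually use for Recent Changes and the history/diff views), but the idea 
would be something like:

  User-agent: *
  # Keep spiders off the expensive dynamic views (placeholder paths)
  Disallow: /wiki/Special:Recentchanges
  Disallow: /w/wiki.phtml
  # Articles, Talk: and User: pages elsewhere stay crawlable

Any robot that honours the Robots Exclusion Protocol would then skip those 
URLs but still index the articles and talk pages.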

-- 
General Wesc,
http://wescnet.cjb.net/



