[Wikipedia-l] Re: robots.txt - why disallow everyone, everywhere?

Walter Vermeir walter at wikipedia.be
Fri Mar 25 20:58:59 UTC 2005


Gerrit Holl schreef:
> Hi,
> 
> What happened to robots.txt?
> Why is this happening?
> 
> $ lynx -dump http://en.wikipedia.org/robots.txt
> # Special robots.txt for nl.wikimedia.org
> # No crawlers
> 
> User-agent: *
> Disallow: /
> 
> No google, no Wayback Machine, no nothing... )-:
> 
> regards,
> Gerrit.

But better speed, thanks to a sharp drop in visitors.

It looks like a mistake. I think the idea was to hide only
nl.wikimedia.org from the search engines, but it is active for all wikis.
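Presumably the intent was a per-host setup: a blocking robots.txt served only for nl.wikimedia.org, with the other wikis left open to crawlers. A hypothetical sketch of the two files (the actual server configuration is not shown in this thread):

```
# robots.txt intended only for nl.wikimedia.org: block all crawlers
User-agent: *
Disallow: /

# robots.txt for every other wiki (e.g. en.wikipedia.org):
# an empty Disallow permits crawling of the whole site
User-agent: *
Disallow:
```

The bug would then be that the first, blocking file was served for every hostname instead of just nl.wikimedia.org.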


Walter
