Hi,
What happened to robots.txt? Why is this happening?
$ lynx -dump http://en.wikipedia.org/robots.txt
# Special robots.txt for nl.wikimedia.org
# No crawlers
User-agent: *
Disallow: /
No Google, no Wayback Machine, no nothing... )-:
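The quoted `User-agent: *` / `Disallow: /` pair is the broadest possible block: every path is off-limits to every compliant crawler. A quick sketch with Python's standard-library robots.txt parser (feeding it the two quoted rules directly, rather than fetching the live file) shows how a well-behaved bot would read it:

```python
from urllib.robotparser import RobotFileParser

# Parse the two rules quoted above instead of fetching the live file.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# With "Disallow: /" applied to all user agents, nothing may be fetched.
print(rp.can_fetch("Googlebot", "http://en.wikipedia.org/wiki/Main_Page"))  # False
print(rp.can_fetch("ia_archiver", "http://en.wikipedia.org/"))              # False
```

Any crawler that honors the Robots Exclusion Protocol, including Google's and the Wayback Machine's, would get the same answer and stop indexing the site entirely.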
regards, Gerrit.
Gerrit Holl wrote:
> Hi,
> What happened to robots.txt? Why is this happening?
>
> $ lynx -dump http://en.wikipedia.org/robots.txt
> # Special robots.txt for nl.wikimedia.org
> # No crawlers
> User-agent: *
> Disallow: /
>
> No Google, no Wayback Machine, no nothing... )-:
> regards, Gerrit.
But better speed, thanks to a sharp drop in visitors.
It looks like a mistake. I think the idea was to hide only nl.wikimedia.org from the search engines, but it is active for all wikis.
Walter
wikipedia-l@lists.wikimedia.org