[Mediawiki-l] Robot.txt
ekompute
ekompute at gmail.com
Sat Feb 7 12:53:18 UTC 2009
Hi, can anyone help me with my robots.txt? Its contents read as
follows:
User-agent: *
Disallow: /Help
Disallow: /MediaWiki
Disallow: /Template
Disallow: /skins/
But it is not blocking pages like:
- http://www.dummipedia.org/Special:Protectedpages
- http://dummipedia.org/Special:Allpages
and external pages like:
- http://www.stumbleupon.com/
- http://www.searchtheweb.com/
As you can see, my robots.txt did not block these pages. Also, should I block
the print versions to prevent what Google calls "duplicate content"? If so,
how?
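For what it's worth, a robots.txt only applies to URLs on the site that serves it, so it can never block external pages such as stumbleupon.com. For the special pages and print versions, one possible sketch (assuming default MediaWiki short URLs; the wildcard lines are an extension honored by Googlebot and most major crawlers, not part of the original robots.txt standard):

```
User-agent: *
Disallow: /Help
Disallow: /MediaWiki
Disallow: /Template
Disallow: /skins/
# Prefix match covers Special:Allpages, Special:Protectedpages, etc.
Disallow: /Special:
# Printable versions (wildcard syntax is a non-standard extension):
Disallow: /*&printable=yes
Disallow: /*?printable=yes
```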
Any response will be very much appreciated.
PM Poon