Hi, can anyone help me with my robots.txt? Its contents read as follows:
User-agent: *
Disallow: /Help
Disallow: /MediaWiki
Disallow: /Template
Disallow: /skins/
But it is not blocking pages like:
- http://www.dummipedia.org/Special:Protectedpages
- http://dummipedia.org/Special:Allpages
and external pages like:
- http://www.stumbleupon.com/
- http://www.searchtheweb.com/
As you can see, my robots.txt did not block these pages. Also, should I block the print version to prevent what Google calls "duplicate content"? If so, how?
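Would something along these lines be the right approach? This is just a rough sketch I put together; I'm assuming that Special pages live directly under /Special:, that normal article URLs don't go through index.php, and that the print version is served as index.php?title=...&printable=yes (I'm not sure any of that is correct for my setup):

User-agent: *
# existing rules
Disallow: /Help
Disallow: /MediaWiki
Disallow: /Template
Disallow: /skins/
# block Special pages (prefix match on the path)
Disallow: /Special:
# block index.php URLs, which would cover edit, history, and printable views
Disallow: /index.php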
Response will be very much appreciated.
PM Poon