Hi All,
Wikipedia's robots.txt file (http://www.wikipedia.org/robots.txt) excludes robots from action pages (edit, history, etc.) with this:
User-agent: *
Disallow: /w/
But in the interest of short URLs, I serve my MediaWiki directly from the site root (/) without any /wiki/ or /w/ directories, so the above method won't work on my installation.
Any ideas how I can exclude robots from crawling all my wiki's edit, history, talk, etc. pages *without* excluding its article pages?
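For illustration, I imagine something like the following might handle the edit and history pages, if those still go through /index.php on my install (e.g. /index.php?title=Foo&action=edit) while article views use the short root URLs. I'm not sure that assumption holds with my setup, and it would not cover talk pages, since those live at ordinary titles:

User-agent: *
# Block URLs served through index.php (edit, history, diff, etc.);
# short article URLs at the root are left crawlable. This assumes
# action links on my install really do point at index.php.
Disallow: /index.php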
Thanks,
Roger Chrisman http://Wikigogy.org (MediaWiki 1.6.7)