On 9/25/06, Roger Chrisman <roger(a)rogerchrisman.com> wrote:
But in the interest of short URLs, I serve my MediaWiki directly from
the site root (/) without any /wiki/ or /w/ directories, so the method
above would not work on my installation.
Any ideas on how I can exclude robots from crawling all my wiki's edit,
history, talk, etc. pages *without* excluding its article pages?
I do the same thing, and I never did figure out the robots.txt rules to
disallow those sub-pages.
As I understand it, there are "nofollow" tags within the web pages
themselves, but I'm not certain that's being honoured.
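If it helps, the tag MediaWiki emits in the <head> of its edit and
history pages looks like this (you can confirm by viewing the source
of any edit page on your wiki):

    <meta name="robots" content="noindex,nofollow" />

The major crawlers do claim to honour noindex/nofollow meta tags, but
it's per-crawler behaviour, so robots.txt rules are the more certain
belt-and-braces approach.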