[Mediawiki-l] robots.txt

Sy Ali sy1234 at gmail.com
Sun Oct 1 08:38:15 UTC 2006


On 9/25/06, Roger Chrisman <roger at rogerchrisman.com> wrote:
> But in the interest of short URLs, I serve my MediaWiki directly from
> the site root (/) without any /wiki/ or /w/ directories, so the above
> method would not work on my installation.
>
> Any ideas how I can exclude robots from crawling all my wiki's edit,
> history, talk, etc, pages *without* excluding its article pages?

I do the same thing, and I never did figure out the rules to disallow
the other sub-pages.
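
One thing I have been meaning to try is blocking index.php itself, on
the assumption that article pages are rewritten to clean paths (e.g.
/Page_title) while the edit, history and other action links still go
through index.php with a query string:

    User-agent: *
    Disallow: /index.php

Since robots.txt matching is by URL prefix, that single line should
cover /index.php?title=...&action=edit and the other action URLs
without touching the rewritten article URLs. Talk pages live at
ordinary paths (e.g. /Talk:Page_title), so they would need their own
Disallow lines per namespace. Whether this actually fits depends on
how the rewrite rules are set up.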

As I understand it, there are "nofollow" tags within the web pages
themselves, but I'm not certain they're being honoured.
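
The tag in question is a robots meta element in the page head; on edit
and history views MediaWiki normally emits something like:

    <meta name="robots" content="noindex,nofollow">

Well-behaved crawlers should skip indexing those pages, but honouring
the tag is entirely up to the robot, which is why a robots.txt rule may
still be worth having.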


