[Mediawiki-l] robots.txt

Raúl Vera irdvs at yahoo.es
Wed Oct 4 13:01:20 UTC 2006

Sy Ali <sy1234 at gmail.com> wrote:  On 9/25/06, Roger Chrisman wrote:
> But in the interest of short URLs, I serve my MediaWiki directly from
> site / without any /wiki/ or /w/ directories. So the above method would
> not work on my installation.
> Any ideas how I can exclude robots from crawling all my wiki's edit,
> history, talk, etc, pages *without* excluding its article pages?

I do the same thing, and I never did figure out the rules to disallow
the other sub-pages.
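One commonly suggested approach for a root-installed wiki relies on the fact that, even with short URLs, action pages (edit, history, diff) are still reached through index.php with query parameters, while plain article views are not. A sketch, assuming that rewrite setup:

```
# robots.txt sketch -- assumes short URLs map /PageName to the wiki,
# while edit/history/etc. links still point at /index.php?title=...&action=...
# robots.txt Disallow is prefix-matched, so this covers all index.php queries.
User-agent: *
Disallow: /index.php
```

Note this would not cover talk pages if they also use short URLs (e.g. /Talk:Foo); those would need their own Disallow lines per namespace.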

As I understand it, there are "nofollow" tags within the web pages
themselves, but I'm not certain they're being honoured.
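Those tags are presumably the robots meta element that MediaWiki puts in the head of non-article views such as edit and history pages, roughly:

```
<!-- approximately what MediaWiki emits on edit/history views;
     well-behaved crawlers neither index the page nor follow its links -->
<meta name="robots" content="noindex,nofollow" />
```

Whether a given crawler honours it is up to that crawler, but the major search engines do respect noindex,nofollow.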

Ing. Raúl Vera

