[Mediawiki-l] robots.txt

Raúl Vera irdvs at yahoo.es
Wed Oct 4 13:01:20 UTC 2006



Sy Ali <sy1234 at gmail.com> wrote:  On 9/25/06, Roger Chrisman wrote:
> But in the interest of short URLs, I serve my MediaWiki directly from
> the site root (/) without any /wiki/ or /w/ directories, so the above
> method would not work on my installation.
>
> Any ideas how I can exclude robots from crawling all my wiki's edit,
> history, talk, etc. pages *without* excluding its article pages?

I do the same thing, and I never did figure out the rules to disallow
the other sub-pages.
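
One thing that might work for a root install, though it's only a sketch
and I haven't tested it: with short URLs served from /, plain article
views come from paths like /Some_Article, while edit, history and the
other actions still go through /index.php?title=...&action=... (assuming
the usual short-URL rewrite that only handles article views). Since
robots.txt Disallow lines are simple prefix matches, blocking index.php
should catch all of those action URLs without touching the articles:

    # keep crawlers out of edit/history/etc. views, which all hit index.php
    User-agent: *
    Disallow: /index.php

Talk and other namespace pages served straight from / would need their
own prefix lines, e.g. Disallow: /Talk: and Disallow: /User_talk:, one
per namespace you want hidden.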

As I understand it, there are "nofollow" robots meta tags within the web
pages themselves, but I'm not certain they are being honoured.
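
For what it's worth, what MediaWiki actually emits on edit and history
views (as far as I can tell) is a robots meta tag in the page head rather
than link-level nofollow attributes:

    <meta name="robots" content="noindex,nofollow" />

You can check whether it is there by viewing the HTML source of any
&action=edit or &action=history page. Well-behaved crawlers should honour
the noindex, but a robots.txt rule is still the safer belt-and-braces
option.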



--
Ing. Raúl Vera
CIFH - UNSE
 		

