Sy Ali <sy1234(a)gmail.com> wrote: On 9/25/06, Roger Chrisman wrote:
But in the interest of short URLs, I serve my MediaWiki directly from
the site root, without any /wiki/ or /w/ directories. So the above
method would not work on my installation.
Any ideas how I can exclude robots from crawling all my wiki's edit,
history, talk, etc. pages *without* excluding its article pages?
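
A sketch of one possible robots.txt, assuming a root-level install
where articles resolve to short paths like /Some_Article while edit,
history and diff views all go through the script entry point (e.g.
/index.php?title=Some_Article&action=edit), and assuming the default
English namespace names:

    User-agent: *
    # Everything routed through the script entry point
    # (action=edit, action=history, diffs, old revisions):
    Disallow: /index.php
    # Talk namespaces also resolve to short root-level URLs,
    # so they need their own prefix rules:
    Disallow: /Talk:
    Disallow: /User_talk:
    Disallow: /MediaWiki_talk:

Plain robots.txt matching is prefix-only; wildcard patterns such as
Disallow: /*&action= are a non-standard extension that only some
crawlers (Google's, for one) understand.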
I do the same thing, and I never did figure out the rules to disallow
the other sub-pages.
As I understand it, there are "nofollow" robots meta tags within the
web pages themselves, but I'm not certain they're being honoured.
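
For what it's worth, MediaWiki does emit a robots meta tag in the head
of its edit and history views, along the lines of:

    <meta name="robots" content="noindex,nofollow" />

Compliant crawlers will keep such pages out of their index, but unlike
robots.txt this only takes effect after the page has already been
fetched.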
--
Ing. Raúl Vera
CIFH - UNSE