[Mediawiki-l] robots.txt
Michael
mogmios at mlug.missouri.edu
Tue Oct 26 23:04:52 UTC 2004
It might be useful to include a robots.txt file that tells search
spiders not to bother with the action pages such as 'Edit'. While it
isn't hard to write this kind of file, it could be useful to ship one
as a starting point, since most wikis would need much the same
robots.txt anyway. My sites get hit by spiders several times a day, so
keeping them off pages they don't need to index cuts down on wasted
server time.
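For example, something along these lines would probably do it (just a
sketch, assuming the common setup where article pages are rewritten to
/wiki/ and edit/history links still go through /index.php -- adjust the
paths to your own install):

    User-agent: *
    Disallow: /index.php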
--
Michael <mogmios at mlug.missouri.edu>
http://kavlon.org