Assuming you have Wikimedia-style URLs:

User-agent: *
Disallow: /w/
Disallow: /wiki/Special:Search
Disallow: /wiki/Special:Random
Your server will be able to handle a lot more if you set up as much caching as you can: http://www.mediawiki.org/wiki/Manual:Cache. No sense letting all that spare RAM rot. :)
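For instance, a minimal caching setup in LocalSettings.php might look like the sketch below. This is just one possible configuration, assuming a PHP opcode/object cache such as APC is available; the cache directory path is an assumption, so adjust it for your install:

<?php
# In LocalSettings.php -- a minimal sketch, not the only way to do it.

# Use the local PHP object cache (e.g. APC) for the main and parser caches.
$wgMainCacheType   = CACHE_ACCEL;
$wgParserCacheType = CACHE_ACCEL;

# Serve fully rendered pages to anonymous visitors straight from disk.
# Assumed path: $IP/cache must exist and be writable by the web server.
$wgUseFileCache   = true;
$wgCacheDirectory = "$IP/cache";

# Cache the sidebar so it isn't rebuilt on every page view.
$wgEnableSidebarCache = true;

The file cache in particular helps with crawler traffic, since bots hit pages as anonymous users and those requests never need to touch the parser or the database.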
On Sat, Jan 31, 2009 at 10:02 PM, Philip Beach beachboy4231@gmail.com wrote:
I have already checked the access logs. It appears that Google and Yahoo are indeed generating a lot of traffic. Good idea, Rob; I've been working on this for a while.
Just out of curiosity, what should my robots.txt look like for MediaWiki? Does anything need to be disallowed?
On Sat, Jan 31, 2009 at 8:30 PM, Platonides Platonides@gmail.com wrote:
You should check the access logs to see what is causing the error.
MediaWiki-l mailing list
MediaWiki-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l