Haha, true about the rotting RAM; I'll look into that. I'm not using Wikimedia-style URLs, sadly :( It just didn't happen when the site was first set up, and I can't move it now, for various reasons. All of my files are in the web root, /. However, through an Apache alias, my URLs look like mywiki.com/Pagename.
How would robots.txt look for that? Would I simply drop the preceding /wiki, like this?
User-agent: *
Disallow: /Special:Search
Disallow: /Special:Random
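
And since everything is served from the root, I'm guessing the index.php entry point is crawlable too, so maybe the whole file should look something like this? (Just my guess, please correct me if I'm off:)

User-agent: *
Disallow: /index.php
Disallow: /Special:Search
Disallow: /Special:Random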
Thanks a ton!
On Sun, Feb 1, 2009 at 1:47 AM, Benjamin Lees <emufarmers@gmail.com> wrote:
Assuming you have Wikimedia-style URLs:

User-agent: *
Disallow: /w/
Disallow: /wiki/Special:Search
Disallow: /wiki/Special:Random
Your server will be able to handle a lot more if you set up as much caching as you can: http://www.mediawiki.org/wiki/Manual:Cache. No sense letting all that spare RAM rot. :)
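
For example, a minimal starting point in LocalSettings.php might look something like this (a rough sketch, not tested against your setup; CACHE_ACCEL assumes a PHP accelerator such as APC is installed, and the file cache directory has to be writable by the web server):

# Use the PHP accelerator (e.g. APC) for the object cache
$wgMainCacheType = CACHE_ACCEL;

# Serve fully rendered pages to anonymous visitors straight from disk
$wgUseFileCache = true;
$wgFileCacheDirectory = "$IP/cache";

# Reuse the rendered sidebar between requests
$wgEnableSidebarCache = true;

The file cache alone tends to make the biggest difference for crawler traffic, since bots hit pages anonymously.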
On Sat, Jan 31, 2009 at 10:02 PM, Philip Beach <beachboy4231@gmail.com> wrote:
I have already checked the access logs. It appears that Google and Yahoo are indeed generating a lot of traffic. Good idea, Rob; I've been working on this for a while.
Just out of curiosity, what should my robots.txt look like for MediaWiki? Does anything need to be disallowed?
On Sat, Jan 31, 2009 at 8:30 PM, Platonides <Platonides@gmail.com> wrote:
You should check the access logs to see what is causing the error.
MediaWiki-l mailing list
MediaWiki-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l