Emufarmers Sangly wrote:
So far I've tried (a) to mimic the setup of the Wikimedia wikis with a similar robots.txt,
Oh? Have you blocked /w/? How about special pages?
I hope so: "User-agent: * Disallow: /w/"
The special-page sections are taken directly from Wikimedia's directives.
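Roughly, the relevant part looks like this (the special-page lines below are only an illustrative subset; the full list is copied from Wikimedia's robots.txt, with the /wiki/ and /w/ paths matching my URL layout):

  User-agent: *
  Disallow: /w/
  Disallow: /wiki/Special:
  Disallow: /wiki/Special:Search
  Disallow: /wiki/Special:Random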
I don't see why file caching would really help here. Other forms of caching, however, such as eAccelerator or APC bytecode caching, and/or memcached, could dramatically improve your wiki's ability to handle traffic.
The general idea was to relieve MediaWiki of regenerating pages dynamically on every request, freeing server resources for other tasks. I never thought I'd need something like APC for my (small) site, but I'll check it out.
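For anyone reading along: as far as I understand it, wiring this up in LocalSettings.php would look roughly like the sketch below. This is an untested assumption on my part, not something I'm running yet; the memcached lines are shown as a commented-out alternative, and the file-cache settings are optional on top of that.

  # Use the opcode cache's shared memory (APC/eAccelerator) as
  # MediaWiki's object cache:
  $wgMainCacheType = CACHE_ACCEL;

  # Alternative: use memcached instead (address/port are examples):
  # $wgMainCacheType    = CACHE_MEMCACHED;
  # $wgMemCachedServers = array( '127.0.0.1:11211' );

  # Optionally serve pre-rendered HTML to anonymous visitors:
  $wgUseFileCache       = true;
  $wgFileCacheDirectory = "$IP/cache";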
Thanks! -asb