On Dec 22, 2007 1:30 PM, Agon S. Buchholz <asb@kefk.net> wrote:
> I've tried so far (a) to mimic the setup of the Wikimedia wikis with a similar robots.txt,
Oh? Have you blocked /w/? How about special pages?
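Wikimedia's robots.txt blocks both. A minimal sketch, assuming a default install (the /w/ script path and the Special: page prefix are assumptions; adjust them to your URL layout):

    User-agent: *
    Disallow: /w/
    Disallow: /wiki/Special:

That keeps crawlers off index.php action URLs (edit, history, diff) and the expensive special pages, which is usually where bot traffic hammers the database.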
> and (b) to enable file caching in MediaWiki; this might have helped a bit, but it didn't solve the basic problem (I have to stop and restart mysqld every few hours).
I don't see why file caching would really help here. Other forms of caching, however, such as eAccelerator or APC bytecode caching, and/or memcached, could dramatically improve your wiki's ability to handle traffic.
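As a rough sketch of the memcached route in LocalSettings.php (the host/port and the single local daemon are assumptions; with APC or eAccelerator installed you'd use CACHE_ACCEL instead):

    # Keep MediaWiki's object and parser caches in memcached
    # instead of hitting the database
    $wgMainCacheType = CACHE_MEMCACHED;
    $wgParserCacheType = CACHE_MEMCACHED;
    $wgMemCachedServers = array( '127.0.0.1:11211' );  # assumed local instance

    # Or, with APC/eAccelerator providing shared memory:
    # $wgMainCacheType = CACHE_ACCEL;

Either way the parser cache stops hammering MySQL on every page view, which sounds like exactly the load that's killing your mysqld.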