On Dec 23, 2007 12:44 AM, Agon S. Buchholz <asb@kefk.net> wrote:
> Emufarmers Sangly wrote:
>> I don't see why file caching would really help here. Other forms of
>> caching, however, such as eAccelerator or APC bytecode caching,
>> and/or memcached, could dramatically improve your wiki's ability to
>> handle traffic.
>
> The general idea was to relieve MediaWiki from regenerating dynamic
> pages over and over, freeing server resources for other tasks. I
> never thought I'd need something like APC for my (small) site, but
> I'll check it out.
I believe file caching is just for static pages; I don't think it would
be of any help here, but PHP opcode caching should improve things.
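If you want to try it, a minimal sketch of the relevant
LocalSettings.php lines (assuming APC or eAccelerator is actually
installed; the memcached host/port below are just placeholders):

    # Use the PHP accelerator's shared memory for MediaWiki's
    # main object cache:
    $wgMainCacheType = CACHE_ACCEL;

    # Or, if you run memcached instead:
    # $wgMainCacheType    = CACHE_MEMCACHED;
    # $wgMemCachedServers = array( '127.0.0.1:11211' );

Note that opcode caching itself (the compiled PHP bytecode) kicks in
automatically once APC/eAccelerator is loaded; the lines above just
additionally let MediaWiki use it as an object cache.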
You might also want to check Apache's logs: Is Googlebot actually hitting your site an inordinate number of times, or is your site just choking on requests to a few particular pages?
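Something like this against the access log should give you a quick
answer (the log path will vary with your setup, and the awk field
assumes the default common/combined log format):

    # How many requests came from Googlebot?
    grep -c Googlebot /var/log/apache2/access.log

    # Which URLs are requested most often?
    awk '{print $7}' /var/log/apache2/access.log \
        | sort | uniq -c | sort -rn | head -20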