Hi Brion,
Yep, I've done that. I've modified DefaultSettings.php:
$wgUseSquid = true;
$wgUseESI = false;
$wgInternalServer = $wgServer;
$wgSquidMaxage = 18000;
$wgSquidServers = array('10.234.169.202');
$wgSquidServersNoPurge = array();
$wgMaxSquidPurgeTitles = 400;
$wgSquidFastPurge = true;
Squid and Apache are running on different servers, but that shouldn't matter, right? If you were referring to something further, can you point me in the right direction? What do you mean by "have Squid rewrite that for downstream"? Is there any more documentation on this besides the page on meta.wikimedia.org?
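For what it's worth, this is roughly how I've been planning to watch what comes back through the Squid box (an untested sketch; the page path and Host value below are just examples) -- the X-Cache header should say whether Squid answered from cache (HIT) or went to Apache (MISS):

<?php
// Send a HEAD request through the Squid server and dump the raw
// response headers. The path and Host are placeholders for our wiki.
$fp = fsockopen('10.234.169.202', 80, $errno, $errstr, 5);
if ($fp) {
    fwrite($fp, "HEAD /index.php/Main_Page HTTP/1.0\r\n" .
                "Host: www.example.com\r\n\r\n");
    while (!feof($fp)) {
        echo fgets($fp, 256);  // headers come back line by line
    }
    fclose($fp);
} else {
    echo "connect failed: $errstr ($errno)\n";
}
?>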
Thanks, travis
Travis Derouin wrote:
HTTP/1.1 200 OK
Date: Thu, 01 Dec 2005 19:31:03 GMT
Server: Apache/2.0.46 (Red Hat)
X-Powered-By: PHP/4.3.10
Content-language: en
Vary: Accept-Encoding,Cookie
Cache-Control: s-maxage=18000, must-revalidate, max-age=0
You need to turn on squid mode, so that MediaWiki sends out a different max-age for cacheable pages, and then have Squid rewrite that for downstream. Then Squid won't have to talk to Apache at all for cache hits from anonymous visitors with no session cookies.
Otherwise Squid still has to hit Apache/MediaWiki on every request to check for 304s, which is a lot slower than returning directly from its cache.
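Roughly, what squid mode does to the headers is something like this (a simplified sketch, not MediaWiki's actual code; $isCacheableRequest is a hypothetical stand-in for the anonymous-view-with-no-session-cookie check):

<?php
// With squid mode on, cacheable anonymous page views get a long
// shared max-age (s-maxage) that Squid honors, while max-age=0
// tells browsers to revalidate on every view.
if ($wgUseSquid && $isCacheableRequest) {
    header("Cache-Control: s-maxage=$wgSquidMaxage, must-revalidate, max-age=0");
} else {
    // Logged-in or otherwise uncacheable views stay private.
    header('Cache-Control: private, must-revalidate, max-age=0');
}
?>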
But tailing the logs and counting today's requests for a popular page shows that Apache is returning a 200 for that page just as many times as it is being requested from Squid, so Squid isn't serving any cached versions of the page.
If Squid were working properly and the page hasn't changed, shouldn't Apache be returning a 304 Not Modified response instead?
Probably should be; you might want to investigate that.
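The handshake to look for is roughly this (a bare-bones sketch, not MediaWiki's actual code; $pageTouched, the page's last-modified Unix timestamp, is hypothetical):

<?php
// If the client (or Squid) sends If-Modified-Since and the page
// hasn't been touched since then, answer 304 with no body;
// otherwise fall through and render the full page as a 200.
$lastMod = gmdate('D, d M Y H:i:s', $pageTouched) . ' GMT';
header("Last-Modified: $lastMod");
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
    && strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $pageTouched) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}
// ... render and output the full page here (200 OK) ...
?>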
-- brion vibber (brion @ pobox.com)