Hi,
We recently added a Squid cache for our wiki, and for the most part it's working out quite well.
However, it doesn't look like popular articles are being cached properly. I've followed the steps listed here: http://meta.wikimedia.org/wiki/Squid_caching. The only change is that we have a redirect script set up to redirect hostnames, as described here: http://wiki.ehow.com/Implement-Redirects-in-Squid, though removing it doesn't seem to help.
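For reference, the MediaWiki side is set up roughly as that meta page describes; the lines below are just an illustration of what we have (the server address is a placeholder, and the 18000 matches the s-maxage we see in the headers):

    $wgUseSquid = true;
    $wgSquidServers = array( '127.0.0.1' );
    $wgSquidMaxage = 18000;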
What we are seeing is that Squid isn't caching articles, but keeps requesting the same article from Apache. It does look like the headers are being set correctly by MediaWiki:
HTTP/1.1 200 OK
Date: Thu, 01 Dec 2005 19:31:03 GMT
Server: Apache/2.0.46 (Red Hat)
X-Powered-By: PHP/4.3.10
Content-language: en
Vary: Accept-Encoding,Cookie
Cache-Control: s-maxage=18000, must-revalidate, max-age=0
Last-modified: Thu, 24 Nov 2005 18:32:18 GMT
Connection: close
Transfer-Encoding: chunked
Content-Type: text/html; charset=utf-8
But tailing the logs and counting how many times a popular page has been requested today, Apache is returning a 200 for that page just as many times as it is requested through Squid, so Squid isn't serving any cached copies of it.
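In case it matters, this is roughly how I've been counting on the Squid side (the article name and log path are only examples; the path is the Red Hat default):

    # Tally Squid result codes (TCP_HIT vs TCP_MISS) for one article
    grep 'Main_Page' /var/log/squid/access.log | awk '{print $4}' | sort | uniq -c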
If Squid were working properly and the page hasn't changed, shouldn't it be returning a 304 Not Modified response instead? Should the Expires header be set as well?
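One check I was planning to try is fetching the same page twice straight through Squid and comparing the X-Cache header it adds (the hostname and page below are placeholders, not our real ones):

    # If Squid stored the first response, the second should show "X-Cache: HIT"
    curl -s -D - -o /dev/null http://wiki.example.com/index.php/Main_Page
    curl -s -D - -o /dev/null http://wiki.example.com/index.php/Main_Page

Does that sound like a reasonable way to tell whether Squid is actually storing anything?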
Any ideas on how this can be fixed, or how to find out what's going on?
Thanks, Travis