[Mediawiki-l] 100% CPU runaway with filecache in 1.3.0beta6

Rene Pijlman rene at lab.applinet.nl
Sun Aug 8 09:57:13 UTC 2004


Brion Vibber:
>Rene Pijlman:
>> When I enable the file cache and request a page from Mediawiki,
>> the browser is waiting for the request forever. On the server
>> the page has been created in the cache. It's compressed and the
>> content looks fine (zcat'ed). There's an Apache process
>> continuously running at almost 100% CPU.
>
>Everything seems to work fine on my main test machine (Mac OS X 10.3.4, 
>PHP 4.3.2), but I can confirm this phenomenon on Debian Woody.
>
>(Side note: the file cache doesn't interact well with output-buffered 
>gzipping. Comment out the line that sets that near the top of 
>LocalSettings.php; unfortunately that doesn't solve this problem.)
>
>The output is being written out to the cache file *and sent to the 
>client* but the connection hangs there. I'm not sure why yet...

I noticed that the 100% CPU usage starts after index.php has finished,
and after the output callback has returned.

My guess is that the call to header() in the output callback
saveToFileCache() is not safe: it writes to the buffer that the
callback itself is still processing.

    if( $this->useGzip() ) {
        if( wfClientAcceptsGzip() ) {
            header( 'Content-Encoding: gzip' );

Perhaps this confuses PHP. Also, I suspect this header never actually
makes it into the response headers, so the gzipped data may be
confusing something further down the line.
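
To illustrate the pattern I mean, here is a minimal standalone sketch
(not MediaWiki's actual code; the function name, cache path and
gzencode() usage are made up for the example): an ob_start() callback
that calls header() while PHP is flushing the very buffer the callback
was handed.

    <?php
    function saveToCacheExample( $buffer ) {
        // Hypothetical stand-in for saveToFileCache().
        $gzipped = gzencode( $buffer );

        // Write the compressed page to a cache file.
        $fh = fopen( '/tmp/cache-example.html.gz', 'wb' );
        fwrite( $fh, $gzipped );
        fclose( $fh );

        // The suspect step: calling header() from inside the output
        // callback, which runs while PHP is flushing this very buffer.
        header( 'Content-Encoding: gzip' );

        // Return the compressed data to be sent to the client.
        return $gzipped;
    }

    ob_start( 'saveToCacheExample' );
    echo "<html><body>page body</body></html>\n";
    // The buffer is flushed at script end; the callback fires then.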

-- 
Regards / Groeten,  http://www.leren.nl
René Pijlman        http://www.applinet.nl