Line 252 of wiki/includes/HistoryBlob.php reads: $obj->uncompress();
And $obj is unserialized (and sometimes uncompressed) from the text table.
$obj in the backtrace you gave is a HistoryBlobStub; and as you point out, HistoryBlobStub does not have an uncompress function. The only thing with an uncompress function is ConcatenatedGzipHistoryBlob.
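For anyone following along, the failing call is inside HistoryBlobStub::getText(). Condensed from memory (so treat this as a paraphrase of the idea, not the exact source), it does roughly this:

---------------------------------------------
// Condensed paraphrase of HistoryBlobStub::getText(), not the exact source.
function getText() {
	global $wgBlobCache;
	if( isset( $wgBlobCache[$this->mOldId] ) ) {
		$obj = $wgBlobCache[$this->mOldId];
	} else {
		// Fetch the text row this stub points at...
		$dbr = wfGetDB( DB_SLAVE );
		$row = $dbr->selectRow( 'text', array( 'old_flags', 'old_text' ),
			array( 'old_id' => $this->mOldId ) );
		// ...and unserialize it. This is *expected* to yield a
		// ConcatenatedGzipHistoryBlob, but nothing enforces that.
		$obj = in_array( 'gzip', explode( ',', $row->old_flags ) )
			? unserialize( gzinflate( $row->old_text ) )
			: unserialize( $row->old_text );
		// Save this item for reference; if pulling many
		// items in a row we'll likely use it again.
		$obj->uncompress(); // line 252: fatal when $obj is another stub
		$wgBlobCache = array( $this->mOldId => $obj );
	}
	return $obj->getItem( $this->mHash );
}
---------------------------------------------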
So as a stab in the dark, if you're feeling lucky, you could try this as an ugly hack and see what happens:
---------------------------------------------
  // Save this item for reference; if pulling many
  // items in a row we'll likely use it again.
- $obj->uncompress();
+ if( $obj instanceof ConcatenatedGzipHistoryBlob ) {
+     $obj->uncompress();
+ }
  $wgBlobCache = array( $this->mOldId => $obj );
---------------------------------------------
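Note that only guards the eager uncompress. If the row the stub points at genuinely doesn't hold a ConcatenatedGzipHistoryBlob, the getItem() call at the end of getText() will still fail, but the guard should at least get you past the cache-priming step and make it clearer that the stub's reference itself is what's broken.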
All the best, Nick.
-----Original Message-----
From: wikitech-l-bounces@lists.wikimedia.org
[mailto:wikitech-l-bounces@lists.wikimedia.org] On Behalf Of Travis Derouin
Sent: Wednesday, 31 January 2007 10:42 AM
To: Wikimedia developers
Subject: Re: [Wikitech-l] table text is full
Great. Thanks for the help.
So far so good. I fixed the table by increasing max_rows on the table and the file can now grow past 4GB. I've enabled $wgCompressRevisions and run compressOld.php, just waiting for the nightly backup to optimize the table and it should be ok.
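For the archives, in case anyone hits the same 4GB wall: the table fix was an ALTER TABLE along the lines of ALTER TABLE text MAX_ROWS=1000000000 AVG_ROW_LENGTH=1024 (illustrative numbers; pick your own per the MySQL manual page Brion linked below), and the MediaWiki side is just the one setting:

---------------------------------------------
# LocalSettings.php
$wgCompressRevisions = true; // gzip revisions as they are saved
---------------------------------------------

plus the compressOld.php run for everything already in the table.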
We're seeing this error now while grabbing some old revisions from the history of a page; any ideas? It only happens on some articles, and I did see some debugging output when I ran compressOld.php. It doesn't appear HistoryBlobStub has an uncompress function:
Fatal error: Call to undefined method HistoryBlobStub::uncompress() in wiki/includes/HistoryBlob.php on line 252
Backtrace:
* HistoryBlob.php line 257 calls wfBacktrace()
* Revision.php line 533 calls HistoryBlobStub::getText()
* Revision.php line 677 calls Revision::getRevisionText()
* Revision.php line 435 calls Revision::loadText()
* Article.php line 522 calls Revision::getRawText()
* Article.php line 384 calls Article::fetchContent()
* Article.php line 192 calls Article::loadContent()
* Article.php line 863 calls Article::getContent()
* Wiki.php line 337 calls Article::view()
* Wiki.php line 50 calls MediaWiki::performAction()
* index.php line 123 calls MediaWiki::initialize()
$flags are:
Array ( [0] => object [1] => utf-8 )
$obj before being unserialized is:
O:15:"HistoryBlobStub":3:{s:6:"mOldId";s:6:"246631";s:5:"mHash";s:32:"c87a9d76bbd59c9a1218c5620fc889dd";s:4:"mRef";s:6:"246631";}
Any help would be appreciated,
Thanks, Travis
On 1/27/07, Brion Vibber <brion@pobox.com> wrote:
Travis Derouin wrote:
We have reached the maximum table size limit of 4GB for our text table - what's the best way around this, or to fix it?
Well, the best is probably to use InnoDB tables instead of MyISAM. :)
But see the general documentation which shows a number of possible issues and workarounds: http://dev.mysql.com/doc/refman/5.0/en/full-table.html
Also consider using compression in MediaWiki to save space ($wgCompressRevisions to gzip newly saved revisions, and/or run maintenance/storage/compressOld.php to perform batch compression.) But fix the table first!
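To make the InnoDB suggestion above concrete, the conversion is a single statement; here's a sketch assuming the default table name and no $wgDBprefix (you can just as well paste the ALTER into the mysql client). Back up first, and expect it to take a while on a 4GB table:

---------------------------------------------
<?php
// Sketch: convert the text table to InnoDB, which has no 4GB cap.
// Placeholder credentials -- adjust for your wiki.
$db = mysql_connect( 'localhost', 'wikiuser', 'secret' );
mysql_select_db( 'wikidb', $db );
mysql_query( 'ALTER TABLE text ENGINE=InnoDB', $db );
---------------------------------------------

(The MAX_ROWS route in the manual page above works too, if you want to stay on MyISAM.)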
--
brion vibber (brion @ pobox.com / brion @ wikimedia.org)