[Mediawiki-l] Memory exhausted?!

Tim Starling tstarling at wikimedia.org
Tue May 1 20:33:18 UTC 2007


Uwe Baumbach wrote:
> Hi there,
> 
> running MW 1.9.3 on Debian Sarge with PHP 5.2.0, from time to time we
> get an error message like this:
> 
> Fatal error: Allowed memory size of 20971520 bytes exhausted (tried to
> allocate 7680 bytes) in
> /data/wiki/wiki-commons/includes/SkinTemplate.php on line 44
> 
> At first it happened only on special pages like recent changes, or
> article searches with a large number of hits.
> Lately we have been getting this message more often (but randomly),
> even on "normal" pages.
> 
> We have some extensions and hooks, and we assume there is a
> programming weakness "somewhere": memory might be allocated and never
> released, or recursive procedures might allocate memory needlessly.
> 
> Now our questions:
> 
> 1. How can we pinpoint the source of the problem? E.g. is there a PHP
> function to dump all allocated variables and constants to a file when
> such an error arises?

You can try var_dump($GLOBALS), but it doesn't dump static class members,
and it crashes on some versions of PHP. Also there's no way to intercept a
fatal error in PHP, so the best you can do is monitor the memory usage
with memory_get_usage() and output some debugging information if it gets
too high. MediaWiki does have a memory profiling facility which will show
you memory usage deltas from function calls.
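For example (an untested sketch; the helper name, threshold and log
path are just placeholders), something like this could be called from a
hook or from your extension entry points:

    function wfCheckMemory( $threshold = 15000000 ) {
        $usage = memory_get_usage();
        if ( $usage > $threshold ) {
            // Dump only the global symbol names; var_dump() of the
            // full $GLOBALS can itself use a lot of memory, or crash.
            ob_start();
            var_dump( array_keys( $GLOBALS ) );
            $dump = ob_get_clean();
            error_log( "High memory usage: $usage bytes\n$dump", 3,
                '/tmp/memory-debug.log' );
        }
    }

Keep the threshold well below your memory_limit so the dump itself
doesn't push you over the edge.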

> 2. How much memory is reserved for one WP session/request? As you can
> see, we run with 20 MB. We thought that would be a lot?!

PHP's hashtables (arrays, objects, symbol tables, etc.) are fast but
incredibly inefficient in memory. 20MB is not much; on Wikimedia we have
the limit set to 100MB and we still see errors occasionally. The opcode
arrays for a big application like MediaWiki are huge too: APC on
Wikimedia uses about 40MB of shared memory.
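If you want to see the overhead for yourself, a quick measurement like
this is enough (illustrative only; the figures vary by PHP version and
platform):

    $before = memory_get_usage();
    $a = array();
    for ( $i = 0; $i < 100000; $i++ ) {
        $a[$i] = $i;
    }
    // Bytes of memory consumed per integer array element
    printf( "%.1f bytes per element\n",
        ( memory_get_usage() - $before ) / 100000 );

On PHP 5.x this tends to come out at many tens of bytes per integer
element, an order of magnitude more than the raw data it stores.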

Note that in PHP 5.2.0, a change was made to the definition of memory
usage, which means that reported usage appears higher. When Wikimedia
finally upgrades to 5.2.x, we may have to raise our threshold above
100MB.
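The same release added an optional argument to memory_get_usage(), so
you can compare the two accounting modes directly:

    echo memory_get_usage(), "\n";       // memory handed out by PHP's allocator
    echo memory_get_usage( true ), "\n"; // total memory allocated from the system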

We'd be interested to know where the memory goes and whether we can do
anything about it, but raising the limit would be a reasonable action to
take in the meantime.
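For instance, putting something like this near the top of
LocalSettings.php raises the limit for MediaWiki only (50M is just an
example figure, not a recommendation):

    ini_set( 'memory_limit', '50M' );

The equivalent server-wide change is a memory_limit = 50M line in
php.ini.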

-- Tim Starling



