I have just gotten the following error message:
Fatal error: Allowed memory size of 16777216 bytes exhausted (tried to allocate 199809 bytes) in /home/wiki/wikiSanMar/includes/Parser.php on line 968
This happened after creating/editing 9 new documents, mostly tables ranging from 12 columns x 20 rows to 12 columns x 500 rows.
Ideas on why this happened and how to prevent it in the future would be welcome.
Thanks
DSig David Tod Sigafoos | SANMAR Corporation
On 07/08/07, Dave Sigafoos davesigafoos@sanmar.com wrote:
Fatal error: Allowed memory size of 16777216 bytes exhausted (tried to allocate 199809 bytes) in /home/wiki/wikiSanMar/includes/Parser.php on line 968
This happened after creating/editing 9 new documents, mostly tables ranging from 12 columns x 20 rows to 12 columns x 500 rows.
Ideas on why this happened and how to prevent it in the future would be welcome.
Large documents, perhaps? Parsing is one of our most intensive (resources, time) operations, consisting of a large number of regular expressions working in some sort of organised chaos; a haphazard harmony of edge cases, so it's quite plausible that parsing a large amount of wiki text can lead to memory limit exhaustion if you're doing a large page with a lowish limit.
There is a stone tablet somewhere in Santa Ana, which states that a man shall go insane before he truly understands the way of the Parser.
Rob Church
".. consisting of a large number of regular expressions .."
I knew regular expressions would be my undoing. Nothing sneakier than calling something 'regular' when it is anything but <G>
".. a haphazard harmony of edge cases, so it's quite plausible that parsing a large amount of wiki text can lead to memory limit exhaustion if you're doing a large page with a lowish limit.."
Is there a way to up the 'lowish limit'?
"There is a stone tablet somewhere in Santa Ana, which states that a man shall go insane before he truly understands the way of the Parser."
So .. which of you MW guys is the insane one .. I would hate to guess <G>
Thanks
DSig David Tod Sigafoos | SANMAR Corporation PICK Guy 206-770-5585 davesigafoos@sanmar.com
Have a look at your php.ini file (it lives in a directory named php on a Windows installation and under etc on a Linux machine; the exact location depends on your web server/PHP installation, afaik) and search for the parameter memory_limit. Setting it to a higher value solved a similar problem for me (a minimal sketch follows after this message).
Btw.: Does anybody know if there is an optimal size for that parameter (perhaps in relation to the system memory)?
Good luck!
Katharina
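A minimal sketch of the change Katharina describes. The 64M value and the LocalSettings.php alternative are illustrative assumptions, not recommendations; adjust them for your own installation.

    ; php.ini -- raise PHP's per-request memory ceiling (example value only)
    memory_limit = 64M

If you cannot edit php.ini, a common alternative (assuming your host does not lock the setting) is to raise it at runtime from MediaWiki's LocalSettings.php:

    // LocalSettings.php -- hypothetical runtime override, same effect per request
    ini_set( 'memory_limit', '64M' );

If PHP runs as an Apache module, reload the web server after changing php.ini so the new value takes effect.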
Thanks a lot. That appears to have taken care of it. It seems my 'localhost' is set to 128 MB while the production server was at 16 MB <sigh>. Probably not too many pages that will break the 128 <G> (a quick way to check the effective value on each server is sketched after this message).
Thanks again
DSig David Tod Sigafoos | SANMAR Corporation
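For anyone comparing environments the way Dave did, here is a quick way to confirm which limit each server actually enforces. Note that the command line and the web server can read different php.ini files, so a throwaway phpinfo()-style page is the safer check for web requests; the file name check.php below is just a hypothetical example.

    php -r "echo ini_get('memory_limit'), PHP_EOL;"

    <?php
    // check.php -- temporary page: upload, view in a browser, then delete it
    echo ini_get( 'memory_limit' );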