What were you doing? I don't think MediaWiki should be accessing
10,000 pages at once.
If this is about the RefreshLinks script, there is already a thread on that.
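On the memory question itself: the limit comes from PHP's `memory_limit` directive, which can be raised globally in php.ini or per script. A minimal sketch (the 128M value and placement are just examples, not a recommendation):

```php
<?php
// Per-script override; affects only the current request/invocation.
ini_set( 'memory_limit', '128M' );

// Or globally, in php.ini:
//   memory_limit = 128M
```

That said, if usage grows steadily with the number of pages processed, raising the limit only delays the error. Unsetting large variables between pages (`unset( $page );`) can help when the growth is in the calling loop, but not when the leak is inside the parser itself.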
On Thu, 9 Dec 2004 22:03:06 +0100, Baeckeroot alain
<al2.baeckeroot(a)laposte.net> wrote:
Hello
Fatal error: Allowed memory size of 67108864 bytes exhausted
(tried to allocate 224 bytes)
in /Big/Wikipedia/mediawiki-1.3.8/includes/Parser.php on line 787
How much memory should I allocate to PHP?
The documentation on mediawiki.org only explains this error for an
8 MB limit, not 64 MB, and doesn't give any other clue.
I would think 64 MB is enough to find the links in a page of less than 200 kB?
The script probably has a memory leak, because the error is not tied
to any particular large page, but occurs after roughly 10,000 pages.
The database holds 100,000 articles, and I don't want to allocate 640 MB to PHP!
Thank you for any help debugging this.
Regards
Alain
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)Wikimedia.org
http://mail.wikipedia.org/mailman/listinfo/mediawiki-l
--
-------------------------------------------------------------------
http://endeavour.zapto.org/astro73/
Thank you to JosephM for inviting me to Gmail!