Hello,
I have made some profiling while loading my MediaWiki main page ( http://www.twenkill.net/wiki/ ), which uses the PHPTal MonoBook skin and is set to use the French language. For profiling I used "Advanced php profiler".
The first trouble I found is that loading the page uses up to 8,417,176 bytes (about 8 MB) of memory! It looks like MediaWiki is including every single class script even when it will not use them.
I noticed some possible candidates for fixing:
MEMORY: 1/ Skins: I use the PHPTal MonoBook skin as the default, but the software also loads the three other skin classes: SkinStandard.php, SkinNostalgia.php and SkinCologneBlue.php. They should only be loaded if they are the site default or the current user's skin. That would save close to 200 KB of memory.
2/ Feed: The Feed class is loaded in Skin.php only to retrieve the available feeds (through an array in Feed.php); that call should only be made for syndicated pages. Similarly, the LogPage, DifferenceEngine and SearchEngine classes aren't useful for ordinary articles. That would save about 30 KB, 340 KB and 200 KB respectively.
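A minimal sketch of the on-demand loading idea, assuming a hypothetical resolver function (skinFile() and the fallback file name are illustrative, not MediaWiki's actual loader):

```php
<?php
// Resolve a skin name to a single class file so only that file is included,
// instead of require-ing every Skin*.php at startup. The file names mirror
// the ones mentioned above; skinFile() itself is a hypothetical helper.
function skinFile( $skinName ) {
    $skins = array(
        'standard'    => 'SkinStandard.php',
        'nostalgia'   => 'SkinNostalgia.php',
        'cologneblue' => 'SkinCologneBlue.php',
    );
    // Unknown names fall back to the default (PHPTal MonoBook) skin.
    return isset( $skins[$skinName] ) ? $skins[$skinName] : 'SkinPHPTal.php';
}

// At request time only the selected skin's file would be included, e.g.:
//   require_once( skinFile( $userSkinOption ) );
```

This way the three unused skin classes are never even parsed for users of the default skin.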
CPU: 1/ It looks like LanguageFr::ucfirst() uses about 20% of the total script time, for the sole purpose of uppercasing the first character of a given string by calling strtr(). Maybe the strings could be declared as UTF-8 and a built-in PHP function such as mb_strtoupper() called instead? I think Tim Starling looked a bit at this.
2/ wfProfileOut() uses 5% even though it's an empty function. Using single quotes instead of double quotes seems to have improved it a bit :)
3/ do_html_entity_decode() uses close to 4%. Maybe it could be replaced by the built-in html_entity_decode(), which appears to do the same task (PHP >= 4.3.0).
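To illustrate points 1/ and 3/, here is a rough sketch of what the replacements could look like (utf8Ucfirst() is a hypothetical helper that assumes the mbstring extension is available; this is not the existing LanguageFr code):

```php
<?php
// Uppercase only the first character of a UTF-8 string using mbstring,
// instead of a per-language strtr() conversion table.
function utf8Ucfirst( $s ) {
    if ( $s === '' ) {
        return $s;
    }
    $first = mb_substr( $s, 0, 1, 'UTF-8' );
    $rest  = mb_substr( $s, 1, mb_strlen( $s, 'UTF-8' ), 'UTF-8' );
    return mb_strtoupper( $first, 'UTF-8' ) . $rest;
}

echo utf8Ucfirst( 'été' ), "\n";  // prints "Été"

// For point 3/, the built-in function could replace do_html_entity_decode():
echo html_entity_decode( '&eacute;', ENT_QUOTES, 'UTF-8' ), "\n";  // prints "é"
```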
That's all for tonight. For the memory issue, it looks like a fix would be to instantiate a User object first (to get its options), then an Article object (to see whether it's a feed / special page), and only then create the Skin.
Ashar-
Hello,
I have made some profiling while loading my MediaWiki main page ( http://www.twenkill.net/wiki/ ), which uses the PHPTal MonoBook skin and is set to use the French language. For profiling I used "Advanced php profiler".
Let's not forget intermittent slow queries; we still have a few of them:
- Watchlists are still quite slow. I haven't looked into how wl_cache works, but it doesn't seem to use cached data when generating new watchlists (i.e. look up only the changes that aren't already in the cache).
- Ancientpages bogged down the server recently; it doesn't seem to be using the index properly. We may want to add it to the query cache for the time being.
- Special:Contributions gets slow when people start digging with large offsets.
- Special:Undelete appears to have no paging, which makes it slow when the archive is large.
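For the large-offset problem on Special:Contributions, one option would be "seek" paging instead of OFFSET. A sketch under assumed names (contribsQuery() is hypothetical, and the table/column names are illustrative, not necessarily the live schema):

```php
<?php
// Build a contributions query that continues from the last timestamp of the
// previous page instead of using a large OFFSET, which forces the database
// to scan and discard all the skipped rows. Names here are illustrative.
function contribsQuery( $userId, $beforeTimestamp, $limit ) {
    return "SELECT * FROM revision" .
           " WHERE rev_user = " . intval( $userId ) .
           " AND rev_timestamp < '" . addslashes( $beforeTimestamp ) . "'" .
           " ORDER BY rev_timestamp DESC" .
           " LIMIT " . intval( $limit );
}
```

With an index on the (user, timestamp) pair, each page costs roughly the same no matter how deep the reader digs. The same idea would give Special:Undelete some paging.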
Regards,
Erik
Ashar Voultoiz wrote:
Hello,
I have made some profiling while loading my MediaWiki main page ( http://www.twenkill.net/wiki/ ), which uses the PHPTal MonoBook skin and is set to use the French language. For profiling I used "Advanced php profiler".
<snip>
CPU: 1/ It looks like LanguageFr::ucfirst() uses about 20% of the total script time, for the sole purpose of uppercasing the first character of a given string by calling strtr(). Maybe the strings could be declared as UTF-8 and a built-in PHP function such as mb_strtoupper() called instead? I think Tim Starling looked a bit at this.
2/ wfProfileOut() uses 5% even though it's an empty function. Using single quotes instead of double quotes seems to have improved it a bit :)
3/ do_html_entity_decode() uses close to 4%. Maybe it could be replaced by the built-in html_entity_decode(), which appears to do the same task (PHP >= 4.3.0).
I profiled a bit more with the language set to en and the page http://www.twenkill.net/wiki/index.php/Parser_Benchmark . With profiling disabled, the wfProfileIn() and wfProfileOut() functions still use about 7% of CPU time. Each call is cheap on its own, but both are called about 800 times.
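A sketch of one way to avoid that overhead: guard the calls behind a constant so that PHP only tests a boolean when profiling is off (the constant and function names here are illustrative, not the actual MediaWiki ones):

```php
<?php
// Checking a constant is much cheaper than making ~800 calls to an empty
// function on every request. WF_PROFILING, the stub functions and
// parseSomething() are all illustrative names only.
define( 'WF_PROFILING', false );

function wfProfileInStub( $fn )  { /* would record the start time of $fn */ }
function wfProfileOutStub( $fn ) { /* would record the end time of $fn */ }

function parseSomething( $text ) {
    if ( WF_PROFILING ) wfProfileInStub( 'parseSomething' );
    $result = strtoupper( $text );  // stand-in for the real work
    if ( WF_PROFILING ) wfProfileOutStub( 'parseSomething' );
    return $result;
}
```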
wikitech-l@lists.wikimedia.org