Tels wrote:
I have written a small extension, modelled closely on the sample extension. Basically, it uses $wgParser->setHook() and then returns the text that should be included in the output as HTML.
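For context, a tag-hook extension of this sort typically looks something like the following. This is a minimal sketch assuming the 1.4-era extension interface; the function names (wfSampleSetup, renderSample) and the <sample> tag are illustrative, not taken from the original post:

```php
<?php
// Register a setup function to run after MediaWiki initialises.
$wgExtensionFunctions[] = 'wfSampleSetup';

function wfSampleSetup() {
    global $wgParser;
    // Register <sample>...</sample>; the callback's return value
    // is substituted into the parser output as HTML.
    $wgParser->setHook( 'sample', 'renderSample' );
}

function renderSample( $input, $argv ) {
    // Whatever is returned here still passes through later parser
    // stages (e.g. doBlockLevels()) as part of the whole page.
    return '<div class="sample">' . htmlspecialchars( $input ) . '</div>';
}
```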
However, for some reason the output of my extension is not put straight into the output page. The parser (or whoever :) looks at it and then checks/transforms it. This creates, IMHO, unnecessary overhead.
Some steps, e.g. doBlockLevels(), do run after extensions are inserted into the output. I'm not sure if there's a good reason for this.
How do I stop whatever gets my extension's output and "mangles" it?
Hack up the parser to work differently... :)
Another small question: how would one go about benchmarking an extension, including the overhead from the wiki? Editing pages via script and measuring how long each submit takes? Has anybody done this before?
Apache comes with a simple benchmarking tool, 'ab'. You can use this to run a bunch of repeat requests and give back some timing information.
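A typical invocation looks like this (the URL and counts are illustrative, assuming a local wiki install at /wiki/):

```shell
# 100 requests, 5 concurrent, against the page that invokes the
# extension; ab prints requests/sec and per-request timing stats.
ab -n 100 -c 5 'http://localhost/wiki/index.php?title=Sandbox'
```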
If you disable the parser cache in 1.4, the page will be re-rendered for each hit and your extension will be called each time.
There's also an internal profiler; turn on the debug log and $wgProfiling, and internal timings will be dumped to the log. Add appropriate wfProfileIn()/wfProfileOut() calls in your functions to include details on them.
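Putting that together, a sketch of the LocalSettings.php switches and the profiling calls might look like this. The setting names match the 1.4-era configuration as I recall it, but the file path and the renderSample function are illustrative assumptions:

```php
<?php
// In LocalSettings.php (settings as of the 1.4 era):
$wgEnableParserCache = false;            // re-render the page on every hit
$wgDebugLogFile = '/tmp/wiki-debug.log'; // illustrative path
$wgProfiling = true;                     // dump timings to the debug log

// In the extension, bracket the work you want timed:
function renderSample( $input, $argv ) {
    wfProfileIn( 'renderSample' );
    $html = '<div class="sample">' . htmlspecialchars( $input ) . '</div>';
    wfProfileOut( 'renderSample' );
    return $html;
}
```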
-- brion vibber (brion @ pobox.com)