Simetrical wrote:
On 11/29/07, Roan Kattouw roan.kattouw@home.nl wrote:
- Add all requested titles to a LinkBatch [1]
- Return a marker like marker-ifexist-0 and store the ifTrue and ifFalse values in an array
- In a ParserAfterTidy hook, run LinkBatch::execute() (checks for existence) and str_replace() all markers
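
A rough sketch of what that suggestion amounts to (LinkBatch, Title and the ParserAfterTidy hook are real MediaWiki APIs; the globals, function names and marker bookkeeping below are invented for illustration):

<?php
// Sketch only: batch all #ifexist existence checks, emit markers, and
// substitute the chosen branch in a ParserAfterTidy hook.

$wgHooks['ParserAfterTidy'][] = 'wfIfexistResolveMarkers';

$wgIfexistBatch   = null;      // LinkBatch of every title seen so far
$wgIfexistMarkers = array();   // marker => array( Title, ifTrue, ifFalse )

// Called by the (hypothetical) #ifexist handler instead of checking
// existence immediately; returns the marker that goes into the output.
function wfIfexistAddMarker( $titleText, $ifTrue, $ifFalse ) {
    global $wgIfexistBatch, $wgIfexistMarkers;

    $title = Title::newFromText( $titleText );
    if ( !$title ) {
        return $ifFalse; // invalid title behaves as non-existent
    }
    if ( $wgIfexistBatch === null ) {
        $wgIfexistBatch = new LinkBatch();
    }
    $wgIfexistBatch->addObj( $title );

    $marker = 'marker-ifexist-' . count( $wgIfexistMarkers );
    $wgIfexistMarkers[$marker] = array( $title, $ifTrue, $ifFalse );
    return $marker;
}

// ParserAfterTidy: one batched query for all titles, then replace markers.
function wfIfexistResolveMarkers( &$parser, &$text ) {
    global $wgIfexistBatch, $wgIfexistMarkers;

    if ( $wgIfexistBatch ) {
        $wgIfexistBatch->execute(); // populates LinkCache in one query
        foreach ( $wgIfexistMarkers as $marker => $info ) {
            list( $title, $ifTrue, $ifFalse ) = $info;
            $text = str_replace( $marker,
                $title->exists() ? $ifTrue : $ifFalse, $text );
        }
        $wgIfexistBatch   = null;
        $wgIfexistMarkers = array();
    }
    return true;
}

That turns N existence checks into a single query, but note what happens to the branches: they are spliced in after parsing, untouched.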
Of course, it can't be ParserAfterTidy: the branches can contain arbitrary wikicode. Instead, the parser must
- (in preprocessor stage) Add each title to a LinkBatch and replace the call with a marker,
- (after preprocessor stage) Resolve the LinkBatch to figure out which path to follow for each,
- (still just after preprocessor stage) Reinvoke the preprocessor on the path that was followed to produce wikitext for the main parsing pass.
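
Very roughly, and with wfIfexistSplitBranches() and wfPreprocess() as invented stand-ins for the real preprocessor (LinkBatch and Title are real classes), the first plan amounts to:

<?php
function wfExpandIfexistDeferred( $wikitext ) {
    // 1) Preprocessor stage: the invented helper replaces each
    //    {{#ifexist:...}} call with a marker and returns the rewritten
    //    text plus marker => array( 'title', 'ifTrue', 'ifFalse' ).
    list( $wikitext, $pending ) = wfIfexistSplitBranches( $wikitext );

    // Queue every title for a single batched existence check.
    $batch  = new LinkBatch();
    $titles = array();
    foreach ( $pending as $marker => $info ) {
        $title = Title::newFromText( $info['title'] );
        if ( $title ) {
            $batch->addObj( $title );
        }
        $titles[$marker] = $title;
    }

    // 2) After the preprocessor stage: one query resolves all of them.
    $batch->execute();

    // 3) Re-invoke the preprocessor, but only on the branch that was
    //    actually taken, then splice the result in over the marker.
    foreach ( $pending as $marker => $info ) {
        $title    = $titles[$marker];
        $branch   = ( $title && $title->exists() )
            ? $info['ifTrue'] : $info['ifFalse'];
        $wikitext = str_replace( $marker, wfPreprocess( $branch ), $wikitext );
    }
    return $wikitext;
}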
Or alternatively,
- (1) (in preprocessor stage) Add each title to a LinkBatch and replace the call with a marker,
- (2) (still in preprocessor stage) Preprocess *both* paths,
- (3) (after preprocessor stage) Resolve the LinkBatch to figure out which path to follow for each,
- (4) (still after preprocessor stage) Insert the already-preprocessed wikitext for that path while discarding the other one,
whichever is more efficient. Nested {{#ifexist:}} calls have to be considered: they'd probably work a lot better with my second plan, with steps (1) and (2) being executed together and recursively, and step (4) being executed recursively later.
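
A schematic sketch of that second plan, with wfPreprocessCollectingIfexist() as an invented stand-in for a preprocessor pass that queues titles and preprocesses both branches as it goes (LinkBatch and Title are real classes):

<?php
function wfExpandIfexistBothBranches( $wikitext ) {
    $batch   = new LinkBatch();
    $pending = array();  // marker => array( Title, preprocessed ifTrue, preprocessed ifFalse )

    // Steps (1)+(2), executed together and recursively: markers produced by
    // nested #ifexist calls end up inside the stored branch texts.
    $text = wfPreprocessCollectingIfexist( $wikitext, $batch, $pending );

    // Step (3): one batched query resolves every existence check at once.
    $batch->execute();

    // Step (4), applied repeatedly so that markers nested inside a chosen
    // branch get substituted too: keep the already-preprocessed text of the
    // branch that was taken and discard the other one.
    do {
        $replaced = 0;
        foreach ( $pending as $marker => $info ) {
            list( $title, $ifTrueText, $ifFalseText ) = $info;
            $chosen = ( $title && $title->exists() ) ? $ifTrueText : $ifFalseText;
            $count  = 0;
            $text   = str_replace( $marker, $chosen, $text, $count );
            $replaced += $count;
        }
    } while ( $replaced > 0 );

    return $text;
}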
Although your suggested implementation is still not quite how I'd do it, I think you're getting the idea of how complicated it is. I told the guy in #mediawiki who wanted to do 4000 #ifexist calls on a page that optimising this case would take days, and that he should just forget it and find some other way to do what he was doing. Unless someone comes up with some really, really valuable applications that can't be done any other way, I'm happier with just having a limit. If you have days to spend on optimisation work, I have some other ideas for where you should be spending them.
-- Tim Starling