On 11/30/07, Simetrical <Simetrical+wikilist@gmail.com> wrote:
> Is there any way, instead, to defer these so they can be done all at
> once? Tim, do you have any thoughts on whether this is reasonable?
> (I would suspect not, but it doesn't hurt to ask.)
I thought of this, but "fix the potential denial of service" was a
slightly bigger issue than "speed up the abuse of a parser function".
> Wow, wow, wow, ParserFunctions has a DB query in a function that can
> potentially be called hundreds of times? Domas would go nuts if he
> knew that...
You should have heard Tim.
> #ifexist should do the following:
>
> - Add all requested titles to a LinkBatch [1]
> - Return a marker like marker-ifexist-0, and store the ifTrue and
>   ifFalse values in an array
> - In a ParserAfterTidy hook, run LinkBatch::execute() (which checks
>   for existence) and str_replace() all markers
Sounds somewhat sensible. Assuming we do want to allow users to make
"does this exist?" queries for more than 100 pages...
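For readers following along, the defer-and-replace idea quoted above can be sketched roughly as follows. This is a minimal Python illustration of the technique, not the actual implementation (which would be PHP inside the ParserFunctions extension, using LinkBatch and the ParserAfterTidy hook); the class name, the in-memory set standing in for the database, and the marker format are all assumptions for the sake of the example.

```python
# Hypothetical sketch of the batched #ifexist proposal: emit markers
# during parsing, resolve all existence checks in one pass afterwards.

class DeferredIfExist:
    """Collects #ifexist calls and answers them with one batched lookup."""

    def __init__(self, existing_pages):
        # existing_pages stands in for the database; in MediaWiki this
        # would be a single batched query via LinkBatch::execute().
        self._existing = set(existing_pages)
        self._pending = []  # one (title, if_true, if_false) per marker

    def ifexist(self, title, if_true, if_false):
        # Called during parsing: instead of querying the DB now,
        # emit a unique placeholder marker and remember the arguments.
        marker = f"\x7fmarker-ifexist-{len(self._pending)}\x7f"
        self._pending.append((title, if_true, if_false))
        return marker

    def finalize(self, text):
        # Called once after parsing (cf. the ParserAfterTidy hook):
        # check existence for the whole batch at once, then replace
        # every marker with its ifTrue or ifFalse value.
        exists = {t for t, _, _ in self._pending} & self._existing
        for i, (title, if_true, if_false) in enumerate(self._pending):
            marker = f"\x7fmarker-ifexist-{i}\x7f"
            text = text.replace(marker,
                                if_true if title in exists else if_false)
        return text
```

The point of the design is that N #ifexist calls cost one batched lookup instead of N individual queries, which is what removes the per-call DB hit complained about above.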
--
Andrew Garrett