On 11/29/07, Roan Kattouw <roan.kattouw(a)home.nl> wrote:
> - Add all requested titles to a LinkBatch [1]
> - Return a marker like marker-ifexist-0 and store the ifTrue and ifFalse
> values in an array
> - In a ParserAfterTidy hook, run LinkBatch::execute() (checks for
> existence) and str_replace() all markers
Ah, of course. Strip markers. That might not play so well with
extensions that want to prematurely unstrip output, but it could
probably be worked out somehow, if someone wanted to try.
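The deferred-marker scheme quoted above can be sketched roughly as follows. This is Python rather than MediaWiki's PHP, and every name here (`IfExistBatcher`, `if_exist`, `finalize`) is illustrative, standing in for the real LinkBatch / ParserAfterTidy machinery:

```python
# Hypothetical sketch of the marker-based batching idea: during parsing,
# each #ifexist call emits a placeholder marker instead of querying the
# database; after parsing, one batched lookup resolves all markers at once.

class IfExistBatcher:
    def __init__(self):
        self.pending = {}   # marker -> (title, if_true, if_false)
        self.titles = set() # all titles to check in one batch

    def if_exist(self, title, if_true, if_false):
        """Called during parsing: defer the existence check, emit a marker."""
        marker = f"marker-ifexist-{len(self.pending)}"
        self.pending[marker] = (title, if_true, if_false)
        self.titles.add(title)
        return marker

    def finalize(self, text, existing_titles):
        """Called once after parsing (cf. a ParserAfterTidy hook):
        one batched existence lookup, then replace every marker."""
        found = self.titles & set(existing_titles)  # stands in for LinkBatch::execute()
        for marker, (title, if_true, if_false) in self.pending.items():
            text = text.replace(marker, if_true if title in found else if_false)
        return text

batcher = IfExistBatcher()
out = "A: " + batcher.if_exist("Foo", "yes", "no") \
    + ", B: " + batcher.if_exist("Bar", "yes", "no")
print(batcher.finalize(out, existing_titles={"Foo"}))  # -> A: yes, B: no
```

The point of the pattern is that the number of database round-trips no longer scales with the number of `#ifexist` calls on the page; the trade-off, as noted above, is that extensions which unstrip output before the final hook runs would see the raw markers.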
On 11/29/07, Andrew Garrett <andrew(a)epstone.net> wrote:
> I thought of this, but "fix the potential denial of service" was a
> slightly bigger issue than "speed up the abuse of a parser function".
You know, to be fair, there are multiple pages *in the software* that
do existence checks for many Titles individually, without LinkBatches,
meaning possibly thousands of queries per page. It's not the *end of
the world*, performance-wise: the queries are all very small, fast,
const queries. But yes, it should be fixed, one way or another, and I
can't argue that this quick fix is worse than nothing.
> Sounds somewhat sensible. Assuming we do want to allow users to make
> "does this exist?" queries for more than 100 pages...
Why would we ever not? There's nothing abusive about it, it's a
perfectly logical and reasonable use of the tool. It just happens to
be somewhat inefficient at present. If someone decides to spend the
time fixing that at some point, it would be nice to be able to remove
the limit.