Bartosz Dziewoński wrote:
It's also why #ifexist is expensive: it needs a separate database query each time it's used, to check for a single page, because it's impossible to determine the list of pages to check in advance.
I'm not sure I understand the impossibility here.
When the expensive parser function count feature was added, I remember this issue being discussed; my recollection is that it seemed possible to batch the #ifexist lookups in much the same way we batch regular internal link lookups against the pagelinks table, but nobody was interested in implementing it at the time.
If the wikitext is parsed/evaluated on page save, I don't see why #ifexist lookups would be impossible to batch. As I understand it, we already rely on the pagelinks table for the #ifexist functionality to work properly (cf. https://phabricator.wikimedia.org/T14019).
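To make the batching idea concrete, here's a minimal sketch in Python. It is not MediaWiki's actual schema or API: the in-memory table, the `page_title` column, and the `batch_exists` helper are all hypothetical stand-ins. The point is only the shape of the optimization: collect every #ifexist target during a parse, then issue one `IN (...)` query instead of one query per invocation.

```python
import sqlite3

# Hypothetical minimal stand-in for MediaWiki's page table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page (page_title TEXT PRIMARY KEY)")
conn.executemany("INSERT INTO page VALUES (?)",
                 [("Main_Page",), ("Sandbox",)])

def batch_exists(titles):
    """Check existence of all collected #ifexist targets in one query,
    rather than issuing a separate query per {{#ifexist:...}} call."""
    placeholders = ",".join("?" * len(titles))
    rows = conn.execute(
        f"SELECT page_title FROM page WHERE page_title IN ({placeholders})",
        titles,
    ).fetchall()
    found = {row[0] for row in rows}
    # Map each requested title to whether it exists.
    return {title: title in found for title in titles}

print(batch_exists(["Main_Page", "Sandbox", "No_Such_Page"]))
```

The same single-round-trip pattern is what the parser already does for ordinary internal links, which is why batching #ifexist doesn't seem impossible on its face.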
MZMcBride