On 10/26/07, Steve Summit <scs@eskimo.com> wrote:
> The question is, how true is it that "almost every very-high-traffic page on Wikipedia is having extreme problems right now". I suspect not, but if so, is it because there are more pages with say, heavy use of the {cite} template, or because templates like {cite} have gotten more complicated, or because template interpolation has somehow gotten slower, or simply because there are more hits and edits being processed every day, such that our headroom is going down?
Well, whatever the problem is, I suspect I know one thing that would fix it: rewriting the parser in C(++). Unfortunately, that's much easier said than done. Even rewriting part of it, though, say replaceVariables, might be a big benefit.
For now it might be best to refine our heuristics for what's slow to render. Currently we use a simple text-length heuristic, but perhaps it would make more sense to incorporate additional criteria: a maximum number of template inclusions? A maximum template nesting depth? It would take testing to see which criteria are actually effective.
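To make the idea concrete, here is a rough sketch (in Python, not the actual PHP parser code) of what such a combined heuristic might look like. The function names and all thresholds here are invented for illustration, not taken from MediaWiki:

```python
# Hypothetical sketch: estimate render cost of wikitext by combining
# text length, template-inclusion count, and maximum template nesting
# depth. All names and thresholds below are made up for illustration.

def template_stats(wikitext: str) -> tuple[int, int]:
    """Return (inclusion_count, max_depth) for {{...}} template markers."""
    count = 0
    depth = 0
    max_depth = 0
    i = 0
    while i < len(wikitext) - 1:
        pair = wikitext[i:i + 2]
        if pair == "{{":
            count += 1
            depth += 1
            max_depth = max(max_depth, depth)
            i += 2
        elif pair == "}}":
            depth = max(0, depth - 1)  # tolerate unbalanced braces
            i += 2
        else:
            i += 1
    return count, max_depth

def looks_expensive(wikitext: str,
                    max_len: int = 100_000,
                    max_inclusions: int = 500,
                    max_nesting: int = 10) -> bool:
    """Flag a page as likely slow to render (thresholds are guesses)."""
    count, nesting = template_stats(wikitext)
    return (len(wikitext) > max_len
            or count > max_inclusions
            or nesting > max_nesting)
```

A brace-counting pass like this is only an approximation (it ignores parser functions, nowiki sections, and so on), but it is cheap enough to run on every save, which is what a pre-render heuristic needs.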