On Jun 13, 2015 1:06 AM, "Pine W" <wiki.pine@gmail.com> wrote:
> Perhaps we should take the discussion of how best to measure page rendering performance to Wikitech. Would that be ok with you?

We could. Or maybe the research or analytics lists would be better.

But should that block getting the SM out the door?

> I agree that there is value in continuity, but remember that Wikipedia articles change over time, so unless someone is using a specific rev for measuring every time that they make a change to how the page renders, then there is likely to be at least some unreliability in the measurement.

Obviously we could double-check this, but I'd wager that Obama's cite count has trended upward over the last couple of years. (So, e.g., comparing older HHVM vs. newer HHVM with a constant Obama rev would show more extreme gains than comparing older HHVM + older Obama vs. newer HHVM + newer Obama.)
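
If someone wants to actually check that, something like this would do it (untested sketch; the public API is fine here since we're counting refs, not timing anything, and the timestamps are just examples):

    import re
    import requests

    API = "https://en.wikipedia.org/w/api.php"

    def revid_at(title, timestamp):
        # Revision that was current at the given time.
        r = requests.get(API, params={
            "action": "query", "prop": "revisions", "titles": title,
            "rvprop": "ids", "rvlimit": 1,
            "rvstart": timestamp, "rvdir": "older", "format": "json",
        })
        page = next(iter(r.json()["query"]["pages"].values()))
        return page["revisions"][0]["revid"]

    def ref_count(revid):
        # Crude: counts <ref ...> openings in the wikitext, so named
        # reuses get counted too, but it's good enough to see a trend.
        r = requests.get(API, params={
            "action": "parse", "oldid": revid,
            "prop": "wikitext", "format": "json",
        })
        text = r.json()["parse"]["wikitext"]["*"]
        return len(re.findall(r"<ref[\s/>]", text))

    for ts in ("2013-06-01T00:00:00Z", "2015-06-01T00:00:00Z"):
        rid = revid_at("Barack Obama", ts)
        print(ts, rid, ref_count(rid))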

Anyway, it should be technically feasible to re-run benchmarks of the old software against the new revisions. In this case the author wasn't actually comparing against past numbers (I think...), only generating his own new numbers for a constant rev. And anyway, comparison to old numbers wouldn't be meaningful without rerunning them, because the hardware isn't constant.
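
For what it's worth, MediaWiki core ships maintenance/benchmarks/benchmarkParse.php for exactly this sort of constant-rev timing on the server side. A rough client-side equivalent over HTTP might look like the sketch below (untested; the URL and revid are placeholders, and caching layers can obviously skew the numbers):

    import statistics
    import time
    import requests

    API = "http://localhost/w/api.php"  # placeholder: a wiki on your LAN
    REVID = 123456789                   # placeholder: the constant rev

    def parse_ms(revid):
        # One action=parse round trip for a pinned revision. This is
        # round-trip time, hence the point below about keeping the
        # client on the same local network as the server.
        t0 = time.perf_counter()
        r = requests.get(API, params={
            "action": "parse", "oldid": revid,
            "prop": "text", "format": "json",
        })
        r.raise_for_status()
        return (time.perf_counter() - t0) * 1000

    samples = [parse_ms(REVID) for _ in range(10)]
    print("median %.0f ms over %d runs"
          % (statistics.median(samples), len(samples)))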

> Technical factors like bandwidth and geolocation may also be involved in skewing the validity of comparisons.

I can't imagine a scenario where that's relevant. Does anyone benchmark specific articles over the public internet, as opposed to running the client on the same local network as the server?

> For most citations, there appears to be a manually updated list here: https://en.wikipedia.org/wiki/Wikipedia:Articles_with_the_most_references

Not just manually updated, but each entry has its own separate update date??? Hrmmm, Obama is listed lower on that list than another article with Obama in the title…

-Jeremy

P.S. The recently released slow parse logs may be useful for choosing articles to track over time. https://phabricator.wikimedia.org/T98563
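
If the log format turns out to be something like one entry per line with a duration and a title (I haven't checked what T98563 actually publishes, so treat this as hypothetical), picking candidates could be as simple as:

    import collections
    import sys

    # Hypothetical format: whitespace-separated "<seconds> <title>" lines.
    # Tally the worst parse time seen per title and print the top 20
    # as candidates for long-term tracking.
    worst = collections.defaultdict(float)
    with open(sys.argv[1]) as f:
        for line in f:
            parts = line.split(None, 1)
            if len(parts) != 2:
                continue
            secs, title = parts
            title = title.strip()
            worst[title] = max(worst[title], float(secs))

    for title, secs in sorted(worst.items(), key=lambda kv: -kv[1])[:20]:
        print("%8.2f  %s" % (secs, title))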