On Wed, Jan 12, 2011 at 6:51 PM, Tim Starling <tstarling(a)wikimedia.org> wrote:
> I think this is an exaggeration.
>
> When I optimise the parse time of particular pages, I don't even use
> my sysadmin access. The best way to do it is to download the page with
> all its templates using Special:Export, and then to load it into a
> local wiki.
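The export step above can be sketched as a small helper. This is only an illustration: the wiki base URL and page title are example values, and it assumes the standard Special:Export GET parameters, where `templates=1` asks the exporter to bundle every template the page transcludes.

```python
from urllib.parse import quote

def export_url(wiki_base, title):
    """Build a Special:Export URL that includes transcluded templates."""
    return (f"{wiki_base}/index.php?title=Special:Export"
            f"&pages={quote(title)}&templates=1")

# Example: a template-heavy page on the English Wikipedia.
url = export_url("https://en.wikipedia.org/w", "Barack Obama")
print(url)

# Download that XML, then load it into the local wiki, e.g.:
#   php maintenance/importDump.php < dump.xml
```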
But how do you determine which templates are causing server load
problems? If we could expose enough profiling information that users
could see what's causing load, and verify that their optimization work
is having an effect, I'd be all for encouraging them to optimize. The
problem is that, left to their own devices, people who have no idea
what they're talking about invent nonsensical server load problems,
and there's no way for even fairly technical users to tell that these
people in fact have no idea what they're talking about. If we can
expose clear metrics to users, such as the amount of CPU time used per
template, then encouraging them to optimize against those specific
metrics is certainly a good idea.
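One metric MediaWiki already exposes is the "NewPP limit report", left as an HTML comment in rendered pages. It reports per-page parser costs rather than per-template CPU time, but it is the kind of thing a user could check. A minimal sketch of reading it; the sample report below is invented for illustration and only approximates the real report's shape:

```python
import re

# Made-up sample of a NewPP limit report comment, for illustration only.
sample_html = """
<!--
NewPP limit report
Preprocessor node count: 104/1000000
Post-expand include size: 2048/2048000 bytes
Template argument size: 512/2048000 bytes
Expensive parser function count: 1/500
-->
"""

def parse_limit_report(html):
    """Return {metric: (used, limit)} from a NewPP limit report comment."""
    report = {}
    for name, used, limit in re.findall(
            r"^([A-Za-z -]+): (\d+)/(\d+)", html, re.MULTILINE):
        report[name.strip()] = (int(used), int(limit))
    return report

metrics = parse_limit_report(sample_html)
print(metrics["Preprocessor node count"])  # → (104, 1000000)
```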
> Parsing large pages is typically CPU-dominated, so you can get a very
> good approximation without simulating the whole network.
Templates that use a lot of CPU time cause user-visible latency in
addition to server load, and WP:PERF already says it's okay to try
to optimize clear user-visible problems (although it could be less
equivocal about it).