On Wed, Jan 12, 2011 at 6:51 PM, Tim Starling <tstarling@wikimedia.org> wrote:
[snip]
> When I optimise the parse time of particular pages, I don't even use my
> sysadmin access. The best way to do it is to download the page with all
> its templates using Special:Export, and then to load it into a local
> wiki. Parsing large pages is typically CPU-dominated, so you can get a
> very good approximation without simulating the whole network. Once the
> page is in your local wiki, you can use whatever profiling tools you
> like: the MW profiler with extra sections, xdebug, gprof, etc. And you
> can modify the test cases very easily.
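(For concreteness, here's a rough sketch of that workflow in Python. The page title and local-wiki URL are placeholders, not anything from Tim's mail, and you'd swap in whatever profiler you prefer for the last step:)

#!/usr/bin/env python3
# Sketch of the export-and-profile workflow described above.
# Placeholders: "Some_Big_Article" and a local wiki at http://localhost/mediawiki.
import time
import urllib.parse
import urllib.request

PAGE = "Some_Big_Article"                          # hypothetical page title
LOCAL_API = "http://localhost/mediawiki/api.php"   # hypothetical local wiki

# 1. Grab the page plus its transcluded templates via Special:Export.
#    (templates=1 corresponds to the "Include templates" checkbox on the form.)
export_url = ("https://en.wikipedia.org/wiki/Special:Export/"
              + urllib.parse.quote(PAGE) + "?templates=1")
with urllib.request.urlopen(export_url) as resp:
    open("dump.xml", "wb").write(resp.read())

# 2. Import it into the local wiki, run from the MediaWiki root:
#       php maintenance/importDump.php dump.xml
#       php maintenance/rebuildrecentchanges.php

# 3. Time a parse of the imported page through the local API. Parsing big
#    pages is CPU-bound, so wall-clock time is a decent first signal before
#    reaching for the MW profiler, xdebug, gprof, etc.
params = urllib.parse.urlencode({"action": "parse", "page": PAGE, "format": "json"})
start = time.time()
urllib.request.urlopen(LOCAL_API + "?" + params).read()
print("parse took %.2f seconds" % (time.time() - start))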
Well, that's the entire point of WP:PERF, at least before it was elevated to acronym apotheosis. One might reword it as "optimize through science, not superstition".
You're exactly the sort of person who can and should worry about performance: you have well-developed debugging skills and significant knowledge of the system internals. By following well-understood logical processes you can very effectively identify performance bottlenecks and find either workarounds (do your template like THIS and it's faster) or fixes (if we make THIS change to the parser or database lookup code, it goes faster).
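To illustrate the "do your template like THIS" side of it: once the page is on a local wiki, you can time alternative wikitext head-to-head through the parse API. This is a made-up comparison, not a real measurement; the template name, markup, and URL are all placeholders.

#!/usr/bin/env python3
# Hypothetical A/B timing of two wikitext variants on a local test wiki.
import time
import urllib.parse
import urllib.request

LOCAL_API = "http://localhost/mediawiki/api.php"   # hypothetical local wiki

def parse_seconds(wikitext):
    # POST the snippet to action=parse and return wall-clock seconds.
    data = urllib.parse.urlencode({
        "action": "parse",
        "title": "Sandbox",
        "text": wikitext,
        "format": "json",
    }).encode()
    start = time.time()
    urllib.request.urlopen(LOCAL_API, data).read()
    return time.time() - start

# Made-up variants of the same template call; substitute the real markup.
variant_a = "{{MyTemplate|style=old}}\n" * 200
variant_b = "{{MyTemplate|style=new}}\n" * 200

print("variant A: %.2f s" % parse_seconds(variant_a))
print("variant B: %.2f s" % parse_seconds(variant_b))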
I'm going to go out on a limb though and say that most people don't themselves have the tools or skills to do that. It's not rocket science, but these are not standard-issue skills. (Maybe they should be, but that's a story for the educational system!)
The next step after deciding there's a problem is to investigate it knowledgeably, which means either *having* those skills already, *developing* them, or *finding* someone else who does.
-- brion