On Fri, Jun 26, 2009 at 5:24 PM, Michael Dale <mdale@wikimedia.org> wrote:
The difference in the context of the script-loader is that we would read the version from the MediaWiki JS pages being included and from the $wgStyleVersion var (avoiding the need to shift-reload) ... in the context of rendering a normal page with dozens of template lookups, I don't see this as particularly costly. It's a few extra getLatestRevID title calls.
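Roughly, as a sketch (the helper name here is made up; only Title::newFromText(), Title::getLatestRevID() and $wgStyleVersion are real):

  // Sketch: build a cache-busting version key for one script-loader request.
  // $jsPageNames would be the MediaWiki-namespace JS pages included in the request.
  function getScriptLoaderVersionKey( array $jsPageNames ) {
      global $wgStyleVersion;
      $parts = array( $wgStyleVersion );
      foreach ( $jsPageNames as $name ) {
          $title = Title::newFromText( $name, NS_MEDIAWIKI );
          if ( $title ) {
              // one extra latest-revision lookup per included wiki JS page
              $parts[] = $title->getLatestRevID();
          }
      }
      return implode( '_', $parts );
  }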
It's not costly unless we have to purge Squid for everything, which we probably don't. People could just use old versions; it's not *that* dangerous.
Likewise, we should do this for images so we can send a cache-forever header (bug 17577), avoiding a bunch of 304 requests.
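Something along these lines, assuming the image URL carries a version fingerprint (the "version" query parameter is just illustrative):

  // Sketch: a versioned URL can be marked cacheable "forever", so clients and
  // Squid never come back with conditional requests that end in a 304.
  if ( isset( $_GET['version'] ) ) {
      header( 'Cache-Control: public, max-age=31536000' );  // ~1 year
      header( 'Expires: ' . gmdate( 'D, d M Y H:i:s', time() + 31536000 ) . ' GMT' );
  } else {
      // unversioned URL: keep forcing revalidation as before
      header( 'Cache-Control: public, must-revalidate, max-age=0' );
  }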
Any given image is not included on every single page on the wiki. Purging a few thousand pages from Squid on an image reupload (which should be rare for such a heavily used image) is okay. Purging every single page on the wiki is not.
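For reference, the purge itself is just one HTTP PURGE per URL sent to the cache frontend; a minimal curl-based sketch, not the actual MediaWiki purge code:

  // Sketch: ask a Squid frontend to drop one cached URL.
  function purgeUrl( $squidServer, $url ) {
      $ch = curl_init( $url );
      curl_setopt( $ch, CURLOPT_CUSTOMREQUEST, 'PURGE' );
      curl_setopt( $ch, CURLOPT_PROXY, $squidServer );  // talk to the cache, not the origin
      curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
      curl_exec( $ch );
      curl_close( $ch );
  }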
One part I am not completely clear on is how we avoid lots of simultaneous requests to the scriptLoader when it first generates the JavaScript to be cached on the squids. But other stuff must be throttled too, no? When we update any code, language msgs, or LocalSettings, that does not result in immediately purging all of Wikipedia, does it?
No. We don't purge Squid on these events; we just let people see old copies. Of course, this doesn't normally apply to registered users (who usually [always?] get Squid misses), or to pages that aren't cached (edit, history, ...).
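If simultaneous regeneration ever did become a real problem, even something this crude would cover it (sketch only; $requestKey, $cacheFile and regenerateScriptOutput() are made-up names for the request identity, the local cache file, and the rebuild step):

  // Sketch: only the request that wins the lock rebuilds the script-loader
  // output; everyone else blocks briefly and reads the freshly written file.
  $lockFile = '/tmp/scriptloader-' . md5( $requestKey ) . '.lock';
  $fp = fopen( $lockFile, 'c' );
  if ( flock( $fp, LOCK_EX | LOCK_NB ) ) {
      $output = regenerateScriptOutput( $requestKey );  // hypothetical rebuild step
      file_put_contents( $cacheFile, $output );
      flock( $fp, LOCK_UN );
  } else {
      flock( $fp, LOCK_EX );   // wait for the winner to finish
      flock( $fp, LOCK_UN );
      $output = file_get_contents( $cacheFile );
  }
  fclose( $fp );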