Correct me if I am wrong, but that's how we presently update JS and CSS: we have $wgStyleVersion, and when that gets updated we send out fresh pages whose HTML points to the JS with $wgStyleVersion appended.
The difference in the context of the script-loader is that we would read the version from the MediaWiki JS pages being included as well as the $wgStyleVersion var (avoiding the need to shift-reload). In the context of rendering a normal page with dozens of template lookups I don't see this as particularly costly; it's a few extra getLatestRevID title calls. Likewise we should do this for images so we can send the cache-forever header (bug 17577), avoiding a bunch of 304 requests.
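Roughly what I have in mind (just a sketch on my part, not the actual branch code; the function name is made up, but Title::getLatestRevID() and $wgStyleVersion are real):

  // Build a version key for a script-loader URL from the latest
  // revision IDs of the included on-wiki JS pages plus $wgStyleVersion.
  function getScriptLoaderVersionKey( $jsPageTitles ) {
      global $wgStyleVersion;
      $revIds = array();
      foreach ( $jsPageTitles as $text ) {
          $title = Title::newFromText( $text );
          if ( $title ) {
              // one extra getLatestRevID() lookup per included page
              $revIds[] = $title->getLatestRevID();
          }
      }
      // Any edit to a JS page or bump of $wgStyleVersion yields a new
      // URL, so the old squid-cached copy is simply never requested again.
      return $wgStyleVersion . '-' . implode( '.', $revIds );
  }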
One part I am not completely clear on is how we avoid lots of simultaneous requests to the scriptLoader when it first generates the JavaScript to be cached on the squids. But other stuff must be throttled too, no? When we update any code, language msgs, or LocalSettings, that does not result in immediately purging all of Wikipedia.
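One naive way I could imagine handling the cold-cache case (again just a sketch, key names invented; only the $wgMemc get/add/set/delete calls are real) would be a memcached lock so only the first miss regenerates the bundle:

  // Serialize regeneration so a cold squid cache doesn't turn into
  // N simultaneous minify/gzip jobs. Purely illustrative.
  function getMinifiedBundle( $cacheKey, $generateCallback ) {
      global $wgMemc;
      $out = $wgMemc->get( $cacheKey );
      if ( $out !== false ) {
          return $out; // someone already generated it
      }
      // add() is atomic: only one request wins the lock
      if ( $wgMemc->add( "$cacheKey:lock", 1, 30 ) ) {
          $out = call_user_func( $generateCallback );
          $wgMemc->set( $cacheKey, $out, 3600 );
          $wgMemc->delete( "$cacheKey:lock" );
          return $out;
      }
      // lost the race: wait briefly and retry the cache
      usleep( 500000 );
      $out = $wgMemc->get( $cacheKey );
      return $out !== false ? $out : call_user_func( $generateCallback );
  }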
--michael
Gregory Maxwell wrote:
On Fri, Jun 26, 2009 at 4:33 PM, Michael Dale <mdale@wikimedia.org> wrote:
I would quickly add that the script-loader / new-upload branch also supports minification along with associating unique IDs, grouping, & gzipping.
So all your MediaWiki page includes are tied to their version numbers and can be cached forever, without 304 requests by the client or a _shift_ reload to get new JS.
Hm. Unique ids?
Does this mean that every page on the site must be purged from the caches to cause all requests to see a new version number?
Is there also some pending squid patch to let it jam in a new ID number on the fly for every request? Or have I misunderstood what this does?