I would quickly add that the script-loader / new-upload branch also
supports minification, along with grouping requests, associating unique
version IDs, and gzipping. All your MediaWiki page includes are tied to
their version numbers, so the client can cache them forever with no 304
requests and no need to shift-reload to get new JS.
Plus it works with all the static file-based JS includes as well. If a
given set of files is constantly requested, we can group them to avoid
server round trips. And finally it lets us localize messages and package
them into the JS (again avoiding separate trips for JavaScript interface
messages). For more info see the ~slightly outdated~ document:
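As a rough sketch of what version-keyed, grouped includes buy you: the
version in the URL changes whenever a page is edited, so the response
can carry far-future cache headers. The parameter names and grouping
syntax below are purely illustrative assumptions, not the branch's
actual URL format.

```python
def script_url(titles, version):
    """Build one grouped request for several MediaWiki JS pages.

    Hypothetical sketch: 'version' and the '|'-joined title list are
    illustrative, not the script-loader branch's real API. Because the
    version number changes on every edit, the URL itself changes, and
    the old response can be cached forever (no 304 revalidation).
    """
    base = "http://example.org/w/index.php"
    joined = "|".join(titles)
    return f"{base}?title={joined}&action=raw&ctype=text/javascript&version={version}"

url = script_url(["MediaWiki:Common.js", "MediaWiki:Monobook.js"], 285749)
print(url)
```

One grouped URL replaces two round trips, and a new version number acts
as the cache-buster instead of a shift-reload.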
I'm going to mention this here, because it might
be of interest on the
Wikimedia cluster (or it might not).
Last night I committed Extension:Minify, which is essentially a
lightweight wrapper for the YUI CSS compressor and the JSMin JavaScript
compressor. If installed, it automatically captures all content
exported through action=raw and precompresses it by removing comments,
formatting, and other human-readable elements. All of the helpful
elements still remain on the MediaWiki: pages; they just don't get
sent to users.
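To illustrate the kind of transformation involved, here is a toy
comment-and-whitespace stripper. JSMin and the YUI compressor are far
more careful (string literals, semicolon insertion, etc.); this is only
a minimal sketch of the idea, not the extension's code.

```python
import re

def strip_css(css):
    """Toy CSS minifier: drop /* ... */ comments and collapse runs of
    whitespace. Real minifiers must also respect string contents and
    other edge cases; this sketch ignores them."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)  # remove comments
    css = re.sub(r"\s+", " ", css)                        # collapse whitespace
    return css.strip()

src = "/* helpful note for editors */\nbody {\n  color: #000;\n}"
print(strip_css(src))  # → body { color: #000; }
```

The editor-facing comment survives on the wiki page but never reaches
the reader's browser.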
Currently each page served to anons references 6 CSS/JS pages
dynamically prepared by MediaWiki, of which 4 are needed in the most
common situation of viewing content online (i.e. assuming the
media="print" and media="handheld" stylesheets are not downloaded in
the typical case).
These 4 pages (MediaWiki:Common.css, MediaWiki:Monobook.css, gen=css,
and gen=js) comprise about 60 kB on the English Wikipedia. (I'm using
enwiki as a benchmark, but Commons and dewiki also have similar
numbers to those discussed below.)
After gzip compression, which I assume is available on most HTTP
transactions these days, they total 17039 bytes. The comparable
numbers with Minify applied are 35 kB raw and 9980 bytes after gzip,
for a savings of 7 kB, or about 40% of the total file size.
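The point that minification still helps even on top of gzip can be
demonstrated with a small experiment. The sample stylesheet below is
made up for illustration; only the shape of the result (minified input
gzips smaller than raw input) is the claim being checked.

```python
import gzip
import re

# Fabricated sample: a commented CSS rule repeated many times, standing
# in for a typical editor-annotated MediaWiki: stylesheet.
css = ("/* Editors: this rule hides the site notice for anons. */\n"
       "#siteNotice {\n    display: none;\n}\n") * 200

# Same toy minification: strip comments, collapse whitespace.
minified = re.sub(r"\s+", " ",
                  re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)).strip()

raw_gz = len(gzip.compress(css.encode()))
min_gz = len(gzip.compress(minified.encode()))
print(raw_gz, min_gz)  # the minified text gzips smaller, even though
                       # gzip already absorbs much of the redundancy
```

Gzip removes repetition, but it still has to encode the comment text at
least once; deleting it outright is a saving gzip cannot make on its own.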
Now in practical terms, 7 kB could shave ~1.5 s off a page load over a
36 kbps dialup connection. Or, given Erik Zachte's observation that
action=raw is called 500 million times per day, and assuming up to
7 kB / 4 of savings per call, it could shave up to 900 GB off
Wikimedia's daily traffic. (In practice it would probably be somewhat
less; 900 GB seems to be slightly under 2% of Wikimedia's total daily
traffic, if I am reading the charts correctly.)
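A back-of-envelope check of those figures, using only the numbers
quoted in this post (no fresh measurements):

```python
# Bytes saved per full 4-page set, from the gzipped totals above.
savings_bytes = 17039 - 9980            # ≈ 7 kB

# Dialup: seconds saved transferring that much less at 36 kbps.
print(savings_bytes * 8 / 36_000)       # ≈ 1.57 s

# Daily traffic: spread the per-set savings over 4 action=raw calls,
# then scale by Erik Zachte's 500 million calls/day figure.
per_call = savings_bytes / 4
daily_calls = 500_000_000
daily_savings_gb = per_call * daily_calls / 1e9
print(round(daily_savings_gb))          # → 882, i.e. "up to 900 GB"/day
```

So both the ~1.5 s dialup estimate and the "up to 900 GB" ceiling
follow directly from the quoted measurements.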
Anyway, that's the use case (such as it is): slightly faster initial
downloads and a small but probably measurable impact on total
bandwidth. The trade-off, of course, is that users receive CSS and JS
pages from action=raw that are largely unreadable. The extension
exists if Wikimedia is interested, though to be honest I primarily
created it for use with my own, more tightly bandwidth-constrained
sites.
-Robert Rohde
_______________________________________________
Wikitech-l mailing list
wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l