It's probably worth mentioning that this bug is still open: https://bugzilla.wikimedia.org/show_bug.cgi?id=17577
This will not only save traffic on subsequent page views (in this case: http://www.webpagetest.org/result/090218_132826127ab7f254499631e3e688b24b/1/... about 50 kB), but also improve performance dramatically.
I wonder if anything can be done to at least make it work for local files - I have a hard time understanding the File vs. LocalFile vs. FSRepo relationships well enough to enable this just for the local file system.
It's probably also wise to figure out a way to implement it on non-local repositories too, so Wikimedia projects can use it, but I'm completely out of my league here ;)
Thank you,
Sergey
-- Sergey Chernyshev http://www.sergeychernyshev.com/
On Fri, Jun 26, 2009 at 11:42 AM, Robert Rohde rarohde@gmail.com wrote:
I'm going to mention this here, because it might be of interest on the Wikimedia cluster (or it might not).
Last night I deposited Extension:Minify, which is essentially a lightweight wrapper for the YUI CSS compressor and the JSMin JavaScript compressor. If installed, it automatically captures all content exported through action=raw and precompresses it by removing comments, formatting, and other human-readable elements. All of the helpful elements still remain on the MediaWiki: pages; they just don't get sent to users.
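For anyone curious what this kind of minification actually does, here is a toy sketch in Python - this is not the JSMin or YUI algorithm, just a crude illustration that ignores edge cases such as comment markers inside string literals:

```python
import re

def toy_minify_css(css: str) -> str:
    """Crude CSS minifier: strips /* ... */ comments and collapses
    whitespace. Real minifiers (YUI Compressor) handle far more cases."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)  # drop comments
    css = re.sub(r"\s+", " ", css)                        # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)          # tighten punctuation
    return css.strip()

original = """
/* Hide the site notice for anonymous users */
#siteNotice {
    display: none;
    margin: 0;
}
"""
minified = toy_minify_css(original)
print(minified)   # → #siteNotice{display:none;margin:0;}
print(len(original), "->", len(minified))
```

The human-readable comment survives on the wiki page itself; only the wire format loses it.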
Currently each page served to anons references 6 CSS/JS pages dynamically prepared by MediaWiki, of which 4 are needed in the most common situation of viewing content online (i.e. assuming the media="print" and media="handheld" stylesheets are not downloaded in the typical case).
These 4 pages - MediaWiki:Common.css, MediaWiki:Monobook.css, gen=css, and gen=js - comprise about 60 kB on the English Wikipedia. (I'm using enwiki as a benchmark, but Commons and dewiki have similar numbers to those discussed below.)
After gzip compression, which I assume is available on most HTTP transactions these days, they total 17039 bytes. The comparable numbers with Minify applied are 35 kB raw and 9980 bytes after gzip, for a savings of about 7 kB, or roughly 40% of the gzipped total.
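The gzip comparison is easy to reproduce locally; a minimal Python sketch (the sample CSS below is made up - a real measurement would gzip the actual pages fetched via action=raw):

```python
import gzip

def gzipped_size(data: bytes) -> int:
    """Bytes on the wire after gzip - roughly what mod_deflate would send."""
    return len(gzip.compress(data))

# Toy payload standing in for a fetched stylesheet; repetitive,
# comment-heavy CSS compresses well even before any minification.
sample = b"/* hide edit links for anons */ .editsection { display: none; }\n" * 100
print(len(sample), "bytes raw ->", gzipped_size(sample), "bytes gzipped")
```

Comparing the gzipped size of the original pages against the gzipped size of their minified versions gives the savings figure quoted above - gzip and minification overlap somewhat (both attack redundancy), which is why the minified savings shrink after compression.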
Now in practical terms, 7 kB could shave ~1.5 s off a page load over a 36 kbps dialup connection. Or, given Erik Zachte's observation that action=raw is called 500 million times per day, and assuming up to 7 kB / 4 in savings per call, it could shave up to 900 GB off of Wikimedia's daily traffic. (In practice it would probably be somewhat less; 900 GB seems to be slightly under 2% of Wikimedia's total daily traffic, if I am reading the charts correctly.)
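A back-of-envelope check of those two estimates, using the 7059-byte gzipped savings (17039 - 9980) from the totals above:

```python
# Sanity-check the dialup and cluster-wide figures quoted above.

SAVINGS_BYTES = 17039 - 9980  # 7059 bytes of gzipped savings across 4 files
DIALUP_BPS = 36_000           # 36 kbps modem, ignoring latency and overhead

# Time to transfer ~7 kB at dialup speed:
seconds = SAVINGS_BYTES * 8 / DIALUP_BPS
print(f"{seconds:.1f} s")     # prints 1.6 s - close to the ~1.5 s quoted

# Cluster-wide: 500M action=raw calls/day, each saving at most 7 kB / 4.
calls_per_day = 500_000_000
per_call = SAVINGS_BYTES / 4
daily_gb = calls_per_day * per_call / 1e9  # decimal GB per day
print(f"{daily_gb:.0f} GB/day")            # prints 882 GB/day, i.e. "up to 900 GB"
```

Both figures come out consistent with the rounded numbers in the message.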
Anyway, that's the use case (such as it is): slightly faster initial downloads and a small but probably measurable impact on total bandwidth. The trade-off, of course, is that users receive CSS and JS pages from action=raw that are largely unreadable. The extension exists if Wikimedia is interested, though to be honest I primarily created it for use with my own, more tightly bandwidth-constrained sites.
-Robert Rohde
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l