-----Original Message-----
From: wikitech-l-bounces@lists.wikimedia.org [mailto:wikitech-l-bounces@lists.wikimedia.org] On Behalf Of Aryeh Gregor
Sent: 15 December 2008 22:30
To: Wikimedia developers
Subject: Re: [Wikitech-l] Future of Javascript and mediaWiki
On Mon, Dec 15, 2008 at 2:39 PM, Jared Williams jared.williams1@ntlworld.com wrote:
Minification could be made pretty pointless in the future.
Chromium* has experimental tech in it that can reduce the payload of each JS/CSS request to something as small as 30 bytes.
Jared
* Google toolbar for IE supposedly implements it, but I've been unable to get it working.
Link?
Here's the paper (PDF)
http://sdch.googlegroups.com/web/Shared_Dictionary_Compression_over_HTTP.pdf?gda=Cn21OV0AAADesD7oVzP2tIH3YMhCCYbwV7wKw6Y_LNfrKuXmihkMeg12alwZyuoqsE-BiY88xfLrk0HuZRJs1gcUl6mErWX6yPI8Lq4cE5IelfQO528z8OU2_747KStNgkfeVUa7Znk
The idea is that you could get an SDCH-capable user agent to download the concatenated & gzipped JavaScript in a single request (called a dictionary), which quick testing puts at about 15 KB for en.mediawiki.org, and cache that on the client for a long period.
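To make that concrete, the negotiation is just a couple of extra headers. A rough server-side sketch, with the header names as I recall them from the draft spec and a made-up dictionary path:

    # Rough sketch of SDCH negotiation on the server; header names assumed
    # from the draft spec, dictionary path invented.
    def choose_response_headers(request_headers):
        accept = request_headers.get('Accept-Encoding', '')
        if 'sdch' not in accept:
            # Client doesn't speak SDCH: plain gzip as today.
            return {'Content-Encoding': 'gzip'}

        if 'Avail-Dictionary' not in request_headers:
            # Advertise the shared dictionary (the concatenated, gzipped JS);
            # the user agent fetches and caches it out of band.
            return {'Get-Dictionary': '/w/sdch/js-2008-12-15.dict',
                    'Content-Encoding': 'gzip'}

        # Client already holds a usable dictionary: respond with a delta
        # against it (details below), then gzip it.
        return {'Content-Encoding': 'sdch, gzip'}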
Then the individual requests for JavaScript can just return a diff (in RFC 3284 VCDIFF format) between the server's version and the version the client has in its dictionary. Obviously, if the diffs get too large, the server can instruct the user agent to download a more up-to-date dictionary. Something around 30 bytes of body (off the top of my head) is the minimum size, when the server and client versions are identical.
CSS could also be handled similarly.
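For a feel of the numbers, here's a quick stand-in experiment using zlib's preset-dictionary support (SDCH itself specifies VCDIFF per RFC 3284, so this is only an analogy, and the JS blob is synthetic):

    # Stand-in illustration: zlib preset dictionary instead of VCDIFF.
    import zlib

    # Pretend this is the ~15 KB of concatenated site JS the client has
    # cached as its dictionary.
    dictionary = b"".join(b"function f%d(x){return x*%d;}\n" % (i, i)
                          for i in range(400))

    # The file actually requested is identical to what the dictionary holds.
    current = dictionary

    comp = zlib.compressobj(level=9, zdict=dictionary)
    payload = comp.compress(current) + comp.flush()
    print(len(current), 'bytes ->', len(payload), 'bytes on the wire')

    # Client side: inflate against the same cached dictionary.
    decomp = zlib.decompressobj(zdict=dictionary)
    assert decomp.decompress(payload) + decomp.flush() == current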
It'd also be possible to put the static (inline) HTML from templates into the dictionary, together with a lot of the translated messages, to try to reduce the HTML size, though I'm not sure how effective it'd be.
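Same stand-in sketched for page HTML: seed a dictionary with the skin chrome and common interface messages, then compress a rendered page against it (the skin and message strings here are invented placeholders):

    # Stand-in illustration again: zlib preset dictionary instead of VCDIFF.
    import zlib

    # Hypothetical dictionary: static skin HTML plus common interface messages.
    skin_chrome = (b"<div id='column-content'><div id='content'>"
                   b"<div id='p-cactions'><ul><li>article</li>"
                   b"<li>discussion</li><li>edit</li><li>history</li></ul></div>")
    messages = (b"From Wikipedia, the free encyclopedia"
                b"|Views|Personal tools|Navigation|Search|Toolbox")
    dictionary = skin_chrome + messages

    # A rendered page is mostly article text wrapped in that boilerplate.
    page = (b"<html><body>" + skin_chrome +
            b"<p>Article text that differs on every page...</p>" +
            messages + b"</body></html>")

    def wire_size(zdict=None):
        c = (zlib.compressobj(level=9, zdict=zdict) if zdict
             else zlib.compressobj(level=9))
        return len(c.compress(page) + c.flush())

    print('gzip only       :', wire_size(), 'bytes')
    print('with dictionary :', wire_size(dictionary), 'bytes')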
Jared