Jared Williams wrote:
Here's the paper (PDF):
http://sdch.googlegroups.com/web/Shared_Dictionary_Compression_over_HTTP.pdf?gda=Cn21OV0AAADesD7oVzP2tIH3YMhCCYbwV7wKw6Y_LNfrKuXmihkMeg12alwZyuoqsE-BiY88xfLrk0HuZRJs1gcUl6mErWX6yPI8Lq4cE5IelfQO528z8OU2_747KStNgkfeVUa7Znk
The idea is that you could get an SDCH-capable user agent to download the
concatenated & gzipped JavaScript in a single request (this becomes the
dictionary), which quick testing puts at about 15 KB for en.mediawiki.org,
and cache it on the client for a long period.
Then each individual request for JavaScript can return just a diff (in RFC
3284 VCDIFF format) between the server's version and the version the client
has in its dictionary.
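Loosely, the negotiation looks something like this on the wire (the
dictionary hash and path below are made-up placeholders; the header names
come from the SDCH draft):

```
GET /wiki/Main_Page HTTP/1.1
Accept-Encoding: sdch, gzip
Avail-Dictionary: h5iym5lB                <- ID of the dictionary cached on the client

HTTP/1.1 200 OK
Content-Encoding: sdch, gzip              <- body is a VCDIFF delta, then gzipped
Get-Dictionary: /dict/js-combined.dict    <- advertised when the client should fetch a (new) dictionary
```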
Obviously, if the diffs get too large, the server can instruct the user
agent to download a more up-to-date dictionary. If the server and client
versions are identical, the minimum body size is somewhere around 30 bytes
(off the top of my head).
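A toy sketch of why the identical case is so cheap. This is not real
VCDIFF/RFC 3284, just a stand-in delta format illustrating the idea that a
matching dictionary collapses the response to a single tiny COPY
instruction, no matter how large the JavaScript is:

```python
# Toy delta coder (NOT real VCDIFF): "C start len" copies from the
# dictionary, "L ..." ships new content verbatim as a literal.

def delta_encode(server_bytes: bytes, dictionary: bytes) -> bytes:
    if server_bytes == dictionary:
        # One COPY instruction covering the whole dictionary.
        return b"C 0 " + str(len(dictionary)).encode()
    # Naive fallback: send the new content as a literal.
    return b"L " + server_bytes

def delta_decode(delta: bytes, dictionary: bytes) -> bytes:
    op, _, rest = delta.partition(b" ")
    if op == b"C":
        start, length = (int(x) for x in rest.split())
        return dictionary[start:start + length]
    return rest

js = b"function f(){};" * 1000   # stands in for ~15 KB of concatenated JS
delta = delta_encode(js, js)     # client's cached copy is identical
assert delta_decode(delta, js) == js
print(len(js), len(delta))       # 15000 9
```

Real VCDIFF adds headers and checksums on top of the copy instruction,
which is where the ~30-byte floor comes from.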
CSS also could be managed similarly.
It would also be possible to put the static (inline HTML) parts of the
templates into the dictionary, together with a lot of the translated
messages, to try to reduce the HTML size, though I'm not sure how effective
it would be.
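For a rough sense of the effect, zlib's preset-dictionary support can stand
in for an SDCH dictionary: markup that appears in the dictionary costs
almost nothing to transmit. The skin markup and page text below are made-up
placeholders, not real MediaWiki output:

```python
import zlib

# Hypothetical skin boilerplate shared by every page (placeholder markup).
skin = (b"<div id='globalWrapper'><div id='p-cactions'><ul>"
        b"<li>article</li><li>discussion</li><li>edit this page</li>"
        b"<li>history</li></ul></div><div id='p-personal'><ul>"
        b"<li>my talk</li><li>preferences</li><li>my watchlist</li>"
        b"<li>my contributions</li><li>log out</li></ul></div>"
        b"<div id='footer'>All text is available under the terms of "
        b"the GNU Free Documentation License.</div>")
page = skin + b"<p>Actual article text, different on every page.</p>"

# Ordinary compression: the skin is re-sent with every page.
plain = zlib.compress(page)

# Dictionary-aware compression: the skin is resolved against the
# preset dictionary, so only the per-page text really costs bytes.
comp = zlib.compressobj(zdict=skin)
with_dict = comp.compress(page) + comp.flush()

decomp = zlib.decompressobj(zdict=skin)
assert decomp.decompress(with_dict) == page
print(len(plain), len(with_dict))  # dictionary-aware stream is much smaller
```

This is only an analogy (SDCH applies VCDIFF before gzip rather than a
deflate preset dictionary), but the saving comes from the same place:
boilerplate the client already holds is never re-transmitted.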
Jared
I don't see how it will help. The client still needs to download it, be it
the dictionary or the JS.
What could benefit from SDCH is the skin. Instead of transmitting the skin,
div borders, etc. with every page, all of that would be in the dictionary,
decreasing the transfer per page.