Hi everyone,
Magnus was very kind to implement an idea that consists of two parts:
1. use of WebP (http://en.wikipedia.org/wiki/WebP) instead of PNG/JPEG for thumbnails in Wikipedia articles
2. use of Data-URIs (https://en.wikipedia.org/wiki/Data_URI_scheme) to inline those thumbnails
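Magnus's script does this server-side; purely to illustrate the Data-URI step, here is a minimal Python sketch (the function name and the commented file handling are made up for illustration, not taken from the actual tool):

```python
import base64

def to_data_uri(image_bytes: bytes, mime: str = "image/webp") -> str:
    """Encode raw image bytes as a data URI usable in an <img src=...>."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return f"data:{mime};base64,{b64}"

# A thumbnail read from disk would then be embedded directly in the HTML, e.g.:
# html = f'<img src="{to_data_uri(open("thumb.webp", "rb").read())}" alt="...">'
uri = to_data_uri(b"\x00\x01\x02")
print(uri.split(",")[0])  # → data:image/webp;base64
```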
The experiment can be viewed with the Chrome and Opera browsers at http://toolserver.org/~magnus/wp_data_url.php?lang=en&title=Albrecht_D%C...
Firefox and other browsers that support Data URIs but not (yet) WebP can view the second part of the experiment at http://toolserver.org/~magnus/wp_data_url.php?lang=en&title=Albrecht_D%C...
Base64 encoding adds roughly 1/3 overhead. Most of that overhead disappears when the server sends the pages gzip-compressed.
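The ~1/3 figure follows from the encoding itself: base64 emits 4 output bytes for every 3 input bytes. A quick stdlib sketch (random bytes standing in for an already-compressed thumbnail) also shows why gzip claws most of it back: base64's 64-symbol alphabet recompresses to roughly 6 bits per character:

```python
import base64
import gzip
import os

payload = os.urandom(30_000)          # stand-in for an already-compressed thumbnail
encoded = base64.b64encode(payload)   # ~4/3 the size of the payload

print(len(encoded) / len(payload))    # ≈ 1.333
gzipped = gzip.compress(encoded)
# Close to 1.0 again: gzip undoes most of the base64 inflation,
# though it cannot recover the payload's own compression.
print(len(gzipped) / len(payload))
```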
Mathias
That looks like a cool idea.
I have been experimenting with it on a few pages, and it seems to considerably reduce the number of web requests (for https://en.wikipedia.org/wiki/Vincent_van_Gogh it drops from 120 to under 40 requests).
But the pages get quite a bit bigger, obviously. Also, it introduces a caching issue for images...
Still, pretty cool.
2013/4/22 Mathias Schindler mathias.schindler@gmail.com
Hi everyone,
Magnus was very kind to implement an idea that consists of two parts:
1. use of WebP (http://en.wikipedia.org/wiki/WebP) instead of PNG/JPEG for thumbnails in Wikipedia articles
2. use of Data-URIs (https://en.wikipedia.org/wiki/Data_URI_scheme) to inline those thumbnails
The experiment can be watched using Chrome and Opera browsers at
http://toolserver.org/~magnus/wp_data_url.php?lang=en&title=Albrecht_D%C...
Firefox and other Data-URI capable but currently WebP incapable browsers can watch the second part of the experiment at
http://toolserver.org/~magnus/wp_data_url.php?lang=en&title=Albrecht_D%C...
Base64 encoding has a ~1/3 overhead. This overhead is reduced when the server sends out gzip files.
Mathias
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
On Apr 22, 2013 9:18 AM, "Denny Vrandečić" denny.vrandecic@wikimedia.de wrote:
But the pages get quite bigger, obviously. Also, it introduces a caching issue for images...
That's why we have https://bugzilla.wikimedia.org/32618 :-)
Still, pretty cool.
Yeah… makes me wonder what the status of SPDY is. (client-side too)
-Jeremy
On Mon, Apr 22, 2013 at 3:17 PM, Denny Vrandečić denny.vrandecic@wikimedia.de wrote:
That looks like a cool idea.
I am trying to experiment it on a few pages, and it seems to considerably reduce the number of web requests (for https://en.wikipedia.org/wiki/Vincent_van_Gogh it goes from 120 to under 40 requests).
But the pages get quite bigger, obviously.
Which one is larger (given an empty cache): one single big file containing the thumbnails, or one small HTML file plus individual thumbnail files (including the possible overhead from TCP/IP packets in both scenarios)?
Mathias
On Mon, Apr 22, 2013 at 6:30 AM, Mathias Schindler < mathias.schindler@gmail.com> wrote:
Which one is larger (given an empty cache): one single big file containing the thumbnails, or one small HTML file plus individual thumbnail files (including the possible overhead from TCP/IP packets in both scenarios)?
For very small images, HTTP header overhead can indeed exceed the actual image data; that's why we already use embedded data URIs a lot in stylesheets, which tend to reference very small icons and the like.
For typical photo thumbnails there's less relative overhead, but it's still a bunch of extra files to fetch.
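Back-of-the-envelope, that trade-off can be made concrete. Assume roughly 500 bytes of request/response headers per extra HTTP fetch (a made-up but plausible figure, not measured from Wikipedia's servers); inlining instead costs about n/3 bytes of base64 inflation for an n-byte image. Ignoring gzip (which shifts the balance further toward inlining), the break-even point lands near n = 1500 bytes, which is why tiny icons favor inlining and photo thumbnails do not:

```python
def extra_bytes_separate(n: int, header_bytes: int = 500) -> int:
    """Overhead of fetching an n-byte image as its own HTTP request."""
    return header_bytes

def extra_bytes_inline(n: int) -> int:
    """Overhead of embedding an n-byte image as base64 (4 bytes out per 3 in)."""
    return -(-n // 3) * 4 - n   # ceil(n/3) * 4 - n

for n in (200, 1500, 20_000):   # small icon, break-even, photo thumbnail
    print(n, extra_bytes_inline(n) < extra_bytes_separate(n))
# → 200 True / 1500 False / 20000 False
```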
There are a couple of disadvantages to using data URIs for largish inline images such as photo thumbs, though:
* The browser can't render the full text layout before downloading the images; images are forced to load at their inline positions.
* Increased memory usage: the DOM will contain the full base64 data URI, whereas a separate image file only stores a small link and goes into the disk cache. This may be an issue on memory-constrained devices such as smartphones and tablets.
* I'm not actually sure how disk caching works with data URIs. :) They may just eat up RAM as long as the page is open, even when not shown on screen.
I love the idea of using more compact image formats where available, but the content negotiation makes things more complicated. Maybe edge-side includes could rig something up; maybe their overhead would kill the cache servers. Dunno. :)
Another possibility is to lazy-load images on view/section expansion via JavaScript, which gives you a chance to check format compatibility in client-side JS and avoid any HTTP-level negotiation. (It also doesn't load any images until you need them.) Things to think about...
-- brion
On Mon, Apr 22, 2013 at 8:27 PM, Brion Vibber brion@pobox.com wrote:
Another possibility is to preferably load images on view/section expansion via JavaScript, which can potentially give you a chance to query the format compatibility in client JS and avoid any HTTP-level negotiation. (And also doesn't load any images until you need them.) Things to think about...
The format compatibility check is usually done with a few lines of JavaScript: http://queryj.wordpress.com/2012/06/11/detecting-webp-support/
Clever... that technique (loading a data URI of a small file and making sure it works) should work with other formats too. I smell an avenue for SVG support too here... ;)
-- brion
On Mon, Apr 22, 2013 at 12:33 PM, Mathias Schindler < mathias.schindler@gmail.com> wrote:
On Mon, Apr 22, 2013 at 8:27 PM, Brion Vibber brion@pobox.com wrote:
Another possibility is to preferably load images on view/section expansion via JavaScript, which can potentially give you a chance to query the format compatibility in client JS and avoid any HTTP-level negotiation. (And also doesn't load any images until you need them.) Things to think about...
format compatibility check is usually done with these few lines: http://queryj.wordpress.com/2012/06/11/detecting-webp-support/
Apparently Facebook has been experimenting with sending WebP images in place of JPEG to some users... but there are some usability problems reported in the tech press: http://arstechnica.com/information-technology/2013/04/chicken-meets-egg-with...
The primary problem seems to be that folks saving images locally or sharing direct URLs get surprised when they don't work in other people's browsers or can't be opened in other programs.
Not sure what's a good solution for this, other than a really good download/sharing UI on images...
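One small building block for such a download UI would be sniffing what was actually saved: WebP files are RIFF containers with an easily checked header. A minimal sketch (purely illustrative, not any existing tool's API):

```python
def looks_like_webp(data: bytes) -> bool:
    """WebP = RIFF container: 'RIFF' + 4-byte little-endian size + 'WEBP'."""
    return len(data) >= 12 and data[:4] == b"RIFF" and data[8:12] == b"WEBP"

def looks_like_jpeg(data: bytes) -> bool:
    """JPEG streams start with the SOI marker FF D8 FF."""
    return data[:3] == b"\xff\xd8\xff"

# A save dialog could warn (or offer to transcode) when the user expects a .jpg:
sample = b"RIFF\x24\x00\x00\x00WEBPVP8 "
print(looks_like_webp(sample), looks_like_jpeg(sample))  # → True False
```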
-- brion
I vaguely recall the first time I came across one of these newfangled "PNG" images that no viewer would display. What's wrong with GIF??? That will never stand! (yes, I know plenty is wrong with GIF, especially in those days...)
Pixelmator on Mac already opens WebP fine; I didn't test saving, though. If Wikipedia offered it, everyone except Apple would just link the library into their code and be "hip" ;-)
On Tue, Apr 23, 2013 at 6:45 PM, Brion Vibber brion@pobox.com wrote:
Apparently Facebook has been experimenting with sending WebP images in place of JPEG to some users... but there are some usability problems reported in the tech press <
http://arstechnica.com/information-technology/2013/04/chicken-meets-egg-with...
Primary problem seems to be that folks saving images locally or sharing direct URLs get surprised when they don't work in other peoples' browsers or can't be opened in other programs.
Not sure what's a good solution for this, other than a really good download/sharing UI on images...
-- brion
On Tue, Apr 23, 2013 at 7:45 PM, Brion Vibber brion@pobox.com wrote:
Not sure what's a good solution for this, other than a really good download/sharing UI on images...
Maybe the most obvious solution would be for Mozilla, Microsoft, and the usual bunch of image editors to start supporting WebP. There are many opportunities for small tools that can help make the transition smooth, including "Save this JPEG as WebP", "Save this WebP as JPEG", and, most importantly, "Save the uncompressed version of this thumbnail instead".
Mathias
On Mon, Apr 22, 2013 at 2:17 PM, Denny Vrandečić < denny.vrandecic@wikimedia.de> wrote:
That looks like a cool idea.
I am trying to experiment it on a few pages, and it seems to considerably reduce the number of web requests (for https://en.wikipedia.org/wiki/Vincent_van_Gogh it goes from 120 to under 40 requests).
Note that this could be even less if I added the Wikipedia chrome (logo, icons, etc.) as data URLs. However, since these should be in the local cache after the first page view, it would be quite wasteful to transmit them again on each page view, as base64 no less.
Looks good. Makes me wish H.264 wasn't so loaded down with patents. Then we'd maybe have an even better codec.
--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerromeo@gmail.com
On Mon, Apr 22, 2013 at 9:38 AM, Magnus Manske magnusmanske@googlemail.com wrote:
On Mon, Apr 22, 2013 at 2:17 PM, Denny Vrandečić < denny.vrandecic@wikimedia.de> wrote:
That looks like a cool idea.
I am trying to experiment it on a few pages, and it seems to considerably reduce the number of web requests (for https://en.wikipedia.org/wiki/Vincent_van_Gogh it goes from 120 to under 40 requests).
Note that this could even be less if I added the Wikipedia chrome (logo, icons etc.) as data URLs. However, since these should be in local cache after the first page view, it would be quite wasteful to transmit them on each page view again, as base64 no less.