Gzipping does, for the bulk of user agents, pretty drastically reduce the bandwidth consumed by non-thumbnail resources.
To follow up on our face-to-face discussions, would it be possible to do both? That is:
1. Always pare down the thumbnail file size. This cuts the initial page footprint for everyone without sacrificing thumbnails altogether (i.e., without requiring the user to tap a "click to view" link) on low-JS devices.
2. If the user is on a non-zero-rated network and has higher-JS support, trigger retrieval of the more bandwidth-intensive image when the user nears the thumbnail (rough sketch below).
Because step #1 saves bandwidth, the impact of #2 on bandwidth consumption is effectively minimized.
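To make #2 concrete, here is a rough sketch of what the client-side upgrade could look like. It assumes each thumbnail is emitted as a qlow image carrying its full-resolution URL in a hypothetical data-full-src attribute, and that a hypothetical wgZeroRated config flag identifies zero-rated connections - neither of those exists today, they are just illustrative names:

// Rough sketch only. "data-full-src" and "wgZeroRated" are illustrative
// assumptions, not existing attributes or config variables.
( function () {
  if ( mw.config.get( 'wgZeroRated' ) ) {
    return; // zero-rated users keep the low-quality thumbnails
  }

  var pending = Array.prototype.slice.call(
    document.querySelectorAll( 'img[data-full-src]' )
  );

  function nearViewport( img ) {
    var rect = img.getBoundingClientRect();
    // Start the download one screen early so the image is ready on arrival.
    return rect.top < window.innerHeight * 2 && rect.bottom > -window.innerHeight;
  }

  function upgradeNearbyImages() {
    pending = pending.filter( function ( img ) {
      if ( !nearViewport( img ) ) {
        return true; // keep waiting
      }
      img.src = img.getAttribute( 'data-full-src' );
      return false;
    } );
    if ( !pending.length ) {
      window.removeEventListener( 'scroll', upgradeNearbyImages );
    }
  }

  window.addEventListener( 'scroll', upgradeNearbyImages );
  upgradeNearbyImages();
}() );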
-Adam
On Mon, Jun 9, 2014 at 5:01 PM, Jon Robson jdlrobson@gmail.com wrote:
I'd rather we didn't serve poor quality images to all our users; that is a poorer user experience in my opinion. If I understand correctly, this change was more to encourage more providers to join Zero by offering the incentive that Zero users use less data. I'm still not convinced there is a huge benefit to the users themselves once you consider gzipping etc. Have you benchmarked and documented how this change affects load time? Pulling in Ori and Aaron, since they should have expertise in this area.
A lot of browsers these days allow you to turn off images altogether, and I think a user would rather do that than receive poorer quality images. To me it's a binary switch: no images, or images... On retina displays we actually go the opposite direction and pull in better quality images. I would hazard a guess that the issue here is the number of HTTP requests rather than the size of the images.
I think if we wanted to invest any time in this sort of thing, we should explore deferring the load of images until they are visible (we dabbled with this when we explored lazy loading of sections) [1]. It would be interesting to rewrite any image after the first heading to be a link to the image and pull it in via JavaScript when it is scrolled into view. I think this would give us more bang for our buck...
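A minimal sketch of that idea, assuming the parser (or an output transform) has already turned each post-first-heading image into a plain link marked with a hypothetical lazy-image class whose href points at the image file:

// Rough sketch only. "lazy-image" is an illustrative class name for
// placeholder links that should become real images once scrolled into view.
function loadVisibleImages() {
  var links = document.querySelectorAll( 'a.lazy-image' );
  Array.prototype.forEach.call( links, function ( link ) {
    var rect = link.getBoundingClientRect();
    if ( rect.top < window.innerHeight && rect.bottom > 0 ) {
      var img = document.createElement( 'img' );
      img.src = link.getAttribute( 'href' );
      img.alt = link.textContent;
      link.parentNode.replaceChild( img, link );
    }
  } );
}

window.addEventListener( 'scroll', loadVisibleImages );
loadVisibleImages();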
On a side note, I notice all images on mobile are missing a cache expiration - is that intended? Also, have we considered adding a Cache-Control: public header to them?
[1] http://24ways.org/2010/speed-up-your-site-with-delayed-content/

On 9 Jun 2014 11:45, "Tomasz Finc" tfinc@wikimedia.org wrote:
Thanks Yuri,
CC'ing Multimedia team
Maryana, this could be something interesting for the Mobile Web team to look at to optimize image delivery.
Have you guys done any perf work around images?
--tomasz
On Thu, Jun 5, 2014 at 4:10 PM, Yuri Astrakhan yastrakhan@wikimedia.org wrote:
The reduced-quality images are now live in production. To see it for yourself, compare the original with the low-quality version (253KB => 99.9KB, a 60% reduction).
The quality reduction is triggered by adding "qlow-" in front of the file name's pixel size.
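For example, with a hypothetical file name:

  220px-Example.jpg        -> standard thumbnail
  qlow-220px-Example.jpg   -> reduced-quality thumbnail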
Continuing our previous discussion, we now need to figure out how best to use this feature. As covered before, there are two main approaches:
- JavaScript rewrite - dynamically change the <img> tag based on network/device/user preference conditions. Issues may include multiple downloads of the same image (if the browser starts the download before JS runs) and parser cache fragmentation.
- Varnish-based rewrite - Varnish decides which image to serve under the same URL. This approach requires Varnish to know everything needed to make a decision.
Zero plans to go the first route, but if we make it mobile-wide, or even site-wide, all the better.
Mobile-l mailing list Mobile-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mobile-l