Hoi, At the moment we have an upper limit of 100 MB. The people who do restorations have one file that is 680 MB, and the corresponding JPG is also quite big! Thanks, GerardM
2009/4/24 Roan Kattouw <roan.kattouw@gmail.com>
2009/4/24 Aryeh Gregor <Simetrical+wikilist@gmail.com>:
How long does it take to thumbnail a typical image, though? Even a parser cache hit (but Squid miss) will take hundreds of milliseconds to serve and hundreds more milliseconds for network latency. If we're talking about each image adding 10 ms to the latency, then it's not worth it to add all this fancy asynchronous stuff.
The problem here seems to be that thumbnail generation times vary a lot based on the format and size of the original image. It could be 10 ms for one image and 10 s for another; who knows.
Moreover, in MediaWiki's case specifically, *very* few requests should actually require thumbnailing. Only the first request for a given size of a given image should ever require thumbnailing: that result can then be cached more or less forever.
That's true, we're already doing that.
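To illustrate what "that" amounts to: render a thumb only on the first miss for a given (image, width) pair and keep the file around indefinitely. This is a minimal Python sketch, not MediaWiki's actual thumbnailing code; the THUMB_DIR path and the use of Pillow are assumptions made for the example.

    import os
    from PIL import Image              # assumption: Pillow is available

    THUMB_DIR = "/srv/thumbs"          # hypothetical cache directory

    def get_thumb(original, width):
        # One cached file per (image, width); only the very first request
        # for that size pays the rendering cost, every later one is a
        # plain file hit.
        name = os.path.basename(original)
        thumb = os.path.join(THUMB_DIR, "%dpx-%s" % (width, name))
        if not os.path.exists(thumb):
            img = Image.open(original)
            img.thumbnail((width, 10**9))   # scale down to the requested width
            img.save(thumb)
        return thumb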
So it's not a good case to optimize for.
AFAICT this isn't about optimization; it's about not bogging down the Apache that has the misfortune of getting the first request to thumb a huge image (and having a dedicated server for that instead), and about not making the associated user wait for ages. Even worse, requests that thumb very large images could hit the 30s execution limit and fail, which means those thumbs will never be generated, yet every user requesting them will have their request last for 30s and then time out.
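Roughly, the split would be: the web-facing side only checks the cache and enqueues a job, and a worker on the dedicated box does the slow rendering, so no web request ever blocks on it. This is a hypothetical Python sketch, not an existing MediaWiki or Wikimedia component; the in-process queue, THUMB_DIR and the ImageMagick convert invocation are assumptions standing in for the real infrastructure.

    import os
    import queue
    import subprocess
    import threading

    THUMB_DIR = "/srv/thumbs"          # hypothetical cache directory
    thumb_jobs = queue.Queue()         # stands in for a shared job queue

    def thumb_path(original, width):
        name = os.path.basename(original)
        return os.path.join(THUMB_DIR, "%dpx-%s" % (width, name))

    def handle_request(original, width):
        # Web-facing side: never renders anything itself, so it cannot hit
        # the 30s execution limit no matter how big the original is.
        thumb = thumb_path(original, width)
        if os.path.exists(thumb):
            return thumb                      # already rendered: the common case
        thumb_jobs.put((original, width))     # first request: hand off and return
        return None                           # caller serves a placeholder for now

    def render_worker():
        # In reality this would run on the dedicated render box, free of any
        # per-request time limit; a thread stands in for it here.
        while True:
            original, width = thumb_jobs.get()
            subprocess.run(["convert", original, "-thumbnail", "%dx" % width,
                            thumb_path(original, width)], check=True)
            thumb_jobs.task_done()

    threading.Thread(target=render_worker, daemon=True).start()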
Roan Kattouw (Catrope)