On Fri, Apr 24, 2009 at 1:22 PM, Roan Kattouw roan.kattouw@gmail.com wrote:
The problem here seems to be that thumbnail generation times vary a lot, based on format and size of the original image. It could be 10 ms for one image and 10 s for another, who knows.
Is it really necessary for any image to take 10s to thumbnail? I guess this would only happen for very large images -- perhaps we could make sure to cache an intermediate-sized thumbnail as soon as the image is uploaded, and then scale that down synchronously on request, which should be fast. Similarly, if specific image features (progressive JPEG or whatever) make images much slower to thumbnail, an intermediate version can be automatically generated on upload without those features. Of course you'd see a little loss in quality from the double operation, but it seems like a more robust solution than trying to use JavaScript.
I'm not an expert on image formats, however, so maybe I'm misunderstanding our options.
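To be concrete, here's roughly the sort of thing I'm picturing, shelling out to ImageMagick's convert (the 1024px size, the paths and the function names are all just made up for illustration; the real thumbnailing code obviously does more than this):

// Run once at upload time: cache a medium-sized copy so later requests
// never have to touch the full-resolution original.
function cacheIntermediateThumb( $originalPath, $intermediatePath ) {
    $cmd = 'convert ' . escapeshellarg( $originalPath ) .
        ' -resize 1024x1024 ' . escapeshellarg( $intermediatePath );
    shell_exec( $cmd );
}

// On a thumbnail request, scale the cached intermediate down synchronously;
// that should be fast no matter how huge the original was.
function thumbFromIntermediate( $intermediatePath, $thumbPath, $width ) {
    $cmd = 'convert ' . escapeshellarg( $intermediatePath ) .
        ' -resize ' . intval( $width ) . ' ' . escapeshellarg( $thumbPath );
    shell_exec( $cmd );
}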
AFAICT this isn't about optimization, it's about not bogging down the Apache that has the misfortune of getting the first request to thumb a huge image (but having a dedicated server for that instead), and about not letting the associated user wait for ages.
"Not letting the associated user wait for ages" is called "making it faster", which I'd say qualifies as optimization. :)
Even worse, requests that thumb very large images could hit the 30s execution limit and fail, which means those thumbs will never be generated, but every user requesting them will have a request that lasts 30s and then times out.
max_execution_time applies only to the time that PHP actually spends executing. If it's sleeping on a network request, it will never be killed for reaching the max execution time. Try running this code:
ini_set( 'max_execution_time', 5 );
error_reporting( E_ALL | E_STRICT );
ini_set( 'display_errors', 1 );
// The network wait here doesn't count toward the limit:
file_get_contents( 'http://toolserver.org/~simetrical/tmp/delay.php?len=10' );
echo "Fetched long URL!";
// The busy loop burns actual CPU time, so this is what hits the limit:
while ( true );
It will fetch the URL (which takes ten seconds), then die only after the while ( true ) has run for about five seconds. The same goes for long database queries, etc. I imagine PHP goes by the OS's accounting of user/system CPU time rather than real (wall-clock) time.
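If anyone wants to check that guess, something like this should show the wall clock advancing while the CPU clock barely moves during the network wait; getrusage() gives you the user/system times directly:

$before = getrusage();
$wallStart = microtime( true );
file_get_contents( 'http://toolserver.org/~simetrical/tmp/delay.php?len=10' );
$after = getrusage();
$cpu = ( $after['ru_utime.tv_sec'] - $before['ru_utime.tv_sec'] )
    + ( $after['ru_utime.tv_usec'] - $before['ru_utime.tv_usec'] ) / 1e6
    + ( $after['ru_stime.tv_sec'] - $before['ru_stime.tv_sec'] )
    + ( $after['ru_stime.tv_usec'] - $before['ru_stime.tv_usec'] ) / 1e6;
// Wall time will be ~10 s; user+system CPU time should be close to zero,
// which is presumably what max_execution_time is counting.
printf( "wall: %.2fs, cpu: %.2fs\n", microtime( true ) - $wallStart, $cpu );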
Plus, the idea is apparently for this to not be done by the server at all, but by the client, so there will be no latency for the overall page request anyway. The page will load immediately, only the images will wait if there's any waiting to be done.
On Fri, Apr 24, 2009 at 1:46 PM, Brion Vibber brion@wikimedia.org wrote:
One suggestion that's been brought up for large images is to create a smaller version *once at upload time* which can then be used to quickly create inline thumbnails of various sizes on demand. But we still need some way to manage that asynchronous initial rendering, and have some kind of friendly behavior for what to show while it's working.
That's what occurred to me. In that case, the only possible thing to do seems to be to just have the image request wait until the image is thumbnailed. I guess you could show a placeholder image, but that's probably *less* friendly to the user, as long as we've specified the height and width in the HTML. The browser should provide some kind of placeholder already while the image is loading, after all, and if we let the browser provide the placeholder, then at least the image will appear automatically when it's done thumbnailing.
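Concretely, the page would just contain something along these lines; $file and getThumbUrl() are stand-ins for whatever actually knows the original's dimensions and the thumb URL, not real method names:

// The dimensions come from the image table, so we know them even before
// the thumb file exists; the browser reserves a correctly sized box and
// the image pops in whenever the (possibly slow) thumb request finishes.
$thumbWidth  = 220;
$thumbHeight = (int)round( $file->getHeight() * $thumbWidth / $file->getWidth() );
echo '<img src="' . htmlspecialchars( $file->getThumbUrl( $thumbWidth ) ) . '"' .
    ' width="' . $thumbWidth . '" height="' . $thumbHeight . '" alt="" />';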