On Fri, Feb 22, 2008 at 3:50 PM, Magnus Manske <magnusmanske@googlemail.com> wrote:
On Fri, Feb 22, 2008 at 2:40 PM, Jim Hu <jimhu@tamu.edu> wrote:
On Feb 22, 2008, at 8:26 AM, David Gerard wrote:
On 22/02/2008, Roan Kattouw <roan.kattouw@home.nl> wrote:
Jim Hu wrote:
Could this be done by one of those grid applications like SETI@home? Or would the bandwidth usage outweigh the benefit? I bet a lot of Wikipedia users would install a screensaver that did image resizing for you.
Maybe people should just resize their images before they upload them.
This was a donated image dump. You have grossly missed the point.
and not answered the question. The grid application could download and return the images in batches as .tgz archives; it doesn't have to be one at a time.
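For what it's worth, a minimal sketch in Python of what such a grid client could look like. Everything here is assumed for illustration: the job dispatcher handing out URL batches is hypothetical, and ImageMagick's convert stands in for whatever scaler we'd actually ship. The point is just that returning one .tgz per batch amortizes the transfer overhead over many files.

#!/usr/bin/env python
# Hypothetical grid worker: fetch a batch of originals, thumbnail them
# locally, and return all results as a single .tgz archive.
import os
import subprocess
import tarfile
import tempfile
import urllib.request

def process_batch(image_urls, width=1024):
    workdir = tempfile.mkdtemp()
    thumbs = []
    for url in image_urls:
        name = os.path.basename(url)
        src = os.path.join(workdir, name)
        urllib.request.urlretrieve(url, src)
        dst = os.path.join(workdir, "%dpx-%s" % (width, name))
        # ImageMagick scales to the requested width, keeping aspect ratio.
        subprocess.check_call(["convert", src, "-thumbnail", str(width), dst])
        thumbs.append(dst)
    # One archive for the whole batch instead of one upload per thumbnail.
    archive = os.path.join(workdir, "batch.tgz")
    with tarfile.open(archive, "w:gz") as tar:
        for t in thumbs:
            tar.add(t, arcname=os.path.basename(t))
    return archive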
Or, we could have one dedicated server for long-running jobs. Pre-generate the "usual" thumbnail sizes for each large image. Maybe generate smaller thumbnails from larger ones, or several in one go, so it won't have to load a large image 10 times for 10 thumbnails. Not sure what to do about "unusual" sizes, though; they could be a DoS attack vector.
Magnus
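A rough sketch of that cascade, assuming Pillow purely for illustration (the size list is invented, and a real deployment would more likely drive ImageMagick): the large original is decoded exactly once, and each smaller thumbnail is derived from the previous, larger one.

# Cascade idea: one full-resolution decode, then each resize starts from
# the previous (already reduced) image rather than from the original.
import os
from PIL import Image

USUAL_WIDTHS = [1024, 800, 640, 320, 220, 180, 120]  # invented "usual" sizes

def cascade_thumbs(path):
    img = Image.open(path)
    img.load()  # the only full-resolution decode
    for width in sorted(USUAL_WIDTHS, reverse=True):
        height = max(1, round(img.height * width / img.width))
        img = img.resize((width, height), Image.LANCZOS)
        img.save("%dpx-%s" % (width, os.path.basename(path)))

The trade-off is that repeated resampling compounds a little quality loss on the smallest sizes, in exchange for never re-reading a huge PNG ten times.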
Just checked: there are 51,811 oversized PNGs on Commons, with a total size of 44 GB. It takes hemlock about 30 seconds to generate a 1024px thumbnail from a large file; on a server not crowded with 150 users, the best case is probably around 20 seconds per file. At that rate it would take roughly two to three weeks to generate thumbnails for all of those files. Those thumbnails could then be used to generate the smaller sizes.
Bryan
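Spelling that estimate out as a quick check (no new assumptions, just the numbers above, processed serially on one box):

# 51,811 files at 20-30 s each, one at a time.
files = 51811
for secs in (20, 30):
    days = files * secs / 86400.0
    print("%2d s/file -> %4.1f days (%.1f weeks)" % (secs, days, days / 7))
# 20 s/file -> 12.0 days (1.7 weeks)
# 30 s/file -> 18.0 days (2.6 weeks)

So at the observed speed a serial run lands around two and a half weeks, and under two weeks in the best case.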