[Foundation-l] Alternative approach for better video support

Daniel Arnold arnomane at gmx.de
Mon Jul 23 15:44:51 UTC 2007


On Monday 23 July 2007 15:48:24 Anthony wrote:
> Well, you suggest that cost of bandwidth is the reason, but that
> contradicts what the developers have been consistently saying - that
> there is no bandwidth problem.

Well, according to my rough estimate the Wikimedia media repository has a size 
of about 500 GB.

Now assume you can download that with your new 10 Mbit/s DSL line at home at 
full speed: you'd need almost 5 days until the download is complete (and 
although you might have a flat rate, I am pretty sure your ISP has some fair 
use policy and will make trouble sooner or later).

Now let us assume a somewhat more likely download rate of 200 kB/s (you know, 
concurrent downloaders and other things). You'd need about 30 days for a 
complete download.
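
If you want to check those numbers yourself, here is the back-of-the-envelope 
calculation as a small Python sketch (the 500 GB figure is just my rough 
estimate from above):

    # Rough download-time estimate for a ~500 GB media repository.
    REPO_SIZE_BYTES = 500 * 1000**3  # my rough 500 GB estimate

    def days_to_download(rate_bytes_per_sec):
        seconds = REPO_SIZE_BYTES / rate_bytes_per_sec
        return seconds / 86400  # seconds per day

    print(days_to_download(10e6 / 8))  # 10 Mbit/s = 1.25 MB/s -> ~4.6 days
    print(days_to_download(200e3))     # 200 kB/s -> ~29 days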

You're mixing up several things: Wikimedia has no bandwidth problem for its 
_current_ operation, but you shouldn't forget that bandwidth alone is only 
half the truth. There is transfer volume as well, and transfer volume costs 
money right now. Money from many, many donors. If we reject a material 
donation that reduces our transfer costs and thus makes better use of our 
monetary donations, we'd be pretty stupid and, furthermore, we'd waste donated 
money for no good reason.

So if we're going to provide video and audio streaming on a large scale, we 
will have greatly increased transfer costs.

> Yes, you present one solution to a problem which the foundation says
> it doesn't have.

Sorry, but you are mixing up several things. See above. If the Foundation 
didn't have a problem with media file database downloads and large-scale video 
streaming, you'd implicitly be assuming that they are lazy people who just 
don't want to provide the media repository for some evil reason...

> There are lots of other solutions too, once we get 
> past the step of identifying exactly why there is no tarball for all
> commons images.

Sorry to be so unfriendly, but you simply have no clue. HTTP and FTP simply 
weren't designed for downloading moving targets the size of 500 GB! Rsync is 
not just some random geek software nobody else uses. Rsync exists for a very 
good reason: use cases like ours, namely the synchronisation of huge amounts 
of constantly changing data between a master and a mirror slave.
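
Just to illustrate what such a setup could look like (a minimal sketch only; 
the rsync source module and the local path below are made up, Wikimedia 
offers no such public rsync endpoint today), a mirror operator's sync job 
might be as simple as:

    # Minimal mirror-sync sketch. The rsync module "commons" and the local
    # path are hypothetical examples, not an existing Wikimedia service.
    import subprocess

    subprocess.run(
        [
            "rsync",
            "-a",        # archive mode: preserve times, permissions, etc.
            "-v",        # verbose output
            "--delete",  # drop local files that were deleted on the master
            "rsync://upload.wikimedia.org/commons/",  # hypothetical source
            "/srv/mirror/commons/",                   # local mirror directory
        ],
        check=True,  # raise an error if rsync fails
    )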

My suggested solution (I can only recommend reading the initial post again) is 
an approach that solves both the streaming issue and the media download without 
even requiring ordinary Wikipedia readers to use "new" software like Rsync.

> Who would be a good central point of contact at the WMF for answering this?

These people are reading here (most of the time carefully ;-).

Arnomane