On Sat, Aug 1, 2009 at 11:04 AM, Gregory Maxwell gmaxwell@gmail.com wrote:
On Sat, Aug 1, 2009 at 12:17 PM, Brian Brian.Mingus@colorado.edu wrote:
A reasonable estimate would require knowledge of how much free video can be automatically acquired, its metadata automatically parsed, and then automatically uploaded to Commons. I am aware of some massive archives of free content video. Current estimates based on images do not necessarily apply to video, especially as we are just entering a video-aware era of the internet. At any rate, while Gerard's estimate is a bit optimistic in my view, it seems realistic for the near term.
So— The plan is that we'll lose money on every transaction but we'll make it up in volume?
There are always tradeoffs. If I understand wiki@home correctly, it is also intended to be run @foundation: it works just as well for distributing transcoding over the foundation cluster as it does for distributing it to disparate clients. Thus, if the foundation encounters a CPU backlog and wishes to maintain real-time operation of the site, it could hand some long-running jobs to @home clients in exchange for bandwidth. Through this method the foundation could handle transcoding spikes of arbitrary size: in the case of a spike, @foundation can do a first-pass, get-something-back-to-the-user-now encode and pass the remaining work to @home.
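
To make that concrete, here is a minimal sketch of the kind of dispatch logic I mean (Python, with every name and threshold invented for illustration; the actual wiki@home job format would of course differ). Jobs stay on the foundation cluster until a backlog threshold is hit; after that only a quick first pass runs locally and the remaining passes are queued for @home clients:

    # Hypothetical sketch, not the actual wiki@home design: a dispatcher that
    # runs transcode jobs on the foundation cluster until a CPU backlog
    # threshold is reached, then spills the remaining passes to @home clients.
    from collections import deque
    from dataclasses import dataclass, field

    MAX_CLUSTER_BACKLOG = 50  # assumed threshold: queued jobs before offloading

    @dataclass
    class TranscodeJob:
        title: str
        passes: list = field(default_factory=lambda: ["fast-preview", "high-quality"])

    cluster_queue: deque = deque()  # jobs the foundation cluster runs itself
    at_home_queue: deque = deque()  # jobs handed out to @home volunteer clients

    def dispatch(job: TranscodeJob) -> None:
        """Decide where each encoding pass of a job should run."""
        if len(cluster_queue) < MAX_CLUSTER_BACKLOG:
            # No backlog: the cluster handles every pass itself.
            cluster_queue.append((job.title, job.passes))
        else:
            # Spike: do a quick first pass locally so the user gets something
            # back now, and defer the remaining passes to @home clients.
            first, *rest = job.passes
            cluster_queue.append((job.title, [first]))
            if rest:
                at_home_queue.append((job.title, rest))

    if __name__ == "__main__":
        for i in range(60):
            dispatch(TranscodeJob(title=f"Upload_{i}.ogv"))
        print(len(cluster_queue), "cluster jobs;", len(at_home_queue), "offloaded to @home")

The point is only that the same queue can feed either the cluster or volunteer clients, so offloading becomes a policy decision rather than a separate system.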