On Mar 14, 2014, at 12:02 PM, Brion Vibber <brion@pobox.com> wrote:
I don’t like the idea of an extra encoding pass on all content that becomes part of the Wikimedia archive. I would strongly recommend we work with a partner to upload the original asset into an archive, and have the service re-encode on ingest, with a metadata mapping.
Maybe it’s less exciting than cross-compiling an encoder into JavaScript ;) but we should think long term and work to preserve the original-quality encodes: in theory the patents will expire, and/or we will want to re-encode into whatever is free and popular at that point in time.
ffmpeg has been compiled to JS with a nice little library package, but I don’t think it’s a good idea archive-wise. Not to mention it has much more patent exposure for the users. Instead of legally hosting two or three decoders on WMF, or using decoders that are already on users’ systems, we would be pushing out separate, patented software runtimes for decoders to thousands of users? Firefogg was already hosted on a separate domain for this reason.
Thinking ahead, imagine ORBX.js gains acceptance, or Apple buys into Daala as it delivers on the improvements they are targeting. We would then have three passes on all content, getting pretty distant from the H.264 captured by the device.
We are not dealing with mezzanine files; devices already use relatively aggressive, low bitrates on consumer-captured video assets.
The user should upload the source asset; the server should do the encoding.
The Internet Archive would be a good partner.
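To make the ingest side concrete, here is a rough Python sketch of what I have in mind: keep the original upload untouched, derive free-format renditions server-side with ffmpeg, and keep a metadata mapping back to the source. The paths, the derivative profile, and the mapping layout are made up for illustration; this is not an actual WMF or archive.org API.

#!/usr/bin/env python3
# Sketch of the proposed ingest flow: store the original as-is,
# derive free-format transcodes server-side, record a mapping
# from derivatives back to the source.
import json
import subprocess
from pathlib import Path

ORIGINAL = Path("uploads/clip-original.mp4")   # device H.264, kept untouched
DERIV_DIR = Path("derivatives")
DERIV_DIR.mkdir(exist_ok=True)

def transcode_webm(src: Path, height: int) -> Path:
    """Derive one free-format (VP8/Vorbis) rendition with ffmpeg."""
    out = DERIV_DIR / f"{src.stem}.{height}p.webm"
    subprocess.run([
        "ffmpeg", "-y", "-i", str(src),
        "-vf", f"scale=-2:{height}",
        "-c:v", "libvpx", "-b:v", "1M",
        "-c:a", "libvorbis",
        str(out),
    ], check=True)
    return out

derivatives = [transcode_webm(ORIGINAL, h) for h in (360, 480)]

# The mapping always points back at the untouched original, so future
# codecs can be derived from the best available source, not a re-encode.
mapping = {
    "original": str(ORIGINAL),
    "derivatives": [str(p) for p in derivatives],
}
(DERIV_DIR / "mapping.json").write_text(json.dumps(mapping, indent=2))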
—michael