[FFmpeg-devel] Apple HTTP Live Encoding Support
Fri Feb 25 23:48:44 CET 2011
Apple introduced HTTP Live Streaming several years ago. It's a streaming format
over plain HTTP that features adaptive bit rates and live streaming.
It's now the only Apple-blessed way to stream video to an iOS device.
Android 3.0 will have HTTP Live support, and at least one Flash player is
working on support.
The format itself is fairly simple. It essentially consists of a
manifest file (an M3U8 playlist) that lists URLs for MPEG-2 Transport
Stream segments containing AAC audio and H.264 video.
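For reference, a minimal manifest for a finished stream looks roughly
like the following (the segment URLs and the 10-second duration are
made up for illustration):

```
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXTINF:10,
http://example.com/segment0.ts
#EXTINF:10,
http://example.com/segment1.ts
#EXTINF:10,
http://example.com/segment2.ts
#EXT-X-ENDLIST
```

The client fetches the playlist, downloads the listed segments in
order, and re-fetches the playlist periodically when the stream is
live.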
ffmpeg currently supports reading an HTTP Live stream and decoding the
media. Apple distributes tools for encoding an HTTP Live stream
(mediafilesegmenter, mediastreamsegmenter); these are closed source
and only run on Mac OS X.
There are some open source stream segmenters out there; all seem to be
originally based on http://svn.assembla.com/svn/legend/segmenter/.
My company (Animoto) also now has a version based on the original
segmenter. I think that it's fair to say that each has a different
feature set.
I believe that it would be best for ffmpeg to support encoding HTTP
Live directly because:
* All of the open source stream segmenters use ffmpeg's code base
(libavformat) to chunk streams; they're currently all essentially very
stripped-down versions of ffmpeg.
* Putting this into ffmpeg will centralize and "standardize" HTTP Live
encoding with respect to the open source community, so bugs will be
flagged and fixed more easily.
* Encoding solutions will perform better. Rather than encoding an
MPEG-2 Transport Stream with ffmpeg and piping it to a separate
program (the segmenter) that must re-parse the stream and produce the
HTTP Live output, ffmpeg will write the HTTP Live stream directly.
At minimum, I believe that ffmpeg should handle the stream
segmentation; creating the manifest file could potentially be
offloaded to a separate program/script.
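To make the division of labor concrete, here is a minimal sketch of
what the two pieces do. This is not ffmpeg code: the function names
are invented, and it cuts segments after a fixed number of 188-byte TS
packets purely for illustration, whereas a real segmenter (or a future
libavformat muxer) would cut on keyframe boundaries.

```python
TS_PACKET_SIZE = 188  # every MPEG-2 Transport Stream packet is 188 bytes


def segment_ts(data, packets_per_segment):
    """Split a TS byte stream into segments of N whole packets each."""
    segment_size = TS_PACKET_SIZE * packets_per_segment
    return [data[i:i + segment_size]
            for i in range(0, len(data), segment_size)]


def write_manifest(segment_names, target_duration):
    """Build an HTTP Live manifest (M3U8 playlist) for the segments."""
    lines = ["#EXTM3U", "#EXT-X-TARGETDURATION:%d" % target_duration]
    for name in segment_names:
        lines.append("#EXTINF:%d," % target_duration)
        lines.append(name)
    lines.append("#EXT-X-ENDLIST")
    return "\n".join(lines) + "\n"


if __name__ == "__main__":
    stream = bytes(TS_PACKET_SIZE * 10)   # fake stream: 10 empty TS packets
    segments = segment_ts(stream, 4)      # cut every 4 packets -> 4 + 4 + 2
    names = ["segment%d.ts" % i for i in range(len(segments))]
    print(len(segments))                  # 3
    print(write_manifest(names, 10))
```

The segmentation step is the part that needs libavformat (it has to
understand the stream it is cutting); the manifest step is trivial
text generation, which is why it could live in a separate script.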
I can start working on this, if people think that it's reasonable.