[gst-devel] Implementing support for Apple HTTP Live Streaming...
Stephen Buck
stephenbuck at mac.com
Wed Jun 9 19:30:09 CEST 2010
I'm currently implementing a GStreamer plugin to handle video playback using Apple's HTTP Live Streaming protocol (http://en.wikipedia.org/wiki/HTTP_Live_Streaming), which is used by the iPhone and iPad, and need some advice on the best way to proceed. The protocol works roughly like this:
1) A client (GStreamer) opens an HTTP URI that points to an .m3u8 file (MIME type = application/vnd.apple.mpegurl). The m3u8 file contains an ordered list of URIs that point to small MPEG TS files (MIME type = video/mpegts), each containing a short fragment of the entire video. Something like:
...
#EXTINF:10, first 10 sec fragment
http://localhost/myvideo/fileSequence0.ts
#EXTINF:10, second 10 sec fragment
http://localhost/myvideo/fileSequence1.ts
#EXTINF:10, third 10 sec fragment
http://localhost/myvideo/fileSequence2.ts
#EXTINF:10, fourth 10 sec fragment
http://localhost/myvideo/fileSequence3.ts
…
2) The client plays the video by reading each of the fragment URIs and pushing them downstream to a pipeline that knows how to demux and decode the MPEG TS content. A typical pipeline might look like this:
souphttpsrc location=http://localhost/myvideo.m3u8 ! livestreamdec ! queue ! mpegtsdemux ! queue ! ffdec_h264 ! queue ! ffmpegcolorspace ! queue ! xvimagesink
or, even simpler by using playbin:
playbin uri=http://localhost/myvideo.m3u8
3) In the case of a fixed-length video, the m3u8 contains URIs for all of the fragments that make up the video. In the case of live video, the m3u8 contains URIs for a relatively small window of the live stream and must be re-read when the playback progresses past the last URI in the m3u8.
I've got many of the pieces working, but I've had to use a few hacks and I'm certain I have not done it the proper way. My design is currently a decoder that extends GstBaseSrc but adds on a single sink pad for the m3u8 mime type. It waits for EOS on the sink pad, but does not pass it downstream. It then parses the m3u8 and starts a task on the src pad that opens the mpeg TS URIs and pushes them downstream. Here are the main problems:
1) The URIs of the mpeg TS files can be relative to the URI of the containing m3u8, which I don't seem to be able to get from souphttpsrc.
2) I have not figured out how to get souphttpsrc to re-read the contents of its assigned URI.
3) My decoder is using libsoup to get the contents of the mpeg TS files, but this doesn't integrate well with the proxy settings, session, etc. used by souphttpsrc. It would be nice if I could use my upstream souphttpsrc to read these child URIs as well.
4) I'm not really sure if I should be a source (seems like a bad idea for compatibility with playbin, but the base functionality is nice), a decoder (decodebin seems to look for this in my element class name), a demuxer, or something else.
The advanced form of this protocol supports dynamic bit-rates by pointing the child URIs to other m3u8 files that have the format described above. There is one child URI for each bit-rate encoding of the video, and the client can switch between them depending on network QoS. Once the above works, this part should be pretty easy.
I'm fairly new to GStreamer and have made a lot of progress, but I'm having trouble with these issues, so any ideas or advice are welcome.