Porting the hlsdemux work in issue 698155 to 1.x

Arnaud Vrac rawoul at gmail.com
Mon Dec 2 02:38:44 PST 2013


On Sun, Dec 1, 2013 at 7:52 PM, Andoni Morales <ylatuya at gmail.com> wrote:
>
> 2013/11/29 Arnaud Vrac <rawoul at gmail.com>
>>
>>
>> While some of the patches would be welcome additions to GStreamer 1.x,
>> I think the choices Andoni made to support multiple renditions are not
>> best suited to the GStreamer architecture. Ideally, all the possible
>> tracks should be exposed as pads, instead of having to change a demuxer
>> property to switch audio or subtitle tracks. The issue with this design
>> is that you need a stream selection API, which has been discussed for
>> some time. Without such an API, all streams would be downloaded at the
>> same time when used in a decodebin environment.
>>
>
> I see it in a completely different way: having to expose all the pads is a
> hack to work around the lack of a proper stream selection API. If a stream
> has video, audio and text, the element should only expose 3 pads, and the
> stream selection API should provide a way to select which alternative
> stream you want to activate (such as a different bitrate or a different
> language), similar to what decodebin does with properties and what the
> element does right now, making it easier to port it to this future API.
>
> Exposing all pads for the audio alternatives means you also have to
> expose pads for each video bitrate (or for each encoding type, as might
> happen in DASH when you have not only several languages but also several
> encoding options, such as AAC or AC3). The other issue with exposing one
> pad for each stream configuration is that you could potentially connect
> to any of the pads and expect to get data, which is not the case as you
> have cached content. You could also try to download the same video stream
> at different bitrates and with different audio languages at the same
> time, making it quite hard to manage internally; that is, by the way, a
> use case that is not described in the spec and shouldn't even be
> considered for this element.
>

I think it would be confusing to mix renditions of multiple streams.
Normally the currently playing stream is determined _only_ by the demuxer,
based on the available bandwidth. The multiple streams should not be
exposed to the user; at most, the user should be able to cap the available
bandwidth (which is already possible with the connection-speed property).
So if the demuxer decides to play the low-bitrate, audio-only stream,
downstream shouldn't see any video pad. You wouldn't be able to play the
video properly anyway because of the bandwidth limitations.
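For reference, capping the bandwidth today looks roughly like this (a sketch only; connection-speed is in kbps on playbin and the adaptive demuxers in 1.x, and the URI here is a placeholder):

```shell
# Tell the pipeline the connection is at most 1 Mbps (value in kbps),
# so the adaptive demuxer never selects a variant above that bitrate.
gst-launch-1.0 playbin uri=https://example.com/stream.m3u8 connection-speed=1000
```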

With that in mind, all renditions of the currently playing stream can be
exposed as pads. The problem is that you only want to download data for
the renditions that are actually being played, which is not possible with
the current decodebin.

Does it make sense?

-- 
Arnaud