Simple AV pipeline stuck in prerolling state (mp4/h264/aac)

Tim-Philipp Müller t.i.m at zen.co.uk
Mon Nov 19 02:52:43 PST 2012


On Mon, 2012-11-19 at 15:47 +0530, Mandeep Sandhu wrote:
> Hi,

> 
> I have a pretty basic question. I'm trying to create a simple pipeline
> for playing an mp4 (h264/aac) file as follows:
> 
> $ gst-launch-0.10 filesrc location=/path/to/video.mp4 ! qtdemux
> name=qtd ! ffdec_h264 ! autovideosink async-handling=true qtd. !
> ffdec_aac ! autoaudiosink
> 
> Setting pipeline to PAUSED ...
> Pipeline is PREROLLING ...

You need a queue in each branch after qtdemux (decodebin handles this
internally by plugging a multiqueue; note that the default size limits
of 'queue' may be too small for some files).
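
For your pipeline that would look something like this (untested, but
along these lines):

  $ gst-launch-0.10 filesrc location=/path/to/video.mp4 ! qtdemux name=qtd ! \
      queue ! ffdec_h264 ! autovideosink \
      qtd. ! queue ! ffdec_aac ! autoaudiosink

(Note that async-handling=true on the video sink has been dropped; it is
not what's needed here.)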

The reason for this is the preroll mechanism. A sink blocks once it has
received a buffer, and the pipeline as a whole only finishes prerolling /
reaches PAUSED state once every sink has prerolled a buffer. Without
queues, the demuxer pushes a packet out on one pad, the decoder outputs
a decoded buffer, and the sink prerolls and blocks. Since there is no
queue to decouple the branches, this blocks the demuxer's streaming
thread, so the demuxer can never push a buffer to the other branch, and
that branch's sink never prerolls.
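
Also, if the audio and video data in the file are interleaved in large
chunks, the default size limits of 'queue' may still be too small for
both branches to preroll. You can raise the limits, or disable one by
setting it to 0 (at the risk of unbounded memory use), e.g.:

  ... ! qtdemux name=qtd ! queue max-size-buffers=0 max-size-bytes=0 \
      max-size-time=0 ! ffdec_h264 ! ...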

Cheers
 -Tim

> This opens up a window showing the first frame of the video and then
> stays paused there. If the audio sink is removed from the pipeline,
> the video plays fine.
> 
> I know I can use other bins like playbin/decodebin2 to play the file,
> but I'm trying out individual elements for my own understanding's
> sake.
> 
> I understand that PREROLLING means that the pipeline is 'preparing'
> itself for processing the data. So what is stopping the audio sink
> from getting it (I presume that's the element causing the pipeline
> stall)?
> 
> I have read on some forums that setting "async-handling=true" in the
> video sink will fix this problem. However, that did not help (or maybe
> I'm not setting it at the right place).
> 
> Also, this is slightly unrelated, but the syntax for setting up a
> pipeline (with individual elements) for audio AND video seems a little
> weird. The way to do it, as I understand from other people's examples,
> is to give a name to the demux element, then refer back to that name
> with a dot suffix after the video sink, and construct the audio branch
> from there. Is this how it's supposed to be done? Is there a doc which
> explains this more clearly?
> 
> Thanks for your time.
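
As to the syntax question: yes, that's how it's supposed to be done.
'name=qtd' gives the element a name, and 'qtd.' later refers back to
that element so a second branch can be linked from another of its pads.
You can also pick a specific pad by name, e.g. 'qtd.audio_00' (check the
actual pad names with gst-inspect; audio_00/video_00 is what qtdemux
uses in 0.10). Untested, but along these lines:

  $ gst-launch-0.10 filesrc location=/path/to/video.mp4 ! qtdemux name=qtd \
      qtd.video_00 ! queue ! ffdec_h264 ! autovideosink \
      qtd.audio_00 ! queue ! ffdec_aac ! autoaudiosink

The syntax is documented in the gst-launch(1) man page.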



