queue pipeline and bin setup question
t.i.m at zen.co.uk
Sat Dec 22 03:27:41 PST 2012
On Fri, 2012-12-21 at 12:29 -0800, mattes wrote:
> I have the following that works with gst-launch:
> v4l2src ! videorate ! queue ! x264enc name=venc \
> alsasrc ! queue ! audioconvert ! faac name=aenc \
> qtmux name=mux ! filesink location=media.mp4 \
> aenc. ! mux. venc. ! mux.
> I managed to get this running with the following code:
> gst_bin_add_many( GST_BIN(pipeline), vsourcE, vcapsE, vqueuE, vencE,
> nqueuE, asourcE, aqueuE, aconvertE, aencE,
> mqueuE, muxerE, sinkE, NULL);
> gst_element_link_many( vsourcE, vcapsE, vqueuE, vencE, muxerE, NULL );
> gst_element_link_many( asourcE, aqueuE, aconvertE, aencE, muxerE, NULL );
> gst_element_link( muxerE, sinkE );
Note that there's also gst_parse_launch() and
gst_parse_bin_from_description() for convenience.
> I checked with the Gstreamer manual and read about bin, pipeline and some
> questions arise.
> What is the advantage having the video chain (v4l2src -> video encoder)
> and the audio chain (alsasrc -> audio encoder) to run inside a bin?
> (sorry 'chain' might not be the correct term !!)
> Are there performance advantages?
Chain is fine, I think.
In your pipeline above, there is only one bin, and that is the top-level
pipeline itself.
So the answer to your question might be: because there must always be a
top-level GstPipeline (which derives from GstBin). Elements get added to
that.
You can also have normal non-pipeline bins that are not at the
top-level. Those are often used to group elements into easier-to-manage
units. But that's not the case here as far as I can tell.
> I was struggling to express this "aenc. ! mux. venc. ! mux."
> into c-code until I came to realize that the above C-lines just do fine.
> Is that the correct way linking the audio and video chain?
> Even though it seems to work, are there any advantages doing it differently?
Not really. Maybe clarity in the code (you could explicitly get the right
request pad by template name), and less noise in the debug log.
> Do the above c lines automatically setup a 'queue' for each
> chain (video, audio and muxer) ?
There are explicit queue elements for each chain in the pipeline above,
so yes, but there's nothing 'automatic' about it.
The reason to have a queue before the encoders here is to make sure the
sources are free to capture video/audio. Encoders might take a bit too
long at times and block the sources for a bit too, and then things might
not be captured. (This could still happen of course, but at least
there's some buffering now; so as long as the system is capable of
keeping up overall it should be fine, and spikes in encoder cpu usage
won't hurt).
> I was searching for examples and I found decoding examples, suggesting this:
> gst_element_link_many(audio_queue, audio_dec, audio_sink, NULL );
> Trying this myself:
> gst_element_link_many( nqueuE,asourcE,aqueuE,aconvertE,aencE,muxerE,NULL);
> and I get an error (FALSE is returned).
> Can't you use a queue element in front of a source element?
No; the order in which elements are passed to the _link() functions matters.
Sources only have a source pad, so you can't have anything 'in front of'
them. Sources like this spawn their own streaming thread to push data
(much like a queue does for its source pad).
The reason for queues after pipeline splits (like after a demuxer or tee
element) is a different one. It's required for the preroll mechanism to
work properly (where sinks block when they receive the first buffer, and
all sinks need a buffer for the pipeline to go to PAUSED state).
For an encoding pipeline like the above it's different, and the queues
are not strictly needed, just make things work more smoothly.
Btw, there's also a camerabin2 (camerabin in 1.0) element that takes
care of audio/video capture and can record to file on demand, take
snapshots, show a preview, etc.