Best way to concatenate and play streams
Ralph
ralph.gucwa at racelogic.co.uk
Thu Mar 21 09:30:14 PDT 2013
Normally you don't need any queues.
You need two separate gnlcomposition chains in your pipeline: one for audio
and another for video, because you can't mix audio and video in a single
gnlcomposition object. This is what the whole pipeline should look like:
gnlfilesource \
gnlfilesource - gnlcomposition - audioconvert - audioresample - volume - autoaudiosink
...           /

gnlfilesource \
gnlfilesource - gnlcomposition - ffmpegcolorspace - autovideosink
...           /
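As a rough sketch of how the video branch above could be assembled (the
element names, file paths, and clip timings here are my own examples, and
the "location"/"start"/"duration"/"media-start" property names follow the
gnonlin 0.10 API, so adjust them to your versions):

```csharp
// Sketch: video branch with two clips concatenated back to back.
Gst.Pipeline pipeline = new Gst.Pipeline("player");

Gst.Element videoComposition = Gst.ElementFactory.Make("gnlcomposition", "video-comp");
Gst.Element ffmpegColorSpace = Gst.ElementFactory.Make("ffmpegcolorspace", "colorspace");
Gst.Element videoSink = Gst.ElementFactory.Make("autovideosink", "video-sink");

pipeline.Add(videoComposition, ffmpegColorSpace, videoSink);
ffmpegColorSpace.Link(videoSink);  // static link; the composition is linked later, in OnVideoPadAdded

// Each gnlfilesource goes inside the composition, not the pipeline.
// Times are in nanoseconds (5000000000 ns = 5 s, i.e. 5 * GST_SECOND).
Gst.Element clip1 = Gst.ElementFactory.Make("gnlfilesource", "clip1");
clip1["location"] = "/path/to/first.avi";   // hypothetical path
clip1["start"] = 0L;
clip1["duration"] = 5000000000L;
clip1["media-start"] = 0L;

Gst.Element clip2 = Gst.ElementFactory.Make("gnlfilesource", "clip2");
clip2["location"] = "/path/to/second.avi";  // hypothetical path
clip2["start"] = 5000000000L;               // begins where clip1 ends
clip2["duration"] = 5000000000L;
clip2["media-start"] = 0L;

((Gst.Bin)videoComposition).Add(clip1, clip2);
```

The audio branch is built the same way, with its own gnlcomposition feeding audioconvert, audioresample, volume, and autoaudiosink.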
You need to link each gnlcomposition object to the next element in its
chain (audioconvert for audio, ffmpegcolorspace for video) once a new file
has been decoded and the dynamic audio/video pad has been created.
For example, for the video chain, subscribe to the PadAdded and PadRemoved
events when creating the pipeline (the code is in C#):

videoComposition.PadAdded += new Gst.PadAddedHandler(OnVideoPadAdded);
videoComposition.PadRemoved += new Gst.PadRemovedHandler(OnVideoPadRemoved);
When the first file is opened and a dynamic video pad is created on the
video gnlcomposition object, connect ffmpegcolorspace to the gnlcomposition:
private void OnVideoPadAdded(object sender, Gst.PadAddedArgs args)
{
    Gst.Element.Link(videoComposition, ffmpegColorSpace);
}

private void OnVideoPadRemoved(object sender, Gst.PadRemovedArgs args)
{
    Gst.Element.Unlink(videoComposition, ffmpegColorSpace);
}
You need to do the same with the audio chain.
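A minimal sketch of those matching audio-chain handlers, assuming the
elements are stored in fields named audioComposition and audioConvert
(these field names are my own, not from the original code):

```csharp
// Wire these up when building the pipeline:
audioComposition.PadAdded += new Gst.PadAddedHandler(OnAudioPadAdded);
audioComposition.PadRemoved += new Gst.PadRemovedHandler(OnAudioPadRemoved);

private void OnAudioPadAdded(object sender, Gst.PadAddedArgs args)
{
    // Link the composition's freshly created source pad into the audio chain.
    Gst.Element.Link(audioComposition, audioConvert);
}

private void OnAudioPadRemoved(object sender, Gst.PadRemovedArgs args)
{
    Gst.Element.Unlink(audioComposition, audioConvert);
}
```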
I'm pretty busy right now and don't have time to analyse your code, but
hopefully this is enough information for you to get it working.
--
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Best-way-to-concatenate-and-play-streams-tp4659182p4659219.html
Sent from the GStreamer-devel mailing list archive at Nabble.com.