rui.luis at gmail.com
Fri Oct 28 02:16:29 PDT 2011
Sorry for the long text; it's to give context.
I have the following problem. I want to display a video (with sound) and also do background processing on the video. For that, I created two solutions: one using a single pipeline, and another using two pipelines.
The single-pipeline solution uses a uridecodebin at the start, separating video and sound, and then uses a tee to "create" two new video branches that feed two appsinks. So I have one sound branch and two video branches, and on one of the video branches I can do image processing.
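For reference, the single-pipeline layout corresponds roughly to this gst-launch sketch (element names, the file URI, and the choice of sinks are placeholders; in my actual code both video branches end in appsinks, and this is against GStreamer 0.10):

```shell
# Sketch only: uridecodebin splits audio/video, a tee duplicates the
# video branch; one copy goes to display, the other to an appsink for
# processing. Queues after the tee keep one branch from stalling the other.
gst-launch-0.10 uridecodebin uri=file:///path/to/media.ogg name=dec \
  dec. ! queue ! audioconvert ! autoaudiosink \
  dec. ! queue ! tee name=t \
  t. ! queue ! ffmpegcolorspace ! autovideosink \
  t. ! queue ! ffmpegcolorspace ! appsink name=proc
```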
However, in this case I want to delay the video display/sound branches: I want that combined display part of the pipeline to run, for instance, 10 seconds (or x frames) behind the processing branch. I tried altering the buffer timestamps to display the video/audio with a delay, but it didn't work.
In the two-pipeline solution, I launch two uridecodebins, one for video display/sound and another just for video processing. In this case, if I use the latency property on the RTSP source for the video display/sound pipeline, it works. However, I want to keep uridecodebin as the "input/protocol" handler.
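The working RTSP case in the two-pipeline variant looks roughly like this (again just a sketch; the latency property on rtspsrc is in milliseconds, and the stream URI is a placeholder):

```shell
# Display pipeline: rtspsrc buffers 'latency' ms of data before pushing
# it downstream, which is what provides the delay in this case.
gst-launch-0.10 rtspsrc location=rtsp://example.com/stream latency=10000 \
  ! decodebin ! ffmpegcolorspace ! autovideosink
# The processing pipeline is launched separately with its own
# uridecodebin feeding an appsink.
```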
So my question is: how do I add a delay in such cases? Or is there a better solution than the ones I explained here?
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/delay-pipeline-tp3947316p3947316.html
Sent from the GStreamer-devel mailing list archive at Nabble.com.