Confused about gstreamer rtpjitterbuffer behavior w.r.t "latency" parameter
ajit.warrier at gmail.com
Fri Nov 10 22:36:21 UTC 2017
I am trying to fully understand how the RTP jitterbuffer works. I have a
simple pipeline receiving RTP audio packets via a udpsrc, which are then
processed by an rtpbin element and sent to an alsasink.
I set the "latency" parameter of the rtpbin to 5 s (5000 ms). I understand it
maps to the enclosed rtpjitterbuffer element. When the stream first starts,
there is a 5-second delay before the audio starts playing, as expected.
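For reference, a minimal sketch of the receive pipeline described above. The
port, payload type, codec (A-law), and clock-rate are assumptions on my part,
since the original pipeline and caps were not posted:

```shell
# Hypothetical receiver: udpsrc -> rtpbin (latency=5000 ms) -> alsasink.
# rtpbin's "latency" property is forwarded to its internal rtpjitterbuffer.
gst-launch-1.0 rtpbin name=rtpbin latency=5000 \
    udpsrc port=5004 \
        caps="application/x-rtp,media=audio,encoding-name=PCMA,payload=8,clock-rate=8000" \
    ! rtpbin.recv_rtp_sink_0 \
    rtpbin. ! rtppcmadepay ! alawdec ! audioconvert ! alsasink
```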
Now I stop the UDP sender for a few seconds and then start it again. This
time there is no delay: the pipeline resumes playing immediately. However, if
I stop the UDP sender again, there is a 5-second tail of audio after the
sender has stopped. This behaviour persists for any subsequent start/stop of
the sender.
Can anybody explain both of these behaviours?
More information about the gstreamer-devel mailing list