Syncing two live sources

Frederik devfrederik at
Thu Sep 20 17:11:01 UTC 2018

What is the best practice for keeping two live sources in sync?
On the sender side, a camera produces H.264 video and raw PCM audio. The
audio is Opus-encoded, and both streams are sent over UDP (using rtpbin
with an rtpsession per source).
On the receiver side there is an rtpbin with an rtpsession/rtpjitterbuffer
per source. The audio is decoded to raw PCM and the video to JPEG frames,
and each stream ends in an appsink.
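A rough gst-launch sketch of this setup (the host, ports, capture and
encoder elements are placeholders for illustration, not my exact pipeline;
fakesink stands in for the appsinks):

```shell
# Sender: one rtpbin, one RTP session per stream, plus RTCP for sync.
gst-launch-1.0 rtpbin name=rtpbin \
    v4l2src ! videoconvert ! x264enc tune=zerolatency ! rtph264pay \
        ! rtpbin.send_rtp_sink_0 \
    rtpbin.send_rtp_src_0 ! udpsink host=192.0.2.1 port=5000 \
    rtpbin.send_rtcp_src_0 ! udpsink host=192.0.2.1 port=5001 sync=false async=false \
    autoaudiosrc ! audioconvert ! opusenc ! rtpopuspay \
        ! rtpbin.send_rtp_sink_1 \
    rtpbin.send_rtp_src_1 ! udpsink host=192.0.2.1 port=5002 \
    rtpbin.send_rtcp_src_1 ! udpsink host=192.0.2.1 port=5003 sync=false async=false

# Receiver: matching sessions and jitterbuffers inside rtpbin.
gst-launch-1.0 rtpbin name=rtpbin \
    udpsrc port=5000 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264" \
        ! rtpbin.recv_rtp_sink_0 \
    udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
    rtpbin. ! rtph264depay ! avdec_h264 ! jpegenc ! fakesink \
    udpsrc port=5002 caps="application/x-rtp,media=audio,clock-rate=48000,encoding-name=OPUS" \
        ! rtpbin.recv_rtp_sink_1 \
    udpsrc port=5003 ! rtpbin.recv_rtcp_sink_1 \
    rtpbin. ! rtpopusdepay ! opusdec ! audioconvert ! fakesink
```

The RTCP legs are included because, as far as I can tell, the sender
reports are what let the receiver relate the two sessions' timestamps.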

How do I handle the timestamps on the receiver? As I understand it, the
sender generates the RTP timestamps from the pipeline clock?
Can anyone explain?
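For what it's worth, my mental model of the sender side is something like
the sketch below (my own illustration, not GStreamer internals): each
payloader converts the buffer's running time to the payload's RTP clock
rate and adds a random per-stream offset, which is why the two streams'
raw RTP timestamps are not directly comparable without the RTCP sender
reports.

```python
RTP_CLOCK_RATE_OPUS = 48000   # Opus always uses a 48 kHz RTP clock
RTP_CLOCK_RATE_H264 = 90000   # H.264 always uses a 90 kHz RTP clock

def rtp_timestamp(running_time_ns, clock_rate, random_offset=0):
    """Map a buffer's pipeline running time (nanoseconds) to a 32-bit
    RTP timestamp: scale to the payload clock rate, add the per-stream
    random offset, wrap at 2**32."""
    ticks = running_time_ns * clock_rate // 1_000_000_000
    return (random_offset + ticks) & 0xFFFFFFFF

# One second of running time lands at different tick counts per stream:
print(rtp_timestamp(1_000_000_000, RTP_CLOCK_RATE_OPUS))  # 48000
print(rtp_timestamp(1_000_000_000, RTP_CLOCK_RATE_H264))  # 90000
```

Is that roughly right, and does rtpbin then use the RTCP sender reports
to map both back onto a common clock on the receiver?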

More information about the gstreamer-devel mailing list