Synchronizing separate audio and video streams based on RTP timestamp

Tim Müller tim at centricular.com
Tue Aug 27 13:08:50 UTC 2019


On Mon, 2019-08-26 at 18:36 +0000, Wedge, Ryan J. (JSC-CD4)[SGT, INC]
wrote:

Hi Ryan,

> We’ve encountered a non-standard synchronization issue in a new
> system. A platform running GStreamer will receive two multicast
> streams, one for audio and another for video. The audio stream will
> only be present while there is actual intelligible audio. A concern
> is that the audio may be encoded with much less delay than the video
> and received by our platform unsynchronized, such that the audio will
> be out of sync with the video when output to speakers/monitor.  
>  
> I’m hoping someone with more knowledge of GStreamer’s limitations and
> capabilities can provide input on whether this is a trivial problem
> that can be solved using an existing GStreamer 1.x deployment.

This is a fairly standard problem in the context of RTP streaming, and
there should be a standard solution that fits your streaming scenario.

Usually the sender also sends Sender Report (SR) RTCP packets to the
receivers; these packets map each stream's RTP timestamps to a common
NTP timebase. In that case GStreamer's rtpbin will take care of
synchronising the streams automatically.
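For illustration, a receiver sketch using rtpbin with per-stream RTCP
sessions. All concrete values here are placeholders and assumptions:
the multicast address (224.1.1.1), the port layout (RTP on 5000/5002,
RTCP on 5001/5003), and the codecs/payload types (H264 on 96, Opus on
97) would have to match whatever your sender actually uses.

```shell
# Sketch only: addresses, ports, caps and codecs are placeholders.
# rtpbin uses the RTCP SR packets arriving on ports 5001/5003 to map
# both streams onto a common NTP timebase and play them in sync.
gst-launch-1.0 rtpbin name=rtpbin \
  udpsrc address=224.1.1.1 port=5000 \
      caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" \
      ! rtpbin.recv_rtp_sink_0 \
  udpsrc address=224.1.1.1 port=5001 ! rtpbin.recv_rtcp_sink_0 \
  udpsrc address=224.1.1.1 port=5002 \
      caps="application/x-rtp,media=audio,clock-rate=48000,encoding-name=OPUS,payload=97" \
      ! rtpbin.recv_rtp_sink_1 \
  udpsrc address=224.1.1.1 port=5003 ! rtpbin.recv_rtcp_sink_1 \
  rtpbin. ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink \
  rtpbin. ! rtpopusdepay ! opusdec ! audioconvert ! autoaudiosink
```

gst-launch-1.0 resolves the links from rtpbin's request/sometimes pads
once the streams appear, so the two `rtpbin. !` branches pick up the
decoded RTP sessions as they become active.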

If you don't have RTCP, there are several RTP header extensions that
carry a common reference time (e.g. the NTP timestamp header
extensions from RFC 6051) and can be used to synchronise the streams.

Or the sender can make an SDP available with a mapping.
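As a sketch of the SDP route: RFC 7273 defines the `a=ts-refclk` and
`a=mediaclk` attributes, which declare a shared reference clock and
each stream's RTP timestamp offset against it. All values below
(addresses, ports, payload types, offsets) are invented placeholders;
rtpbin can use such a mapping via its `rfc7273-sync` property.

```
v=0
o=- 0 0 IN IP4 203.0.113.10
s=Example A/V session (placeholder values)
c=IN IP4 224.1.1.1/255
t=0 0
m=video 5000 RTP/AVP 96
a=rtpmap:96 H264/90000
a=ts-refclk:ntp=traceable
a=mediaclk:direct=0
m=audio 5002 RTP/AVP 97
a=rtpmap:97 opus/48000/2
a=ts-refclk:ntp=traceable
a=mediaclk:direct=0
```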

It all depends a bit on the details, but it's a standard problem and
you should be able to use a standard solution.

Do you have control over the sender as well? Is it GStreamer?

If it's an option, I would recommend using RTSP to set up the
streaming (the actual data can then still be sent via UDP multicast).
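With RTSP the client side collapses to very little, since rtspsrc (or
playbin wrapping it) negotiates the sessions and wires up rtpbin
internally. The URL below is a placeholder:

```shell
# Sketch only: the RTSP URL is a placeholder. rtspsrc negotiates the
# audio and video sessions over RTSP (the media can still travel over
# UDP multicast), and the internal rtpbin handles inter-stream sync.
gst-launch-1.0 playbin uri=rtsp://example.com/av-session
```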

Cheers
 Tim

-- 
Tim Müller, Centricular Ltd - http://www.centricular.com


