synchronize two live sources with an offset

Stepan Salenikovich stepan.salenikovich at gmail.com
Wed Sep 16 23:32:17 UTC 2020


Hi,
I'm trying to understand the correct way to synchronize two live
sources when one of them may (or may not) start with an offset.
In my specific case, audio and video are being captured from one device.
However, the initial video frame is not always output immediately; a frame
is only created when something changes on the device, so it's possible to
start receiving audio before video.

My pipeline currently looks something like this:

appsrc is-live=true do-timestamp=true block=true \
! h264parse disable-passthrough=true config-interval=-1 \
! queue \
! mp4mux name=mux max-raw-audio-drift=50000000000 \
  interleave-time=50000000000 faststart=true fragment-duration=100 \
! appsink wait-on-eos=true \
alsasrc device=<device> ! audio/x-raw,channels=2 \
! queue ! audioconvert ! audioresample ! audiorate tolerance=500000000 \
! fdkaacenc perfect-timestamp=true ! audio/mpeg,mpegversion=4 \
! mux.audio_1

When the audio and video both start at the same time, they are synced.
But when the video starts with a delay relative to the audio, the resulting
mp4 seems to keep that delay as an offset between the audio and the video;
i.e. it plays as if the video had been supposed to start at the same time
as the audio.
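To make the behaviour concrete, here is a hypothetical sketch (plain Python, not GStreamer code) of the timestamp arithmetic I believe is happening: with do-timestamp=true each live source stamps buffers with the running-time at capture, so a video stream whose first frame arrives late carries that lateness in its PTS, and the muxer appears to preserve the gap as-is. All numbers below are made up for illustration.

```python
# Illustration of the symptom: each live source timestamps buffers with
# the running-time at capture, so a late-starting stream keeps its offset.

def capture_pts(start_offset_ns, frame_interval_ns, n_frames):
    """PTS values a live source would produce if its first buffer
    appears start_offset_ns after the pipeline went to PLAYING."""
    return [start_offset_ns + i * frame_interval_ns for i in range(n_frames)]

# Audio starts immediately; the first video frame arrives 2 s late.
audio_pts = capture_pts(0, 21_333_333, 5)              # ~AAC frame duration
video_pts = capture_pts(2_000_000_000, 33_333_333, 5)  # 30 fps, 2 s late

# The muxer keeps both timelines as-is, so the file starts with a
# 2 s stretch of audio-only content before the first video sample:
gap_ns = video_pts[0] - audio_pts[0]
print(gap_ns)  # 2000000000
```

The question, then, is whether the muxer is supposed to preserve that gap (and the player to render audio-only until the first video sample), or whether something upstream should be rebasing the streams.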

Thanks!
-stepan
