Best Way to Synchronize Realtime Audio Streams with Video Streams?
Florian Echtler
floe at butterbrot.org
Mon Jun 13 06:22:27 UTC 2022
Hello Abrar,
having tried the exact same thing without much success a while ago, I would
recommend just using WebRTC (i.e. the webrtcbin element) for synchronized A/V
streaming. But it obviously depends on your use case whether that is feasible.
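
For reference, here is a rough, untested sketch of what a webrtcbin-based
sender could look like. The element choices (v4l2src, vp8enc, opusenc) are
only placeholders for your actual sources and encoders, and the signalling
exchange (SDP offer/answer plus ICE candidates, e.g. over a websocket) is
application specific and only stubbed out here:

/* Untested sketch of a webrtcbin sender; signalling is omitted. */
#include <gst/gst.h>
#define GST_USE_UNSTABLE_API
#include <gst/webrtc/webrtc.h>

static GstElement *webrtc;

static void
on_offer_created (GstPromise *promise, gpointer user_data)
{
  GstWebRTCSessionDescription *offer = NULL;
  const GstStructure *reply = gst_promise_get_reply (promise);
  GstPromise *local;

  gst_structure_get (reply, "offer",
      GST_TYPE_WEBRTC_SESSION_DESCRIPTION, &offer, NULL);
  gst_promise_unref (promise);

  local = gst_promise_new ();
  g_signal_emit_by_name (webrtc, "set-local-description", offer, local);
  gst_promise_interrupt (local);
  gst_promise_unref (local);

  /* Send the offer SDP (gst_sdp_message_as_text (offer->sdp)) to the
   * remote peer over your signalling channel here. */
  gst_webrtc_session_description_free (offer);
}

static void
on_negotiation_needed (GstElement *element, gpointer user_data)
{
  GstPromise *promise =
      gst_promise_new_with_change_func (on_offer_created, NULL, NULL);
  g_signal_emit_by_name (element, "create-offer", NULL, promise);
}

int
main (int argc, char **argv)
{
  GstElement *pipeline;
  GError *error = NULL;

  gst_init (&argc, &argv);

  pipeline = gst_parse_launch (
      "webrtcbin name=send bundle-policy=max-bundle "
      "v4l2src ! videoconvert ! vp8enc deadline=1 ! rtpvp8pay ! "
      "application/x-rtp,media=video,encoding-name=VP8,payload=96 ! send. "
      "autoaudiosrc ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! "
      "application/x-rtp,media=audio,encoding-name=OPUS,payload=97 ! send.",
      &error);
  if (!pipeline) {
    g_printerr ("Failed to create pipeline: %s\n", error->message);
    return 1;
  }

  webrtc = gst_bin_get_by_name (GST_BIN (pipeline), "send");
  g_signal_connect (webrtc, "on-negotiation-needed",
      G_CALLBACK (on_negotiation_needed), NULL);
  /* "on-ice-candidate" must likewise be sent to the peer, and the remote
   * answer and candidates fed back into webrtcbin (not shown). */

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (g_main_loop_new (NULL, FALSE));
  return 0;
}

The nice part is that webrtcbin bundles the audio and video RTP sessions and
emits the RTCP sender reports for you, so the receiving side can do
inter-stream (lip-sync) synchronization without any hand-rolled timestamp
metadata. If you do want to stay with your current udpsrc approach, there are
a couple more notes below your quoted mail.
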
Best, Florian
On 13.06.22 01:45, Abrar Shahriar via gstreamer-devel wrote:
> Hi,
>
> I want to synchronize microphone data sent from a sender over UDP (udpsrc/udpsink)
> with video data coming from a camera, but I am not sure what the best way is.
>
> Right now, I take timestamps (PTP) at the mic and video source plugins' src pads and
> add them to the RTP header metadata.
>
> My receiver app receives data via udpsrc and has one pipeline for video and one for
> audio. It can read the timestamps at the sink pads of the RTP depay elements.
>
> What can I do to make sure audio stays within 30 ms of video, e.g. for proper
> lip sync?
>
> I will also want to adjust the audio/video offset at runtime to compensate for
> display latency, etc.
>
> Thanks,
> Abrar
>
> Shahriar
>
> ---------
>
> Abrar Zahin Shahriar
>
> Hyperdyne Inc. (ハイパーダイン株式会社)
> 3F, 5-9-12 Shiba, Minato-ku, Tokyo 108-0014
>
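
Regarding the header-extension approach quoted above: if you keep the manual
PTP timestamps, something along these lines (untested, extension ID 1 picked
arbitrarily) could write the sender's PTP time into a one-byte RTP header
extension at the payloader and read it back at the depayloader:

#include <gst/gst.h>
#include <gst/rtp/rtp.h>

#define TS_EXT_ID 1

/* Buffer probe for the payloader src pad; user_data is the (shared) PTP
 * clock.  Writes the current PTP time into a header extension. */
static GstPadProbeReturn
write_ptp_time (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstClock *ptp_clock = GST_CLOCK (user_data);
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  GstRTPBuffer rtp = GST_RTP_BUFFER_INIT;
  guint64 now_be = GUINT64_TO_BE (gst_clock_get_time (ptp_clock));

  buf = gst_buffer_make_writable (buf);
  if (gst_rtp_buffer_map (buf, GST_MAP_READWRITE, &rtp)) {
    gst_rtp_buffer_add_extension_onebyte_header (&rtp, TS_EXT_ID,
        &now_be, sizeof (now_be));
    gst_rtp_buffer_unmap (&rtp);
  }
  GST_PAD_PROBE_INFO_DATA (info) = buf;
  return GST_PAD_PROBE_OK;
}

/* Buffer probe for the depayloader sink pad; reads the extension back. */
static GstPadProbeReturn
read_ptp_time (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  GstRTPBuffer rtp = GST_RTP_BUFFER_INIT;
  gpointer data = NULL;
  guint size = 0;

  if (gst_rtp_buffer_map (buf, GST_MAP_READ, &rtp)) {
    if (gst_rtp_buffer_get_extension_onebyte_header (&rtp, TS_EXT_ID, 0,
            &data, &size) && size == sizeof (guint64)) {
      guint64 sender_time = GUINT64_FROM_BE (*(guint64 *) data);
      g_print ("sender PTP time: %" GST_TIME_FORMAT "\n",
          GST_TIME_ARGS (sender_time));
    }
    gst_rtp_buffer_unmap (&rtp);
  }
  return GST_PAD_PROBE_OK;
}

Attach them with gst_pad_add_probe() and GST_PAD_PROBE_TYPE_BUFFER on the
payloader src pad (sender) and the depayloader sink pad (receiver). Note that
payloaders may push buffer lists as well, which this sketch ignores.
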
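For the 30 ms lip-sync budget and the runtime offset: the usual approach is to
have all pipelines (sender and receiver, audio and video) use the same PTP
clock with a zero base time, so running time is directly comparable
everywhere, and then to trim any residual skew at runtime via the sink's
"ts-offset" property. Roughly (again untested, names illustrative):

#include <gst/gst.h>
#include <gst/net/net.h>

/* Create a PTP clock and wait until it is synchronized to the PTP master. */
static GstClock *
shared_ptp_clock (void)
{
  GstClock *clock;

  gst_ptp_init (GST_PTP_CLOCK_ID_NONE, NULL);
  clock = gst_ptp_clock_new ("ptp-clock", 0 /* PTP domain */);
  gst_clock_wait_for_sync (clock, GST_CLOCK_TIME_NONE);
  return clock;
}

/* Force a pipeline onto that clock with base time 0, so that
 * running time == PTP time and the audio and video pipelines schedule
 * their buffers against the same absolute timeline. */
static void
slave_pipeline_to_clock (GstPipeline *pipeline, GstClock *clock)
{
  gst_pipeline_use_clock (pipeline, clock);
  gst_element_set_start_time (GST_ELEMENT (pipeline), GST_CLOCK_TIME_NONE);
  gst_element_set_base_time (GST_ELEMENT (pipeline), 0);
}

/* "ts-offset" (GstBaseSink) shifts the rendering time of one stream and can
 * be changed while playing, e.g. to compensate for display latency. */
static void
set_av_offset (GstElement *audio_sink, gint64 offset_ns)
{
  g_object_set (audio_sink, "ts-offset", offset_ns, NULL);
}

You will also want to give both receiver pipelines the same overall latency
(gst_pipeline_set_latency()), otherwise one branch can still end up rendering
ahead of the other.
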
--
SENT FROM MY DEC VT50 TERMINAL