<div dir="ltr">Without a working example, I still believe A/V sync can be achieved with the rtpbin element fed by multiple udpsrc elements.<div><br></div><div>Regards,</div><div><br></div><div>Yu </div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, 13 Jun 2022 at 09:22, Florian Echtler via gstreamer-devel <<a href="mailto:gstreamer-devel@lists.freedesktop.org">gstreamer-devel@lists.freedesktop.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Hello Abrar,<br>
<br>
having tried the exact same thing without much success a while ago, I would <br>
recommend just using WebRTC (i.e. the webrtcbin element) for synchronized A/V <br>
streaming. But it obviously depends on your use case whether that is feasible.<br>
<br>
Best, Florian<br>
<br>
On 13.06.22 01:45, Abrar Shahriar via gstreamer-devel wrote:<br>
> Hi,<br>
> <br>
> I want to synchronize microphone data sent from a sender over UDP with video <br>
> data coming from a camera, but I am not sure what the best way is.<br>
> <br>
> Right now, I take timestamps (PTP) at the src pads of the mic and video source <br>
> plugins and add them to the RTP header metadata.<br>
> <br>
> My receiver app receives data via udpsrc and has one pipeline for video and one <br>
> for audio. It can read the timestamps at the sink pads of the RTP depay elements.<br>
> <br>
> What can I do to make sure audio stays within 30 ms of video, e.g. for proper <br>
> lip sync?<br>
> <br>
> I will also want to adjust the audio/video offset at runtime to compensate for <br>
> display latency, etc.<br>
> <br>
> Thanks,<br>
> Abrar<br>
> <br>
> Shahriar<br>
> <br>
> ---------<br>
> <br>
> Abrar Zahin Shahriar<br>
> <br>
> Hyperdyne Inc.<br>
> 3F, 5-9-12 Shiba, Minato-ku, Tokyo 108-0014, Japan<br>
> <br>
<br>
<br>
-- <br>
SENT FROM MY DEC VT50 TERMINAL<br>
</blockquote></div>
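<div dir="ltr">To make the rtpbin suggestion above concrete, here is a minimal receiver sketch using rtpbin with one udpsrc per RTP and RTCP stream. The port numbers, codecs (H.264 video, Opus audio) and caps are assumptions for illustration only and must match whatever the sender actually emits. Note that the RTCP streams are not optional for this purpose: rtpbin derives inter-stream (lip) sync from the NTP/RTP timestamp mapping in the RTCP sender reports.<br>

```shell
# Hypothetical synchronized A/V receiver (ports and codecs are assumptions):
# session 0 = video (RTP on 5000, RTCP on 5001)
# session 1 = audio (RTP on 5002, RTCP on 5003)
gst-launch-1.0 rtpbin name=rtpbin \
  udpsrc port=5000 \
    caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264" \
    ! rtpbin.recv_rtp_sink_0 \
  udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
  udpsrc port=5002 \
    caps="application/x-rtp,media=audio,clock-rate=48000,encoding-name=OPUS" \
    ! rtpbin.recv_rtp_sink_1 \
  udpsrc port=5003 ! rtpbin.recv_rtcp_sink_1 \
  rtpbin. ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink \
  rtpbin. ! rtpopusdepay ! opusdec ! audioconvert ! autoaudiosink
```

For the runtime offset question: the ts-offset property inherited from GstBaseSink can be changed on the audio sink while the pipeline is playing, which shifts audio rendering relative to video, e.g. to compensate for display latency.<br></div>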