<div dir="ltr"><div dir="auto">Hi, thank you for the prompt response.<div dir="auto">I wasn't specific enough: the application will run on a LAN, so I can use multicast to send only one stream from each PC. <br><div dir="auto"><div dir="auto">Multicast isn't possible with WebRTC, because it requires a separate stream for each receiver.</div></div></div></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, May 19, 2021, 15:02 Mathieu Duponchelle <<a href="mailto:mathieu@centricular.com" target="_blank">mathieu@centricular.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<div>
Hey,<br>
<br>
I would recommend using WebRTC (webrtcbin in the GStreamer context)
for<br>
such an application :)<br>
<br>
Best,<br>
<br>
<div>-- <br>
Mathieu Duponchelle · <a href="https://www.centricular.com" rel="noreferrer" target="_blank">https://www.centricular.com</a></div>
<br>
<div>On 5/19/21 8:14 AM, Gregory AE40 via
gstreamer-devel wrote:<br>
</div>
<blockquote type="cite">
<div dir="ltr">Hi,
<div><br>
</div>
<div>I am trying to implement a video chat using GStreamer on
Windows 10. I intend to use the GStreamer C API, but for now I am
just using the command line for basic testing. To keep
video and audio in sync, I am muxing them with mpegtsmux
and sending the result via RTP. It works fine, but the latency
is pretty high (I think it's about 1 second).
<div><br>
</div>
<div>My sending pipeline is as follows:</div>
<div> </div>
<div>gst-launch-1.0 mpegtsmux name=mux ! rtpmp2tpay ! udpsink
host=127.0.0.1 port=5555^<br>
wasapisrc ! audioresample ! audioconvert ! avenc_aac !
queue ! mux.^<br>
ksvideosrc ! videoconvert ! x264enc tune=zerolatency !
h264parse config-interval=-1 ! queue ! mux.^<br>
<br>
</div>
<div>My receiving pipeline is as follows:</div>
<div><br>
</div>
<div>gst-launch-1.0^<br>
udpsrc address=127.0.0.1 port=5555 caps="application/x-rtp"
! rtpmp2tdepay ! tsdemux name=demux^<br>
demux. ! queue ! avdec_aac ! audioresample ! audioconvert !
wasapisink low-latency=true^<br>
demux. ! queue ! h264parse ! avdec_h264 ! videoconvert !
autovideosink<br>
</div>
<div><br>
</div>
<div>Questions:</div>
<div>1. Is this the correct approach for a video chat or
should I use separate streams for audio and video?</div>
<div>2. If this is the correct approach, am I using the
correct muxer or should I use something else such as asfmux?</div>
<div>3. If this is the correct approach and the correct muxer,
is there something I can do in order to reduce the latency?</div>
<div><br>
</div>
<div><br>
</div>
</div>
</div>
<br>
<fieldset></fieldset>
<pre>_______________________________________________
gstreamer-devel mailing list
<a href="mailto:gstreamer-devel@lists.freedesktop.org" rel="noreferrer" target="_blank">gstreamer-devel@lists.freedesktop.org</a>
<a href="https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel" rel="noreferrer" target="_blank">https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel</a>
</pre>
</blockquote>
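If WebRTC is not an option, a common way to trim the latency described above is to drop the MPEG-TS container and send audio and video as separate RTP sessions with small jitter buffers; RTP timestamps (plus RTCP, via rtpbin in a real application) keep the two streams in sync. A rough, untested sketch along those lines — the Opus codec, ports 5000/5002, and the 50 ms jitter-buffer values are illustrative choices, not taken from this thread:<br>

```shell
rem Sender: independent RTP sessions for audio and video; no mpegtsmux,
rem so nothing waits for TS packetization or container-level buffering.
gst-launch-1.0 ^
  wasapisrc ! audioconvert ! audioresample ! opusenc ! ^
    rtpopuspay ! udpsink host=127.0.0.1 port=5000 ^
  ksvideosrc ! videoconvert ! x264enc tune=zerolatency ! ^
    rtph264pay config-interval=-1 ! udpsink host=127.0.0.1 port=5002

rem Receiver: small fixed jitter buffers instead of the TS demuxer's buffering.
gst-launch-1.0 ^
  udpsrc port=5000 caps="application/x-rtp,media=audio,clock-rate=48000,encoding-name=OPUS,payload=96" ! ^
    rtpjitterbuffer latency=50 ! rtpopusdepay ! opusdec ! ^
    audioconvert ! wasapisink low-latency=true ^
  udpsrc port=5002 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! ^
    rtpjitterbuffer latency=50 ! rtph264depay ! avdec_h264 ! ^
    videoconvert ! autovideosink
```

With `gst-launch-1.0` alone there is no RTCP, so lip-sync relies on both streams being captured on the same clock; a real application would wire both sessions through rtpbin to get proper RTCP-based synchronization.<br>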
</div>
</blockquote></div>
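<div dir="ltr">For reference, the LAN multicast setup mentioned at the top of the thread could look like this — an untested sketch where 224.1.1.1:5555 is an arbitrary group/port pair chosen for illustration (each PC would send to its own pair and join the others'):<br>

```shell
rem Sender: the existing TS/RTP pipeline, but pointed at a multicast group,
rem so every receiver on the LAN shares one outgoing stream per PC.
gst-launch-1.0 mpegtsmux name=mux ! rtpmp2tpay ! ^
    udpsink host=224.1.1.1 port=5555 auto-multicast=true ttl-mc=1 ^
  wasapisrc ! audioresample ! audioconvert ! avenc_aac ! queue ! mux. ^
  ksvideosrc ! videoconvert ! x264enc tune=zerolatency ! ^
    h264parse config-interval=-1 ! queue ! mux.

rem Receiver: join the multicast group instead of listening on a unicast address.
gst-launch-1.0 ^
  udpsrc multicast-group=224.1.1.1 auto-multicast=true port=5555 ^
    caps="application/x-rtp" ! rtpmp2tdepay ! tsdemux name=demux ^
  demux. ! queue ! avdec_aac ! audioresample ! audioconvert ! ^
    wasapisink low-latency=true ^
  demux. ! queue ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```

`ttl-mc=1` keeps the traffic on the local subnet; the rest of the pipeline is unchanged from the unicast version quoted above.</div>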