<div dir="ltr"><span class=""></span><div class="gmail_extra"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><span class="">
</span>A plausible cause is that your sockets (udpsrc) have buffers too small<br>
to hold a frame while that frame is being processed. You<br>
could try increasing the buffer-size property (it defaults to 0,<br>
which means the system default; you can also increase your system<br>
default if you prefer).<br>
<br>
For the non-v4l2src case, you probably have timestamp issues.<br></blockquote><div><br></div><div>I've now tried most of the combinations. This time I used a different machine as the client; earlier I was streaming out and back in on the same machine, which doesn't work correctly with multicast. I connected just the two machines to a Gigabit router via Gigabit Ethernet cables to rule out packet loss from network traffic. I used these pipelines on the source machine:<br><br><b>gst-launch-1.0 v4l2src device=/dev/video0 ! videoscale ! videoconvert ! video/x-raw, height=576,
width=720, format=UYVY ! rtpvrawpay ! udpsink host=224.1.1.1 port=5004<br><br></b><br><b>gst-launch-1.0 filesrc location=test_video.mp4 ! decodebin ! videoconvert ! videoscale ! video/x-raw, height=576,
width=720, format=UYVY ! rtpvrawpay ! udpsink host=224.1.1.1 port=5004<br></b><br></div></div>On the client machine, I played them using:<br><br><b>gst-launch-1.0 udpsrc address=224.1.1.1 port=5004 ! "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, depth=(string)8, width=(string)720, height=(string)576, colorimetry=(string)BT601-5, payload=(int)96" ! rtpvrawdepay ! videoconvert ! autovideosink<br><br></b></div><div class="gmail_extra">(and also with rtpjitterbuffer)<br><b>gst-launch-1.0 udpsrc address=224.1.1.1 port=5004 !
"application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2,
depth=(string)8, width=(string)720, height=(string)576,
colorimetry=(string)BT601-5, payload=(int)96" ! rtpjitterbuffer ! rtpvrawdepay !
videoconvert ! autovideosink<br><br></b></div><div class="gmail_extra"><b>Results:<br></b></div><div class="gmail_extra">1. Without rtpjitterbuffer there is no video tearing, but the sink shows a frozen frame on about 70% of the window and a live stream on the remaining 30%. I suspected a sink issue and tested a custom appsink that converts to RGB and renders onto a Qt display. That improved things a lot: I got a continuous stream with minor data loss, which I think is tolerable for RTP over UDP, though I'm not entirely sure even that should be there. This was the best result I got. <br><br></div><div class="gmail_extra">2. With rtpjitterbuffer there is a lot of video tearing (maybe because of data loss). It plays for a while, but eventually the stream stops completely.<br><br>3. I then tried increasing the buffer-size of udpsrc to 100, 1000 and 50000, but I get green lines over most of the video. <br><br>Is there anything wrong with the pipelines I'm using? Is there a working example of multicast send and receive using rtpvrawpay/depay, or using an SDP file? I'd appreciate any help on this!<br><br></div><div class="gmail_extra">Thanks,<br></div><div class="gmail_extra">Anuj<br></div></div>
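For reference on the buffer-size values tried above: a single raw UYVY frame at 720x576 is roughly 830 KB, so socket buffers of 100 to 50000 bytes cannot hold even one frame. A minimal sketch of the arithmetic (assuming UYVY's 2 bytes per pixel; the 2 MiB figure below is an illustrative value, not a tested recommendation):

```shell
# UYVY (YCbCr 4:2:2, 8-bit) packs 2 bytes per pixel,
# so one 720x576 frame occupies:
FRAME_BYTES=$((720 * 576 * 2))
echo "$FRAME_BYTES"   # 829440 bytes, about 0.8 MiB

# A buffer-size of 100/1000/50000 is therefore far below one frame.
# Something on the order of a few frames seems safer, e.g.:
#   gst-launch-1.0 udpsrc address=224.1.1.1 port=5004 buffer-size=2097152 ! ...
# The kernel may cap the socket buffer at net.core.rmem_max; raising
# that cap would look like:
#   sudo sysctl -w net.core.rmem_max=2097152
```

Note that udpsrc's buffer-size is in bytes, so the values tried (100, 1000, 50000) are orders of magnitude below one frame.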