Hi,

I'm trying to build a pipeline that does live RTP streaming, and I've run into a few problems.

The pipeline on the server side is:

gst-launch v4l2src ! mpeg4_encoder ! queue ! rtpmp4vpay ! udpsink host=othercomputer
where mpeg4_encoder is a plugin I wrote that does, well, encoding to mpeg4.

The client pipeline is:

gst-launch udpsrc caps="$caps" ! queue ! rtpmp4vdepay ! ffdec_mpeg4 ! xvimagesink sync=false

where
caps="application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, ssrc=(guint)0000000000, clock-base=(guint)000000000, seqnum-base=(guint)00000, profile-level-id=(string)1, config=(string)000001b005000001b50900000100000001200084586a28b42240a21f"
are the caps taken from the udpsink's sink pad on the server.

For the most part it works okay, but ffmpeg reports errors while decoding the frames ("header damaged", etc.), and the bottom of the image is distorted.
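(For reference, this is the caps string that gst-launch prints for the udpsink sink pad when the server pipeline is run verbosely, e.g.:

gst-launch -v v4l2src ! mpeg4_encoder ! queue ! rtpmp4vpay ! udpsink host=othercomputer

and that's what I paste into $caps on the client.)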
If I write the encoded video to a file first and stream it using:

gst-launch filesrc location="file.mkv" ! matroskademux ! rtpmp4vpay ! udpsink host=othercomputer

it works without any problems.

I figured it's a timing issue (and not a problem with my encoder plugin), so I've tried adding all sorts of queues along both pipelines, including rtpjitterbuffer, but to no avail (actually, the rtpjitterbuffer only seemed to make the distortion worse).
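One of the client-side variants I tried looked roughly like this (the latency value was picked more or less arbitrarily):

gst-launch udpsrc caps="$caps" ! rtpjitterbuffer latency=200 ! rtpmp4vdepay ! ffdec_mpeg4 ! xvimagesink sync=false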
Any hint as to what I'm doing wrong?
Or better yet, what is the right way to do RTP streaming with GStreamer? Preferably something that doesn't use rtpbin - I'd like to be able to work with non-GStreamer clients in the future.
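By non-GStreamer clients I mean, for example, eventually handing a player such as VLC an SDP file built from the caps above. A rough sketch of what I have in mind (the port 5000 is just a placeholder I'd set explicitly with port=5000 on the udpsink; profile-level-id and config are copied from the caps string):

v=0
o=- 0 0 IN IP4 0.0.0.0
s=MPEG-4 live stream
c=IN IP4 othercomputer
t=0 0
m=video 5000 RTP/AVP 96
a=rtpmap:96 MP4V-ES/90000
a=fmtp:96 profile-level-id=1; config=000001b005000001b50900000100000001200084586a28b42240a21f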
Thanks,
Itay.