Receiving RTP without udpsrc

Tim Müller tim at centricular.com
Sun Dec 25 12:48:27 UTC 2016


On Sun, 2016-12-25 at 12:11 +0100, wsnark at tuta.io wrote:

Hi,

> At prototyping phase I'm trying to create a PoC using "gst-launch-
> 1.0", but I cannot find a way to create a working pipeline to play
> RTP stream from a pipe instead of udpsrc.
>  (...)
> 
> Changing udpsrc to filesrc doesn't work:
> gst-launch-1.0 filesrc location="/tmp/pipe" ! "application/x-rtp, media=(string)audio, clock-rate=(int)8000, encoding-name=(string)PCMU" ! rtppcmudepay ! mulawdec ! pulsesink
> 
> Sending part: 
> gst-launch-1.0 filesrc location="test.wav" ! wavparse ! audioconvert ! audioresample ! mulawenc ! rtppcmupay ! filesink location=/tmp/pipe
> 
> (...)
> If I capture incoming stream to file, then I'm unable to play it
> either (same behavior). If I remove RTP elements from the pipeline,
> raw PCMU is played fine.
> 
> So my questions are:
> 1. Is it possible to play RTP stream without udpsrc using gst-launch-
> 1.0?
> 2. Is it possible to implement this in code, in own application?

It is definitely possible to do this in code in a proper app.

With gst-launch-1.0 it's a bit more involved.

The reason replacing udpsrc with a filesrc or fdsrc doesn't work is
that with RTP/UDP the packetisation of the data matters. The RTP
elements need to know where a packet starts and ends, and by convention
one buffer represents one UDP/RTP packet received.

If you just dump the RTP data into a pipe or file, then you turn it
into a stream and the packetisation is lost.

You can use the rtpstreampay/rtpstreamdepay elements to maintain the
packetisation, but there are more things to look out for.
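As an untested sketch, your two pipelines could be adapted like this:
rtpstreampay frames each RTP packet with a length prefix before it goes
into the pipe, and rtpstreamdepay restores the packet boundaries on the
other side (note the application/x-rtp-stream caps on the receiver):

```shell
# Sender: frame each RTP packet before writing to the pipe
gst-launch-1.0 filesrc location=test.wav ! wavparse ! audioconvert \
    ! audioresample ! mulawenc ! rtppcmupay ! rtpstreampay \
    ! filesink location=/tmp/pipe

# Receiver: recover packet boundaries, then depayload as before
gst-launch-1.0 filesrc location=/tmp/pipe \
    ! "application/x-rtp-stream, media=(string)audio, clock-rate=(int)8000, encoding-name=(string)PCMU" \
    ! rtpstreamdepay ! rtppcmudepay ! mulawdec ! pulsesink
```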

The RTP elements (especially rtpjitterbuffer) expect packets to be
timestamped with their capture time, so that the jitterbuffer can
calculate the clock drift between sender and receiver. Even if you use
rtpstreampay/rtpstreamdepay, that won't give you such timestamps. It
will still work to some extent for testing purposes, but it won't
behave nicely in real-world scenarios.

In an app you could add code that timestamps the buffers coming out of
rtpstreamdepay with the clock's running time (ignoring the delay/jitter
between the actual capture and the data arriving through the pipe).
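An untested sketch of that idea, using a buffer pad probe on the
depayloader's src pad (the element name "depay" and the probe function
name are illustrative, not from any existing API):

```c
/* Stamp each buffer leaving rtpstreamdepay with the pipeline clock's
 * running time, so rtpjitterbuffer has timestamps to work with.
 * user_data is the pipeline element. Sketch only, not tested. */
static GstPadProbeReturn
stamp_running_time (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  GstClock *clock = gst_element_get_clock (pipeline);

  if (clock != NULL) {
    GstClockTime now = gst_clock_get_time (clock);
    GstClockTime base = gst_element_get_base_time (pipeline);

    buf = gst_buffer_make_writable (buf);
    /* running time = absolute clock time minus pipeline base time */
    GST_BUFFER_PTS (buf) = now - base;
    GST_PAD_PROBE_INFO_DATA (info) = buf;
    gst_object_unref (clock);
  }
  return GST_PAD_PROBE_OK;
}

/* Attach it once the pipeline is set up:
 *
 *   GstPad *pad = gst_element_get_static_pad (depay, "src");
 *   gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER,
 *       stamp_running_time, pipeline, NULL);
 *   gst_object_unref (pad);
 */
```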

In case it doesn't have to be a pipe, there's also e.g. shmsink/src:

gst-launch-1.0 audiotestsrc ! mulawenc ! rtppcmupay ! shmsink \
    socket-path=/tmp/foo shm-size=1000000 -v

gst-launch-1.0 shmsrc socket-path=/tmp/foo do-timestamp=true is-live=true \
    ! 'application/x-rtp, media=(string)audio, clock-rate=(int)8000, encoding-name=(string)PCMU' \
    ! rtpjitterbuffer latency=50 ! queue ! decodebin ! alsasink

If you want to handle the transport bits (pipe etc.) all yourself you
could also just inject buffers into a pipeline using an appsrc element.
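A rough, untested sketch of the appsrc approach: read one packet at a
time from your own transport and push it as a single GstBuffer, so the
"one buffer = one RTP packet" convention is preserved. It assumes an
appsrc configured with format=time, is-live=true, do-timestamp=true and
application/x-rtp caps; the helper name is made up for illustration:

```c
/* Push one RTP packet, received via your own transport (pipe, socket,
 * whatever), into the pipeline as a single buffer. Sketch only. */
static void
push_rtp_packet (GstElement *appsrc, const guint8 *data, gsize len)
{
  GstBuffer *buf = gst_buffer_new_allocate (NULL, len, NULL);
  GstFlowReturn ret;

  gst_buffer_fill (buf, 0, data, len);

  /* One buffer represents exactly one RTP packet; with
   * do-timestamp=true appsrc stamps it with the running time. */
  g_signal_emit_by_name (appsrc, "push-buffer", buf, &ret);
  gst_buffer_unref (buf);

  if (ret != GST_FLOW_OK)
    g_warning ("push-buffer returned %s", gst_flow_get_name (ret));
}
```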

Cheers
 -Tim

-- 
Tim Müller, Centricular Ltd - http://www.centricular.com

