[gst-devel] decode RTP from file
gibrovacco at gmail.com
Thu May 20 08:44:12 CEST 2010
On Wed, May 19, 2010 at 9:55 PM, <Le-bol at gmx.de> wrote:
> Hello,
> I want to create an application for decoding RTP packets which are stored in the rtpdump file format. The application should detect the codec used and convert all frames to an audio file, maybe WAV.
There are two ways to determine the codec configuration of your
captured RTP stream:
1. If your codecs are all in the list at page 15 of
http://www.ietf.org/rfc/rfc1890.txt, you can safely determine the
codec from the payload type and jump to 3.
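For the static payload types, the RFC 1890 table can simply be hard-coded. A partial sketch (common audio types only; the function name is my own, not a GStreamer API):

```python
# Partial static payload-type table from RFC 1890 (superseded by RFC 3551):
# payload type -> (encoding name, clock rate in Hz, channels).
STATIC_PAYLOAD_TYPES = {
    0:  ("PCMU", 8000, 1),
    3:  ("GSM",  8000, 1),
    4:  ("G723", 8000, 1),
    8:  ("PCMA", 8000, 1),
    9:  ("G722", 8000, 1),   # RTP clock rate is 8000 despite 16 kHz sampling
    10: ("L16",  44100, 2),
    11: ("L16",  44100, 1),
    13: ("CN",   8000, 1),   # Comfort Noise (RFC 3389)
    14: ("MPA",  90000, 1),
    18: ("G729", 8000, 1),
}

def lookup_codec(payload_type):
    """Return (encoding, clock rate, channels), or None for dynamic PTs."""
    return STATIC_PAYLOAD_TYPES.get(payload_type)
```

Anything in the dynamic range (96-127) will come back as None, which is the signal to fall back to parsing the signalling data as described in 2.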
2. If the payload type is not in the list (that is, it is "dynamic",
i.e. determined during the signalling/control phase), the most
reliable way to discover the parameters of your codecs is to parse
the signalling/control protocol information (usually SDP data) and
match the RTP payload type with the negotiated codecs.
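As a rough sketch of that matching step: in SDP, the a=rtpmap attribute is what binds a dynamic payload type to an encoding name and clock rate, so a minimal parser could look like the following (the SDP snippet is an invented example):

```python
import re

def parse_rtpmap(sdp):
    """Map RTP payload types to (encoding, clock rate, channels)
    by scanning the SDP's a=rtpmap attributes."""
    codecs = {}
    # a=rtpmap:<payload type> <encoding>/<clock rate>[/<channels>]
    for m in re.finditer(r"a=rtpmap:(\d+) ([^/\s]+)/(\d+)(?:/(\d+))?", sdp):
        pt, name, rate, channels = m.groups()
        codecs[int(pt)] = (name, int(rate), int(channels) if channels else 1)
    return codecs

sdp = ("m=audio 49170 RTP/AVP 97\r\n"
       "a=rtpmap:97 iLBC/8000\r\n"
       "m=video 51372 RTP/AVP 96\r\n"
       "a=rtpmap:96 H264/90000\r\n")
print(parse_rtpmap(sdp))  # → {97: ('iLBC', 8000, 1), 96: ('H264', 90000, 1)}
```

The payload type you then see in the captured RTP headers is the key into this map.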
rtspsrc already implements this handling for the RTSP control protocol
(you'll find similar elements for other protocols, e.g. mmssrc for
MMS); unfortunately, I don't know how well it works when connected to
a filesrc (never tried ;) ).
For signalling protocols, telepathy (with e.g. telepathy-sofiasip) is
an excellent abstraction layer.
The signalling/control data very often contains what in GStreamer is
called "codec-data" (e.g. "sprop-parameter-sets" in RTSP/SDP), that
is, data often essential to initialize a (usually video) decoder.
Omitting the signalling/control parsing phase is naive, at least for
video streams.
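To illustrate why that codec-data matters: for H.264 the SPS/PPS needed to bootstrap the decoder travel base64-encoded inside the fmtp attribute as sprop-parameter-sets. A hedged sketch of extracting them (the fmtp line and parameter values are invented examples):

```python
import base64
import re

def extract_sprop(fmtp_line):
    """Pull the base64-encoded SPS/PPS NAL units out of an H.264 fmtp line."""
    m = re.search(r"sprop-parameter-sets=([^;\s]+)", fmtp_line)
    if not m:
        return []
    return [base64.b64decode(chunk) for chunk in m.group(1).split(",")]

fmtp = "a=fmtp:96 packetization-mode=1;sprop-parameter-sets=Z0IACpZTBYmI,aMljiA=="
sps, pps = extract_sprop(fmtp)
# The NAL unit type is the low 5 bits of the first byte: 7 = SPS, 8 = PPS.
print(sps[0] & 0x1F, pps[0] & 0x1F)  # → 7 8
```

These decoded parameter sets are what you would feed to the decoder (in GStreamer terms, as the codec_data field in the caps).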
3. Once the payload type has been determined and the decoder setup
info has been retrieved, you can use the pcapparse element (here is an
example extracting an h264 stream from a tcpdump capture):
gst-launch -m -v filesrc location=capture.pcap ! pcapparse
src-port=3378 ! "application/x-rtp, payload=97" ! rtph264depay !
"video/x-h264, width=320, height=240, framerate=(fraction)30/1" !
avimux ! filesink location=test.avi
The info you got from the signalling phase should be put in the
"pcapparse/depayloader" and in the "depayloader/muxer" caps.
The src port is used to determine the stream direction in case you
have e.g. a full duplex communication like during a video call.
With more complex pipelines it's even possible to directly play the
RTP capture as if it were a container format; pcapparse is a really
useful element.
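Note that your capture is in rtpdump format rather than pcap; one option is to replay it to a UDP port with rtptools' rtpplay and read it with udpsrc, another is to parse the file directly. Below is a sketch under the assumption that the file follows the rtptools dump layout (an ASCII "#!rtpplay1.0 address/port" line, a 16-byte binary file header, then per-packet records with an 8-byte big-endian length/plen/offset preamble) — verify this against your actual files before relying on it:

```python
import struct

def read_rtpdump(data):
    """Yield (offset_ms, packet_bytes) records from an rtpdump buffer.

    Assumes the rtptools dump format: a '#!rtpplay1.0 addr/port\n' line,
    a 16-byte file header, then 8-byte packet preambles (big-endian):
    record length (incl. preamble), original packet length, offset in ms.
    """
    pos = data.index(b"\n") + 1   # skip the ASCII shebang line
    pos += 16                     # skip start time, source and port header
    while pos + 8 <= len(data):
        length, plen, offset = struct.unpack(">HHI", data[pos:pos + 8])
        yield offset, data[pos + 8:pos + length]  # plen == 0 would mean RTCP
        pos += length

def payload_type(rtp_packet):
    """RTP payload type: low 7 bits of the second header byte."""
    return rtp_packet[1] & 0x7F

# Demo with a synthetic one-packet dump (same assumed layout, not a real capture):
rtp = b"\x80\x60\x00\x01" + b"\x00" * 8          # 12-byte RTP header, PT 96
data = (b"#!rtpplay1.0 10.0.0.1/5004\n" + b"\x00" * 16
        + struct.pack(">HHI", 8 + len(rtp), len(rtp), 0) + rtp)
packets = list(read_rtpdump(data))
print(len(packets), payload_type(packets[0][1]))  # → 1 96
```

Once you have the raw RTP packets, the payload type tells you which depayloader/decoder branch to push them into (e.g. via appsrc).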
> It should also detect Comfort Noise and generate DTMF tones.
I'm currently trying to settle on a working implementation for
RFC 3389 and I admit I'm a little behind schedule (the biggest problem
being that my digital signal processing books are about 2500 km away
from me). AFAIK no working implementation is currently available in
GStreamer.
For DTMF generation you can have a look at e.g. telepathy-stream-engine.
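If you end up synthesizing the DTMF tones yourself (e.g. pushing samples through an appsrc), the signal is just the sum of two sines from the standard keypad frequency grid; a minimal sketch:

```python
import math

# Standard DTMF frequency grid (ITU-T Q.23): (row tone, column tone) per key.
DTMF = {
    "1": (697, 1209), "2": (697, 1336), "3": (697, 1477),
    "4": (770, 1209), "5": (770, 1336), "6": (770, 1477),
    "7": (852, 1209), "8": (852, 1336), "9": (852, 1477),
    "*": (941, 1209), "0": (941, 1336), "#": (941, 1477),
}

def dtmf_samples(key, duration=0.2, rate=8000):
    """Return float samples in [-1, 1] for one DTMF digit."""
    lo, hi = DTMF[key]
    n = int(duration * rate)
    return [0.5 * (math.sin(2 * math.pi * lo * i / rate)
                   + math.sin(2 * math.pi * hi * i / rate))
            for i in range(n)]
```

For a WAV target as in the original question, these samples would just need to be scaled to 16-bit PCM before muxing.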
> I'm new to this list and don't know GStreamer very well, but I think it is possible. Can anybody tell me which elements I will need in the pipeline for that, and whether it is possible?
> Thank you for your help.