[gst-devel] MPEG over RTP

Ralph Meijer gstreamer-devel at ralphm.ik.nu
Wed Jul 16 08:12:13 CEST 2003


Hi,

I've been working on using GStreamer for streaming MPEG video over RTP and
have encountered a few problems. I will first sketch the context, using
a bit of MPEG terminology.

Looking at what is available, I found the rfc2250enc element in the mpegstream
plugin. It is supposed to split the MPEG video stream into small fragments that
fit in an RTP packet and meet certain boundary requirements, as specified
in RFC 2250.
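
As an aside, the boundary requirements essentially mean cutting the stream
at MPEG start codes, so that a packet starts on a sequence, GOP, picture or
slice boundary. Just to illustrate (this is a sketch for this mail, not the
actual rfc2250enc code), scanning for a slice start code, 0x000001 followed
by a byte in the range 0x01-0xAF, could look like this:

  #include <stddef.h>
  #include <stdint.h>

  /* Find the offset of the next MPEG-1/2 video slice start code
   * (00 00 01 followed by 0x01..0xAF); returns -1 if there is none.
   * rfc2250enc would cut fragments at boundaries like these so that
   * each one fits in an RTP packet. */
  static long
  next_slice_start (const uint8_t *data, size_t len)
  {
    size_t i;

    for (i = 0; i + 3 < len; i++) {
      if (data[i] == 0x00 && data[i + 1] == 0x00 && data[i + 2] == 0x01
          && data[i + 3] >= 0x01 && data[i + 3] <= 0xAF)
        return (long) i;
    }
    return -1;
  }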

To actually transport the MPEG stream, the fragments generated by
rfc2250enc need to be wrapped in RTP packets, one fragment per packet.
RFC 2250 also specifies how such a packet should be composed.

Besides the (display) timestamp in the RTP header, an additional
MPEG-specific header is added to each packet's payload. This MPEG-specific
header contains information like the picture type (I, P, B), the temporal
reference number inside the GOP (Group of Pictures), and some start-of-...
markers. The timestamp and the other metadata are extracted from the MPEG
stream.
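
For reference, RFC 2250 (section 3.4) defines this as a fixed 4-byte
video-specific header in front of the payload. Packing it could look
something like the sketch below (field values are assumed to have been
parsed out of the stream already; the T/AN/N extension bits and the
motion vector fields are left zero for brevity):

  #include <stdint.h>

  /* The 4-byte MPEG video-specific header from RFC 2250, section 3.4:
   *
   *   MBZ(5) T(1) TR(10) AN(1) N(1) S(1) B(1) E(1) P(3)
   *   FBV(1) BFC(3) FFV(1) FFC(3)
   *
   * tr      = temporal reference of the picture within the GOP (0..1023)
   * ptype   = picture coding type: 1 = I, 2 = P, 3 = B
   * s, b, e = sequence-header-present, beginning-of-slice, end-of-slice */
  static void
  pack_rfc2250_video_header (uint8_t hdr[4], unsigned tr, unsigned ptype,
                             unsigned s, unsigned b, unsigned e)
  {
    hdr[0] = (tr >> 8) & 0x03;                 /* MBZ=0, T=0, TR bits 9-8 */
    hdr[1] = tr & 0xff;                        /* TR bits 7-0 */
    hdr[2] = ((s & 1) << 5) | ((b & 1) << 4)   /* AN=0, N=0, S, B, */
           | ((e & 1) << 3) | (ptype & 0x07);  /* E, P */
    hdr[3] = 0;                                /* FBV, BFC, FFV, FFC unused */
  }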

My first approach was to create the MPEG-specific header in the rfc2250enc
component and use another mime-type between rfc2250enc and a new component
called rtpmpegenc. The timestamp would be passed on via the buffer timestamp.

On the receiving side there is a component rtpmpegparse that strips all
the headers off and converts the RTP timestamp back to a buffer timestamp;
the buffers then contain pure MPEG again.
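
Roughly, the pipelines would then look like this (rtpmpegenc and
rtpmpegparse being the new components; the mpegparse element in front,
the sinks and the property names are just assumptions for the sake of
the example):

  gst-launch filesrc location=movie.mpg ! mpegparse ! rfc2250enc
      ! rtpmpegenc ! udpsink host=10.0.0.2 port=5004

  gst-launch udpsrc port=5004 ! rtpmpegparse ! mpeg2dec ! xvideosink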

Although this works, I've run into an interesting problem. MPEG is encoded
in such a way that the pictures are not transported in display order, and
as a result the timestamps are not monotonically increasing:

Pictures:         I    B    B    P    B    B    P    B    B

Transport order:  0    1    2    3    4    5    6    7    8
Display order:    2    0    1    5    3    4    8    6    7
Timestamp:       80    0   40  200  120  160  320  240  280

The RTP packets coming out of rtpmpegenc also carry the display timestamp
of the picture their payload belongs to. When these RTP packets are fed to
udpsink, the network gets bursts of packets: each burst consists of the RTP
packets carrying the payload of three pictures instead of one. This is
caused by the non-monotonicity of the buffer timestamps, as the sketch
below illustrates.
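
Here is a toy model of a timestamp-driven sink (just a sketch for this
mail, not GStreamer's actual scheduling code): it waits until the clock
reaches each buffer's timestamp, and sends buffers that are already behind
the clock immediately.

  #include <stdio.h>

  /* Toy model of a timestamp-driven sink.  Buffers arrive in transport
   * order but carry display timestamps (in ms), as in the table above. */
  int
  main (void)
  {
    int ts[] = { 80, 0, 40, 200, 120, 160, 320, 240, 280 };
    int clock = 0, i;

    for (i = 0; i < 9; i++) {
      if (ts[i] > clock)
        clock = ts[i];   /* block until the clock reaches the timestamp */
      /* a buffer whose timestamp is already behind the clock is sent
       * immediately, so the B pictures burst out with the next anchor */
      printf ("t=%3d ms: send picture with display ts %3d\n", clock, ts[i]);
    }
    return 0;
  }

All nine packets go out in three groups, at t=80, 200 and 320 ms.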

A way to fix this problem would be to use the transport order of the
pictures for generating the buffer timestamps, but then I would have to
communicate the display timestamp in some other way, for example by
extending the payload of the buffers with a new field containing the
timestamp. But then I might as well create the whole RTP packet, right?
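
For completeness, creating the whole packet would mean prepending the
fixed 12-byte RTP header from RFC 1889, with the presentation time (on a
90 kHz clock) in the timestamp field, the static payload type 32 (MPV),
and the marker bit set on the packet that ends a picture. As a sketch:

  #include <stdint.h>

  /* Write the 12-byte fixed RTP header (RFC 1889) in network byte
   * order; the 4-byte MPEG-specific header and the picture data would
   * follow it in the packet. */
  static void
  pack_rtp_header (uint8_t hdr[12], int marker, uint16_t seq,
                   uint32_t timestamp, uint32_t ssrc)
  {
    hdr[0] = 0x80;                        /* V=2, P=0, X=0, CC=0 */
    hdr[1] = ((marker & 1) << 7) | 32;    /* M bit, PT=32 (MPV) */
    hdr[2] = seq >> 8;                    /* sequence number */
    hdr[3] = seq & 0xff;
    hdr[4] = timestamp >> 24;             /* 90 kHz presentation time */
    hdr[5] = (timestamp >> 16) & 0xff;
    hdr[6] = (timestamp >> 8) & 0xff;
    hdr[7] = timestamp & 0xff;
    hdr[8] = ssrc >> 24;                  /* synchronization source id */
    hdr[9] = (ssrc >> 16) & 0xff;
    hdr[10] = (ssrc >> 8) & 0xff;
    hdr[11] = ssrc & 0xff;
  }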

So my idea would be to combine the two components into one that takes
an MPEG stream as its input and outputs RTP packets.

Now my questions:

 - Is such a merge a good idea (the GStreamer Way (TM))?
 - Where would it have to be placed: inside mpegstream (because it needs
   the MPEG parser) or inside rtp (because that already has some RTP
   stuff, obviously)?

Hmm, this has become quite long... Sorry about that ;-)

-- 
Groetjes,

Ralphm



