[gst-devel] streaming over mobile IP and multicast IP

Wim Taymans wim.taymans at chello.be
Wed Jan 15 13:32:04 CET 2003


On Wed, 2003-01-15 at 22:54, Luis Rodero wrote:
> On Wed, 2003-01-15 at 17:48, Wim Taymans wrote:
> 
>     > The raw file data coming from filesrc doesn't have any timestamps.  I
>     
>     Exactly.
>     
>     > don't know if udpsink/udpsrc are set up to pace data based on the
>     > timestamps, but they should be, and then you would put mp3parse between
>     
>     UDPsink is able to pace data based on timestamps, provided there is a
>     plugin in front of it that generates those timestamps (currently no
>     plugin exists that can timestamp mp3 data without decoding it, AFAIK).
>     You also have to be aware that UDP drops packets when the network is
>     flooded.
> 
> I am getting lost here, sorry. I assumed that udpsink -> udpsrc do not
> make any changes in the data stream (they are transparent: udpsrc
> delivers exactly the same stream of bits that udpsink receives).

No, UDP is lossy: it drops packets when the network is flooded. If you
need guaranteed delivery you should use a reliable protocol such as TCP.

The way UDP streaming usually works is that the sender paces the data
rate so that the network doesn't get flooded.
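Just to make the idea concrete, here is a minimal sketch of what pacing at
the sender amounts to. This is plain sockets rather than gstreamer code,
and the chunk size, port, file name and bitrate are made-up values: read a
chunk, send it, then sleep for roughly the time the chunk takes to play.

  /* Minimal pacing sketch over a plain UDP socket (illustrative only,
   * not gstreamer code). Assumes a constant bitrate; a real element
   * would derive the delay from per-buffer timestamps instead. */
  #include <stdio.h>
  #include <string.h>
  #include <unistd.h>
  #include <arpa/inet.h>
  #include <sys/socket.h>

  #define CHUNK_SIZE 1024
  #define BITRATE    128000        /* bits per second, assumed */

  int main (void)
  {
    int sock = socket (AF_INET, SOCK_DGRAM, 0);
    struct sockaddr_in dst;
    unsigned char chunk[CHUNK_SIZE];
    FILE *f = fopen ("test.mp3", "rb");
    size_t n;

    memset (&dst, 0, sizeof (dst));
    dst.sin_family = AF_INET;
    dst.sin_port = htons (4951);
    dst.sin_addr.s_addr = inet_addr ("127.0.0.1");

    /* time needed to play CHUNK_SIZE bytes at BITRATE, in microseconds */
    long delay = (long) ((long long) CHUNK_SIZE * 8 * 1000000 / BITRATE);

    while (f && (n = fread (chunk, 1, CHUNK_SIZE, f)) > 0) {
      sendto (sock, chunk, n, 0, (struct sockaddr *) &dst, sizeof (dst));
      usleep (delay);              /* pace: don't flood the network */
    }
    if (f)
      fclose (f);
    close (sock);
    return 0;
  }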


> If they
> are, what is the difference for the mad plugin? I mean, the mad plugin
> receives exactly the same data through the udp plugins that it would
> receive directly from filesrc, doesn't it? So why is it not possible for
> the mad plugin to do just the same with the data? (Sure, these are quite
> basic questions, but I am a bit confused at this stage...)

Packet loss.

> 
> By the way, a very basic question: is it the scheduler that decides when
> to pull the data from a plugin and push it to the next one? If so, how
> does the scheduler know when to pull and push the data? Or is it the
> next plugin that pulls the data from the previous plugin whenever it
> needs it?

Are you sure you want to know? :) I would suggest taking a look at
docs/random/wtay/scheduling_ideas to get a first impression of how
scheduling works.

> 
>     
>     Adding timestamps to mp3parse wouldn't be hard.
> 
> So, a solution would be to parse the data to insert timestamps on the
> sender side, and to 'un'parse the data in the receiver? Oh, uf...

One option is to make an element that adds timestamps to the raw mp3
bytestream so that udpsink can perform rate control. This would make sure
that data is pushed onto the network at approximately the bitrate of the
mp3. This probably works best if you put those timestamps on individual
mp3 frames (in the case of packet loss, you only lose one frame and
subsequent frames can still be decoded independently (only for layer <
3)). There is no need to unparse the data in udpsrc; any gstreamer mp3
decoder should be able to operate on any buffer size. Note that you have
to parse the mpeg audio stream to figure out the frame length and the
bitrate, but you don't have to modify the stream, just divide it into
chunks and add a timestamp to each chunk.
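For reference, the parsing needed for that is small: the 4-byte frame
header already carries the bitrate and samplerate, and from those you get
both the frame length (where to cut the stream) and the frame duration
(how much to advance the timestamp). A rough sketch, limited to MPEG-1
Layer III and with no real error handling:

  /* Rough sketch: compute the length (bytes) and duration (seconds) of
   * one MPEG-1 Layer III frame from its 4-byte header. That is all a
   * timestamping element needs; other versions/layers are left out. */

  /* MPEG-1 Layer III bitrates in kbit/s (index 0 = free, 15 = invalid) */
  static const int bitrates[16] =
      { 0, 32, 40, 48, 56, 64, 80, 96, 112, 128, 160, 192, 224, 256, 320, 0 };

  /* MPEG-1 samplerates in Hz (index 3 = reserved) */
  static const int samplerates[4] = { 44100, 48000, 32000, 0 };

  static int
  frame_length (const unsigned char *header, double *duration)
  {
    int bitrate    = bitrates[(header[2] >> 4) & 0x0f] * 1000;
    int samplerate = samplerates[(header[2] >> 2) & 0x03];
    int padding    = (header[2] >> 1) & 0x01;

    if (bitrate == 0 || samplerate == 0)
      return -1;                   /* free format or broken header */

    /* an MPEG-1 Layer III frame always holds 1152 samples */
    *duration = 1152.0 / samplerate;

    /* standard Layer III frame size formula */
    return 144 * bitrate / samplerate + padding;
  }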

For layer 3 mpeg audio streams one usually adds a layer above UDP such as
RTP, together with the algorithm described in RFC 3119. That RFC requires
you to slightly reorganize the mpeg data so that individual frames are
less interdependent. RTP is usually a better choice than UDP for
streaming media content.
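To give a rough picture of what RTP adds on top of UDP: every packet gets
the fixed 12-byte header below (from RFC 1889), and RFC 3119 then
describes how to put the reorganized mp3 frames into the payload. This is
illustrative only; a real implementation would use an RTP stack rather
than hand-rolled packets.

  /* The fixed RTP header that precedes each payload (sketch). All
   * multi-byte fields are in network byte order. */
  #include <stdint.h>

  typedef struct {
    uint8_t  vpxcc;      /* version (2 bits), padding, extension, CSRC count */
    uint8_t  mpt;        /* marker bit plus 7-bit payload type */
    uint16_t seq;        /* sequence number, incremented per packet */
    uint32_t timestamp;  /* media timestamp in the payload's clock units */
    uint32_t ssrc;       /* synchronisation source identifier */
  } rtp_header;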

> 
> (Hey, Willy, just between you and me, maybe there is another, easier
> solution? ;-) )
>     
>     > filesrc and udpsink.  Then the issue becomes one of global clock
>     > management, which wtay will have to answer.
>     
>     There is currently no gstreamer support for syncing clocks across
>     machines; it is however possible to create a new clock that does
>     this, or one could sync the clocks at a lower layer with NTP or so.
> 
>     
>     about the multi-receiver case: I don't think it is supported yet
>     
>     about capsnego over udp: udp does (minimal) capsnego, it serializes the
>     caps over a TCP socket to the receiver. This also explains why the
>     multi-receiver case doesn't work too well...
> 
> Uh? What are caps, please?

Caps ("capabilities") are the gstreamer way of communicating media types
between different elements: things like samplerate, channels, framesize,
etc.
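As a rough illustration (using the gst_caps_new_simple() call from later
gstreamer releases; the 0.6-era API spells this differently, so treat the
exact calls as an assumption), the caps for an mp3 stream could be built
along these lines:

  /* Sketch only: build caps describing an mp3 stream. This is the kind
   * of information udpsink serializes to the receiver over its TCP
   * control socket. Field names and the call itself come from later
   * gstreamer releases. */
  #include <gst/gst.h>

  static GstCaps *
  make_mp3_caps (void)
  {
    return gst_caps_new_simple ("audio/mpeg",
        "mpegversion", G_TYPE_INT, 1,
        "layer",       G_TYPE_INT, 3,
        "rate",        G_TYPE_INT, 44100,
        "channels",    G_TYPE_INT, 2,
        NULL);
  }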

Wim


-- 
Wim Taymans <wim.taymans at chello.be>




