[gst-devel] thoughts on extradata

Benjamin Otte in7y118 at public.uni-hamburg.de
Mon Mar 15 07:36:14 CET 2004


(FWIW: the Xiph formats do this, too. Their decoders need the first three
packets before they can start decoding "normal" data. The theora/vorbis
elements solve this by refusing to seek until those headers have been read.)

The questions for me: 
- What is this extradata supposed to carry?
- (How) does the demuxer interpret extradata?

As long as it's just a few additional flags, you can live with parsing it
during caps construction. That falls flat, however, once someone starts
putting palettes or other big chunks of data into the extradata.
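
To make Ronald's proposal (quoted below) concrete: the demuxer would attach
the extradata itself to the caps as a GstBuffer. Here's a rough sketch of
that, with an invented "extradata" field name and assuming caps fields may
carry a GST_TYPE_BUFFER value (which is exactly what we'd have to agree on):

#include <string.h>
#include <gst/gst.h>

/* Sketch only: wrap the demuxer's codec-private data in a GstBuffer and
 * attach it to the source pad caps.  The "extradata" field name is made
 * up, and buffer-valued caps fields are the assumption under discussion. */
static GstCaps *
wma_caps_with_extradata (const guint8 * priv, guint priv_size)
{
  GstBuffer *extradata = gst_buffer_new_and_alloc (priv_size);
  GstCaps *caps;

  memcpy (GST_BUFFER_DATA (extradata), priv, priv_size);

  caps = gst_caps_new_simple ("audio/x-wma",
      "wmaversion", G_TYPE_INT, 2, NULL);
  gst_caps_set_simple (caps, "extradata", GST_TYPE_BUFFER, extradata, NULL);

  gst_buffer_unref (extradata);   /* assuming the caps keep their own ref */
  return caps;
}

The decoder would then pull that buffer back out of its sink caps in the
link function and feed it to libfaad/ffmpeg before the first real buffer
arrives.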
I always thought it would be better to make the extradata the first buffer
in the stream and require certain formats to start with such an extradata
buffer. That falls flat if the extradata format varies between container
formats. It also has the problem of how to mark a buffer as extradata in
the first place, and it might not work if extradata can occur more than
once during the stream.
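
For illustration, the first-buffer convention would look roughly like this
on the demuxer side (function and variable names are made up); note that
nothing marks the header buffer as special, which is exactly the marking
problem mentioned above:

#include <string.h>
#include <gst/gst.h>

/* Sketch of the "extradata is the first buffer" convention: push one
 * buffer holding the codec-private data before any normal data buffers. */
static void
push_extradata_first (GstPad * srcpad,
    const guint8 * priv, guint priv_size, GstBuffer * first_frame)
{
  GstBuffer *header = gst_buffer_new_and_alloc (priv_size);

  memcpy (GST_BUFFER_DATA (header), priv, priv_size);

  gst_pad_push (srcpad, GST_DATA (header));      /* must be the very first buffer */
  gst_pad_push (srcpad, GST_DATA (first_frame)); /* normal stream data follows */
}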

For anyone interested: Xine solves this by having a special extradata flag
on its buffers; decoders then treat flagged buffers differently.
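
Translated to our buffers, that idea could look something like this;
GST_BUFFER_EXTRADATA is purely hypothetical here (both the name and the bit
would have to be reserved in the real buffer flags):

#include <gst/gst.h>

/* Hypothetical flag, value made up for illustration only. */
#define GST_BUFFER_EXTRADATA (1 << 12)

/* Chain function sketch (event handling omitted): route flagged buffers
 * to the codec's init path instead of the normal decoding path. */
static void
decoder_chain (GstPad * pad, GstData * data)
{
  GstBuffer *buf = GST_BUFFER (data);

  if (GST_BUFFER_FLAG_IS_SET (buf, GST_BUFFER_EXTRADATA)) {
    /* hand GST_BUFFER_DATA (buf) / GST_BUFFER_SIZE (buf) to the codec's
     * init function (e.g. faacDecInit2 or AVCodecContext.extradata) */
  } else {
    /* decode as normal stream data */
  }

  gst_buffer_unref (buf);
}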

Benjamin


Quoting "Ronald S. Bultje" <R.S.Bultje at students.uu.nl>:

> Hi,
> 
> Matroska and QuickTime have the concept of pre-stream data. QuickTime
> stores it in 'esds' (MPEG-4 audio/video) or 'smi' (Sorenson 1/3)
> atoms; Matroska stores it in a 'codecprivate' EBML chunk (for any codec).
> ASF/AVI have something similar, but store it as an extension to the
> stream header inside the same chunk. We use that for WMA audio, for
> example.
> 
> Right now, we don't do much with those. For WMA, we store them in
> 'flags' properties in a GstStructure. For Sorenson, we actually add a
> set of flags to GstCaps to read those, but that seems fairly arbitrary.
> For MPEG-4 audio/video, we don't do anything with it so far (which is
> one reason why MPEG-4 video inside QuickTime doesn't always decode
> correctly with current CVS; I have a sample file on my HD).
> 
> So my question is: how will we solve this?
> 
> I could add 'flags%d' properties (guint32) to GstCaps for any format
> needing those. I could also go for something similar to what ffmpeg
> does, called extradata: I'd add an 'extradata' property to GstCaps
> where needed and put the binary data in there as a GstBuffer. For
> MPEG-4 audio, libfaad accepts a binary array of data as init data,
> just like ffmpeg does for MPEG-4 video (as AVCodecContext.extradata).
> Is anyone against me doing this? Or do you have better ideas?
> 
> Ronald
> 
> PS yes, this is 0.8.1 material, but I need opinions.
> 