[gst-devel] audio/raw float caps format

Ronald Bultje rbultje at ronald.bitfreak.net
Sun Jun 29 02:27:08 CEST 2003

Hey Leif,

On Sun, 2003-06-29 at 01:49, Leif Johnson wrote:
> While I'm no audio expert either, it sounds like a good call to replace the
> float `layout' property with a `width' property, for two reasons : (a) int
> audio has a `width' property as well, making this more of a common property
> for audio, and (b) we could get rid of this dependency on gtypes, for what
> it's worth. I also think a float width of 32 or 64 makes the
> data more clearly understood as IEEE standard floats---that is what we're
> using, right ? : )

Sounds like a good idea; I'll make sure it does that, too. It means
someone will have to write a float endianness/width converter, or make
sure that audioconvert handles that as well. If someone could also
implement int2float/float2int conversion in audioconvert, that'd be
really cool. audioconvert would then be for audio what colorspace is
for video. :)
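To illustrate what such an int2float/float2int step would do, here is a minimal sketch in Python (purely hypothetical, not GStreamer code): signed 16-bit PCM samples are scaled to floats in [-1.0, 1.0), and clipped on the way back.

```python
def int16_to_float(sample: int) -> float:
    """Map a signed 16-bit PCM sample to a float in [-1.0, 1.0)."""
    return sample / 32768.0

def float_to_int16(sample: float) -> int:
    """Map a float sample back to signed 16-bit PCM, clipping to the
    representable range."""
    scaled = int(round(sample * 32768.0))
    return max(-32768, min(32767, scaled))
```

A real converter element would additionally negotiate width, depth, and endianness via caps, and process whole buffers rather than single samples.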

> Also, since floats and ints both have endianness issues, that seems to be a
> common property. Finally, do we only want to support one- or two-channel
> audio ? While that could be fairly limiting, it's likely that multichannel
> GStreamer audio apps will want to do all their processing in multiple
> one-channel pipelines anyway. (I've never heard of 16-channel interleaved
> audio before ...)

Width is only important for raw audio (float/int), not for something
like mp3 (which has no width, since it's compressed). Width is the
allocated memory per sample (including padding), and depth is the used
memory (without padding). So 24-bit audio can fit in a 32-bit memory
area, in which case width would be 32 (32 bits allocated per sample)
and depth 24 (24 bits used per sample). The same goes for endianness:
it only applies to raw audio.
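As a concrete sketch of the width/depth distinction (again hypothetical, not GStreamer code), here is how a 24-bit sample could be recovered from a 32-bit word, assuming the sample sits in the most-significant bits with padding in the low bits; the actual layout would be whatever the caps negotiate.

```python
def unpack_sample(word: int, width: int, depth: int) -> int:
    """Recover a signed depth-bit sample from a width-bit unsigned word,
    assuming the sample occupies the most-significant bits (one common
    convention; the real layout is a caps detail)."""
    value = word >> (width - depth)      # drop the padding bits
    if value >= 1 << (depth - 1):        # sign-extend negative samples
        value -= 1 << depth
    return value
```

For example, with width=32 and depth=24, the word 0xFFFFFF00 unpacks to the sample -1.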

Lastly, since you seem to know more about audio than me: Thomas told me
that float audio doesn't have a variable number of channels per pad
(it's always 1) because the float data goes over different pads per
channel. Is there any way in which src/sink plugins know which incoming
pad connections belong to which stream?
I.e., if I have two stereo float stream connections between two plugins,
how does the sink plugin know which two of the four pads belong to
the same stream?


Ronald Bultje <rbultje at ronald.bitfreak.net>
