[gst-devel] change audio buffer duration

Julien Isorce julien.isorce at gmail.com
Wed Jul 8 15:35:51 CEST 2009


2009/7/8 Sebastian Dröge <sebastian.droege at collabora.co.uk>

> Am Mittwoch, den 08.07.2009, 12:44 +0200 schrieb Julien Isorce:
> > Hi,
> >
> > I have an avi file which contains audio/x-raw-int (and video, but my
> > question is just about the audio).
> > Here are the caps:
> > caps = audio/x-raw-int, endianness=(int)1234, channels=(int)2,
> > width=(int)16, depth=(int)16, rate=(int)48000, signed=(boolean)true,
> > codec_data=(buffer)1000000000000100000000001000800000aa00389b71
> > and
> > (type: 118, taglist, audio-codec=(string)\"Uncompressed\\ 16-bit\\ PCM
> > \\ audio\";)
> >
> > Using identity with -v, I can see that the buffer duration is around 10
> > seconds and the total is 20 seconds, so there are only 2 audio buffers.
> >
> > Is there a GStreamer element that can change or split this buffer
> > duration? (Usually the audio buffer duration is about 20 or 50 ms.)
> >
> > There is also "gst_query_set_latency", but what would be the impact
> > on the video (video buffer duration)?
> >
> > Usually I configure the audio latency (= audio buffer duration) when
> > using alsasrc, but how can I do that with an AVI file?
> >
> > Finally I can see :
> >
> > Implementation:
> >       Has getrangefunc(): gst_base_transform_getrange
> >       Has custom eventfunc(): gst_base_transform_src_event
> >       Has custom queryfunc(): 0xb7916800
> >         Provides query types:
> >                 (3):    latency (Latency)
> >
> > in gst-inspect-0.10 audioresample
> >
> > So audioresample is only able to change the latency? Any example?
>
> There's no "audiosplit" element that does what you want, it should be
> quite easy to implement though (do you want to do it? :) ).


In some cases, where the source is encoded badly, the audio buffers can be
very large. I do not know the buffer size of an audio device renderer, but
it would be better to split the buffers before handing them to the device.
Moreover, I do not understand why a separate element is needed; this should
be configurable through the GStreamer API.

For a video buffer, the smallest unit is 32 bits (for example), but all the
32-bit packets of an image only have meaning together, as a whole frame. For
audio, the smallest unit is 16 bits (for example), and we can group samples
however we want; we just need to keep their order.
I am not sure I am being clear.
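The chunking an "audiosplit" element would need is mostly this: cut a raw PCM byte buffer at sample-frame boundaries so the channel interleaving stays intact. A minimal sketch in Python (not GStreamer code; the function name and defaults are illustrative, matching the caps quoted above):

```python
def split_pcm(data, rate=48000, channels=2, width_bytes=2, chunk_ms=20):
    """Split raw interleaved PCM bytes into chunks of roughly chunk_ms,
    always cutting at a sample-frame boundary so channel order is kept."""
    frame_bytes = channels * width_bytes              # one frame = one sample per channel
    chunk_bytes = (rate * chunk_ms // 1000) * frame_bytes
    return [data[i:i + chunk_bytes] for i in range(0, len(data), chunk_bytes)]

# 1 second of silence at the caps above: 48000 frames * 4 bytes per frame
buffers = split_pcm(b"\x00" * 48000 * 4)
print(len(buffers))   # 50 chunks of 20 ms each
```

A real element would also have to set correct timestamps and durations on each output buffer, but the byte arithmetic is the core of it.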


> The latency query is something different though, and the latency is not
> influenced by the buffer sizes. Simply said, it's the amount of time that
> is buffered inside the element.


You're right, I forgot to say "with a given and fixed rate"; then the latency
gives the buffer sizes.
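With a fixed rate the relation between duration and buffer size is plain arithmetic. A quick sketch, assuming the caps quoted earlier in the thread:

```python
rate, channels, width_bits = 48000, 2, 16   # caps from the AVI file above
buffer_duration_s = 10                      # roughly what identity -v reported

# bytes = samples/sec * seconds * channels * bytes per sample
buffer_bytes = rate * buffer_duration_s * channels * (width_bits // 8)
print(buffer_bytes)   # 1920000 bytes for one ~10 s buffer
```

This is why a ~10 s buffer at these caps is nearly 2 MB, far larger than the 20–50 ms buffers an audio sink would normally see.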

>
>
> audioresample adds some latency to the pipeline but has no effect on
> buffer sizes, instead it changes the sampling rate of the audio.

ok