[gst-devel] proposal for implementing dynamic parameters

Steve Baker sbaker at chello.com
Fri May 11 15:01:47 CEST 2001


> Hi,
> 
> I have a couple of doubts concerning GstDynamicParams. I had envisaged
> this functionality being implemented as though it were an analogue
> data flow using the existing pads mechanism. Also, I disagree with the
> most of the disadvantages of that approach stated in the wiki page:

If I might be presumptuous here, I think the different philosophical
attitudes in this situation are:
1. We have a powerful abstraction of data flow (pads, data flowing from
element to element), so let's use it wherever it makes sense.  We could also
use it for the flow of control data and (taken to the extreme) other meta
information like events and clock ticks.
2. Each feature of a multimedia framework has its own unique requirements
which would be better served by tailoring an API to meet those requirements
instead of overloading existing APIs which are already quite complex.

> > the scheduler will have to make sure that a plugin has always been
> > fed enough control buffers so that the plugin knows how to
> > manipulate the >data buffers. This added complexity might be hard to
> > maintain.
> 
> But surely if we treat the data flow as an audio stream with a very
> low sample rate then there is hardly anything to add to the
> scheduler. I would have thought that it would take far more work for
> the scheduler to deal with parameterized control data.

The problem is not only that the control rate is low - in many cases it will
be non-constant and non-deterministic.  I don't know enough about the
scheduler to say what effect that would have on its complexity, though.

> Granted the interpolation is a really nice feature but surely it would
> be better implemented as a set of filter plugins which take a set of
> parameters and create an analogue data control signal.

I can see how that would work, but I don't see it as better - just
different.

> > when using user interfaces to create graphs, the graph can get
> > unmaintainable very quickly if control data and media data are
> > represented in the same graph. Most situations wouldn't require such
> > a level of sophistication to represent the flow of dparam data.
> 
> Surely things will get far more complicated if there isn't a graphical
> means to represent how control data will be attached to elements. Will
> the control data form the properties of the element it acts upon? 

Yes, through that instance of dynparams and by manipulating the
interpolators.

> If
> so, what about the situation where one dataset is to be applied across
> more than one plugin?

Because the application chooses which interpolator to use, it could just
create an interpolator and attach it to the dynparams of multiple elements
(we would have to support this, of course).
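To make the idea concrete, here is a minimal sketch of one interpolator instance being shared by the dynamic parameters of two different elements. All class and method names here (`LinearInterpolator`, `DynParam`, `sample`) are hypothetical illustrations of the proposal, not a real GStreamer API:

```python
# Conceptual sketch only: one interpolator attached to the dynparams
# of multiple elements, so both follow the same control curve.

class LinearInterpolator:
    """Interpolates linearly between (time, value) control points."""
    def __init__(self, points):
        self.points = sorted(points)   # list of (time, value) pairs

    def value_at(self, t):
        pts = self.points
        if t <= pts[0][0]:
            return pts[0][1]
        if t >= pts[-1][0]:
            return pts[-1][1]
        for (t0, v0), (t1, v1) in zip(pts, pts[1:]):
            if t0 <= t <= t1:
                return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

class DynParam:
    """A dynamic parameter that pulls its value from an interpolator."""
    def __init__(self, name, interpolator):
        self.name = name
        self.interpolator = interpolator

    def sample(self, t):
        return self.interpolator.value_at(t)

# One envelope drives parameters on two different "elements":
env = LinearInterpolator([(0.0, 0.0), (1.0, 1.0), (2.0, 0.0)])
volume = DynParam("volume", env)
cutoff = DynParam("cutoff", env)
```

The point is that the application owns the interpolator, so sharing it is just a matter of attaching the same instance twice.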

> > a dparams stream will generally be very low bandwidth and data is
> >generated in sporadic bursts. The GStreamer infrastructure is really
> >optimized for streams that produce a mostly-constant stream of data.
> 
> In batch situations this is true but not in real-time usage where
> although changes in the control input are sporadic, the application
> can have no way of telling when changes are going to occur so the data
> has to be sampled continuously.

For true real-time use I was imagining that a special RT interpolator would
be attached to the dynparam instance.  This interpolator would have a fixed
control rate which matches the buffer boundaries.
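A sketch of that idea, with illustrative names only: the RT interpolator just holds the most recently written value, and the element samples it once per buffer boundary, which gives a fixed control rate:

```python
# Hypothetical sketch: a "real-time" interpolator holding the latest
# value written by the application, sampled once per buffer.

class RTInterpolator:
    def __init__(self, initial=0.0):
        self._value = initial

    def set_value(self, v):    # called asynchronously by the app/UI
        self._value = v

    def value_at(self, t):     # called by the element at each buffer boundary
        return self._value

rt = RTInterpolator(0.2)
buffer_values = []
for buffer_start in (0.0, 0.01, 0.02):   # one sample per buffer
    buffer_values.append(rt.value_at(buffer_start))
rt.set_value(0.8)                         # e.g. the user moved a slider
buffer_values.append(rt.value_at(0.03))
```

Sporadic changes from the application are thus decoupled from the constant sampling the element does.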

The choice and usage of the interpolators is where the real flexibility of
this approach has advantages IMO.

> I also don't see the need for giving the control data
> dimensions. Suppose I have a low-frequency oscillator, why should I
> have to say it's specifically for controlling frequency, time,
> magnitude, colour? 

It's up to the element to say what dimension a dynparam is.  And it doesn't
*have* to specify which dimension to use (I may not have mentioned that :).
The minimum information required would be that you have a linear value
with lower bound x and upper bound y.  Basically, the more information you
give to the app, the more chance the app has to make sense of the data for
the user.
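That minimum could look something like this sketch - a linear value with a lower and upper bound, where anything beyond that (a unit, a dimension) is an optional hint for the application. The names are illustrative, not a proposed API:

```python
# Sketch of the minimal dynparam metadata: a linear value with a lower
# bound and an upper bound; the unit/dimension hint is optional.

class ParamSpec:
    def __init__(self, name, lower, upper, unit=None):
        self.name = name
        self.lower = lower
        self.upper = upper
        self.unit = unit           # optional hint, e.g. "Hz"

    def clamp(self, v):
        """Keep an incoming control value inside the declared bounds."""
        return max(self.lower, min(self.upper, v))

freq = ParamSpec("frequency", 20.0, 20000.0, unit="Hz")
```

An app that only knows the bounds can still draw a slider; one that also knows the unit can label it.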

> Even if I could say it was all of these things,
> what about when new dimensions are introduced? 

Units of measurement and unit converters should be pluggable - like elements
and types (and interpolators).  It would probably be correct for core
GStreamer to have minimal built-in units, with other units and converters
added by plugins.

> I'm concerned that this
> will blow up into another caps negotiation scenario.

It should be simple if conversions are kept within a single domain.  Take
time for example.  Each domain would have a default unit - we could make
time's unit nanoseconds.  Any new unit of time would just have to supply
converters to/from nanoseconds - thus any-to-any conversion within that
domain would be possible.  (Common cases could get their own converters for
efficiency.)
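A sketch of that per-domain scheme: a new unit registers only a converter pair to/from the domain's default unit (nanoseconds for time), and any-to-any conversion routes through it. The registry and the "beats" unit are hypothetical examples:

```python
# Sketch: pluggable time units, all converting via nanoseconds.

TIME_CONVERTERS = {}   # unit name -> (to_ns, from_ns)

def register_time_unit(name, to_ns, from_ns):
    TIME_CONVERTERS[name] = (to_ns, from_ns)

def convert_time(value, from_unit, to_unit):
    # Route through the domain's default unit (nanoseconds).
    to_ns, _ = TIME_CONVERTERS[from_unit]
    _, from_ns = TIME_CONVERTERS[to_unit]
    return from_ns(to_ns(value))

register_time_unit("ns",      lambda v: v,        lambda ns: ns)
register_time_unit("seconds", lambda v: v * 1e9,  lambda ns: ns / 1e9)
# A hypothetical plugin-supplied unit: beats at 120 bpm (1 beat = 0.5 s).
register_time_unit("beats_120bpm",
                   lambda v: v * 0.5e9, lambda ns: ns / 0.5e9)
```

With n units you need n converter pairs rather than n² direct conversions.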

Some conversions could cross domains (e.g. time to frequency), but I don't
think this would be the end of the world as far as complexity goes - we are
just talking about single values here, not stream formats with lots of meta
info.

> I see the need for applying parametized envelopes but I believe that
> is at most 1/4 of the control data picture. 

> Generating control data
> from oscillators

Again, you could write an interpolator to do an LFO - it would simply take
waveform, frequency, amplitude and offset values and away you go.  You could
even envelope each of those parameters so that every parameter could change
over time.  And the UI would give instant feedback, because the interpolate
function could draw a nice wavy line on the screen.
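As a sketch of such an LFO interpolator (parameter names are illustrative, not a real API): given a waveform, frequency, amplitude and offset, it returns a control value for any time, so a UI can evaluate it densely to draw that wavy line:

```python
import math

# Sketch: an LFO-style interpolate function - control value at time t.
def lfo_value(t, waveform="sine", frequency=1.0, amplitude=1.0, offset=0.0):
    phase = (t * frequency) % 1.0          # position within one cycle
    if waveform == "sine":
        raw = math.sin(2 * math.pi * phase)
    elif waveform == "square":
        raw = 1.0 if phase < 0.5 else -1.0
    elif waveform == "saw":
        raw = 2.0 * phase - 1.0
    else:
        raise ValueError("unknown waveform: %s" % waveform)
    return offset + amplitude * raw

# A UI could sample this at many t values to plot the curve:
curve = [lfo_value(t / 100.0) for t in range(100)]
```

Enveloping frequency or amplitude would just mean replacing those constants with calls into another interpolator.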

> external real time data 

see previous comment

> and the ability to filter
> control 

You could basically write your own interpolator to do whatever you want.  I
don't know about chaining interpolators together, though - I'll have a think
about that one.

> are also essential for many of gstreamer's potential
> applications and I don't feel those issues have been addressed yet. I
> think a better solution is to just treat control data like a
> low-bandwidth audio stream and then build the envelope, oscillator and
> filter elements which operate on those streams.

I don't agree, but I think we have a long way to fall before this thread
turns into a linux-audio-dev-style flamewar.

cheers.



