[gst-devel] Using gstreamer for modular music synth
daboy at sempiternity.org
Tue Aug 13 20:23:01 CEST 2002
I've been developing a modular music synth called Io
(http://sf.net/projects/iosynth) for a few weeks, and someone told me
about gstreamer. I've spent the past few hours looking over the
documentation and such, and I really think gstreamer is a very good
thing, but I have a few concerns. I'm not expecting full explanations of
how to fix these things; I'm just trying to get a feel for what
gstreamer is supposed to do, and whether it's the right tool for my job.
- Realtime support
One goal of my project is to allow realtime synthesis, which means the
plugin system must have low latency. Is there anything inherent in
gstreamer that will be a problem?
- Polyphonic instruments + really low-level modules
One of my main complaints about other software synth programs is that
each instrument can only play one note at a time. This means that if you
have an instrument that's supposed to sound something like chimes, the
first note is cut off as soon as the second note is played, rather
than letting both ring. My solution to this was to pass C++ vector
objects down the pipeline so that each note is kept separate, and the
notes are combined only when the final sound of the instrument is produced.
If you don't understand the problem, consider this simple instrument:
A squarewave oscillator is hooked up to a lowpass filter. The lowpass
filter gets its cutoff frequency from a function generator which moves
the cutoff frequency down over the duration of the note. A note is
played, then another as the first is still playing. Now the cutoff
filter must filter two notes, each with a different cutoff frequency.
Thus, the inputs (from the oscillator and the function generator) must
be separate, as there is no way to apply different filtering to each
note if the notes from the oscillator are combined.
From what I have read of the docs, I imagine this could be implemented
by defining another type, but is this wise? It could make it harder to
connect these polyphonic plugins to other plugins, ruining the
flexibility that makes gstreamer good. Or, is there another solution to
this problem?
- Control of the instruments
Obviously, there needs to be some mechanism to control the instruments,
such as telling the oscillator when to play, for how long, and what
note. Ideally, this control would be pluginized so that different
sources could be used, such as a saved file, a MIDI device, etc.
However, this data is not bit-based like audio or video, but rather
event-based, so passing it in a buffer would be costly, cumbersome, and
just plain not good. What are some possible solutions?
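One common shape for this (a hypothetical sketch, not the gstreamer API; the struct and class names here are mine) is a small timestamped control event delivered out-of-band from the audio stream, so note on/off messages never have to be smuggled through sample buffers:

```cpp
#include <cassert>
#include <cstdint>
#include <queue>

enum class EventType { NoteOn, NoteOff, ControlChange };

struct ControlEvent {
    EventType type;
    uint64_t timestamp;  // in samples, so events line up with the audio clock
    int note;            // note number for NoteOn/NoteOff
    double value;        // velocity or controller value
};

// A control source (saved file, MIDI device, ...) pushes events; the
// instrument pops only those whose timestamp has come due for the
// buffer it is currently rendering.
class EventQueue {
    std::queue<ControlEvent> q_;
public:
    void push(const ControlEvent& e) { q_.push(e); }
    bool pop_due(uint64_t now, ControlEvent& out) {
        if (q_.empty() || q_.front().timestamp > now) return false;
        out = q_.front();
        q_.pop();
        return true;
    }
};
```

The nice property is that the control side stays pluggable: a file reader and a live MIDI input both just produce ControlEvents, and the synthesis elements never care which source they came from.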