[gst-devel] Using gstreamer for modular music synth
SBaker at CHELLO.com
Wed Aug 14 01:32:05 CEST 2002
> I've been developing a modular music synth called Io
> (http://sf.net/projects/iosynth) for a few weeks, and someone told me
> about gstreamer. I've spent the past few hours looking over
> documentation and such, and I really think gstreamer is a very good
> thing, but I have a few concerns. I'm not expecting full
> explanations on
> how to fix these things; I'm just trying to get a feel for just what
> gstreamer is supposed to do, and if it's the right tool for my job.
Cool. My initial interest in GStreamer was for this sort of application.
> - Realtime support
> One goal of my project was to allow realtime synth, which means the
> plugin system must have a low latency. Is there anything inherent in
> gstreamer that will be a problem?
The latency behaviour of a pipeline should be very good as long as you don't
do things like use threads or change the structure of the pipeline while it's
running.
> - Polyphonic instruments + really low-level modules
> One of my main complaints about other software synth programs is that
> each instrument can only play one note at a time. This means
> if you have
> some instrument that's supposed to sound something like chimes, if you
> play a note, it's cut off as soon as the second note is played rather
> than letting both ring. My solution to this was to pass C++ vector
> objects down the pipeline so that each note is separated, and combined
> only when the final sound of the instrument is produced.
I have thought a bit about polyphonic pipelines and I have an idea which
might work. Rather than making every element polyphony-aware, leave all
elements as monophonic and just have multiple duplicated monophonic
sub-pipelines which are eventually mixed to a single source.
This could be done by writing a polyphonic bin. You construct your desired
monophonic pipeline inside that bin, then tell the bin what the maximum
polyphony is and that bin will duplicate your monophonic pipeline n times
internally and mix all the pipelines so it produces a single source.
Of course there are many issues with this approach which will have to be
worked out, but I think the idea has merit.
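To make the idea concrete, here is a minimal plain-C sketch of the mixing step such a polyphonic bin would perform. It is not GStreamer API; the `Voice` struct, `MAX_VOICES` limit, and `mix_voices` function are all hypothetical names standing in for the n duplicated monophonic sub-pipelines and the final mix-down to a single output.

```c
#include <stddef.h>

#define MAX_VOICES 8   /* hypothetical maximum polyphony for this sketch */

/* One monophonic "sub-pipeline": here reduced to the buffer of
 * samples its chain of elements produced for the current cycle. */
typedef struct {
    int active;             /* is this voice currently sounding? */
    const float *samples;   /* output of the monophonic chain */
} Voice;

/* Mix up to n duplicated monophonic voices into one output buffer,
 * the way the proposed polyphonic bin would combine its internal
 * pipelines into a single source. */
void mix_voices(const Voice voices[], size_t n_voices,
                float *out, size_t n_samples)
{
    for (size_t i = 0; i < n_samples; i++) {
        float sum = 0.0f;
        for (size_t v = 0; v < n_voices; v++)
            if (voices[v].active)
                sum += voices[v].samples[i];
        out[i] = sum;
    }
}
```

Because inactive voices are skipped rather than removed, a note that is still ringing (the chimes example above) keeps contributing until its own envelope silences it.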
> - Control of the instruments
> Obviously, there needs to be some mechanism to control the instruments,
> such as telling the oscillator when to play, for how long, and what
> note. Ideally, this control would be pluginized so that different
> sources could be used, such as a saved file, a MIDI device, etc.
> However, this data is not bit-based like audio or video, but rather
> event based, so passing it in a buffer would be costly,
> cumbersome, and
> just plain not-good. What are some possible solutions?
This is why I wrote the Dynamic Parameters library. The plugin side of the
library is now quite mature but the apps side still needs some development.
Check out the docs (slightly out of date but should give you a good idea):
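For a feel of what event-based (rather than buffer-based) control looks like, here is a small plain-C sketch. This is not the Dynamic Parameters API; the `ControlEvent` and `EventQueue` types and the function names are illustrative assumptions only. The point is that a note-on is a single timestamped record a control source pushes, not a stream of samples.

```c
#include <stddef.h>

/* Hypothetical control events: a saved file, a MIDI device, etc.
 * would all push the same record type, so control sources stay
 * pluggable, as the original question asks for. */
typedef enum { EV_NOTE_ON, EV_NOTE_OFF } EventType;

typedef struct {
    EventType type;
    double timestamp;   /* when the event takes effect, in seconds */
    int note;           /* MIDI-style note number */
    float velocity;     /* 0.0 .. 1.0 */
} ControlEvent;

typedef struct {
    ControlEvent events[64];
    size_t count;
} EventQueue;

/* Push one event; returns 0 if the queue is full. */
int queue_push(EventQueue *q, ControlEvent ev)
{
    if (q->count >= 64) return 0;
    q->events[q->count++] = ev;
    return 1;
}

/* Count the events due at or before `now`; an instrument element
 * would consume these each processing cycle instead of scanning
 * a whole audio buffer for control data. */
size_t events_due(const EventQueue *q, double now)
{
    size_t due = 0;
    for (size_t i = 0; i < q->count; i++)
        if (q->events[i].timestamp <= now)
            due++;
    return due;
}
```

Passing a handful of such records per cycle is far cheaper than encoding control changes into audio-rate buffers, which is exactly the concern raised above.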