jonathan.liger at wanadoo.fr
Sat Jul 17 06:23:02 CEST 2004
> I really wonder what you want to do with MIDI. Do you want to write
> elements that deal with MIDI and alter MIDI events, or do you want to
> turn MIDI events into sound via synthesis?
I don't see any incompatibility between these two goals.
> (Which leads to the question, why duplicate it if it's so nice? Why pass
> MIDI data in a pipeline if the ALSA sequencer can do it much more
> efficiently?)
The main idea is to ease the development of applications using both audio
and MIDI by providing a common interface to manage audio and MIDI streams.
A first solution is to integrate MIDI into the GStreamer pipeline: we then
get the benefits of the GStreamer framework (such as XML pipeline
serialization), if we accept a loss in performance (I don't expect a
GStreamer-based effect pipeline to be as responsive as its pure ALSA or
JACK equivalent).
Another approach would be to do this at a higher level, by integrating
GStreamer into lash (http://lash-audio-session-handler.org/, formerly known
as ladcca), but I'd like to give the "gstreamer level" approach a try first.
More information about the gstreamer-devel mailing list