[gst-devel] Midi and GStreamer
nick
nixx at nixx.org.uk
Fri Jul 11 21:27:05 CEST 2003
Hi All
The thing I am thinking about is how a GStreamer plugin would handle
MIDI and audio at the same time... In my mind, this requires the MIDI
and audio buffers to be processed on a 1-to-1 basis (so one buffer of
audio and one buffer of MIDI cover the same duration of time). Does
what I'm saying make sense to you?
(For me, I would want to be able to write amSynth as a plugin. This
would require that when my process function is called, I have a MIDI
buffer as input containing however many MIDI events occurred in, say,
1/100 sec, and then I generate an audio buffer covering the same time
duration...)
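To make that concrete, here is a very rough sketch of the kind of
process callback I have in mind. The types and names (midi_event,
midi_buffer, process_block) are made up for illustration, not real
GStreamer or amSynth API:

#include <stddef.h>
#include <stdint.h>

/* Illustrative types only -- not real GStreamer or amSynth structures. */
typedef struct {
    uint32_t offset;      /* sample offset of the event within this block  */
    uint8_t  data[3];     /* raw MIDI bytes (status byte + two data bytes) */
} midi_event;

typedef struct {
    const midi_event *events;   /* events that occurred during this block */
    size_t            n_events; /* sorted by offset                       */
} midi_buffer;

/* One call processes one block: the MIDI buffer and the audio buffer
 * cover exactly the same span of time -- n_frames samples, e.g. 441
 * frames for 1/100 sec at 44100 Hz. */
void process_block(const midi_buffer *midi, float *audio_out, size_t n_frames)
{
    size_t next = 0;

    for (size_t frame = 0; frame < n_frames; frame++) {
        /* Apply every MIDI event scheduled at this sample position. */
        while (next < midi->n_events && midi->events[next].offset == frame) {
            /* note-on / note-off / controller handling would go here */
            next++;
        }

        /* Render one sample of synth output (silence in this sketch). */
        audio_out[frame] = 0.0f;
    }
}

The nice thing about pairing the buffers like that is that sample-accurate
event timing falls out for free, since each event's offset is relative to
the start of the block it arrived in.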
Any ideas? Maybe this will indicate the kind of problems to be faced.
Nick
On Tue, 2003-07-08 at 13:43, Benjamin Otte wrote:
> On 8 Jul 2003, Christian Fredrik Kalager Schaller wrote:
>
> > GStreamer works best with near-constant data flow, so a MIDI stream
> > would probably have to consist mostly of filler events, sent at a
> > constant tick-rate.
> >
> I don't really know why it would be necessary to have a near-constant
> dataflow.
> Subtitling certainly involves a very non-constant dataflow, and the
> presentation from Thomas at GUADEC showed that it works.
>
> The only part of GStreamer that might need filler events would be realtime
> processing, where you want to know that nothing happened for some time.
> But they should not be necessary for MIDI file reading or anything else
> described in that post.
>
>
> Benjamin
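For what it's worth, the constant tick-rate filler stream Christian
describes above might look roughly like this toy loop -- made-up types
again, nothing to do with real GStreamer elements. Each tick pushes one
buffer downstream, and a buffer with no events is the filler:

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* One buffer per tick; n_events == 0 means it is a filler buffer.
 * All of this is made up for the sketch, not real GStreamer API. */
typedef struct {
    double  timestamp;       /* start of the tick interval, in seconds */
    double  duration;        /* tick length, e.g. 0.01 sec             */
    size_t  n_events;
    uint8_t events[64][3];   /* raw MIDI bytes for this interval       */
} tick_buffer;

static void push_downstream(const tick_buffer *buf)
{
    if (buf->n_events == 0)
        printf("%.2f: filler (nothing happened)\n", buf->timestamp);
    else
        printf("%.2f: %zu MIDI event(s)\n", buf->timestamp, buf->n_events);
}

int main(void)
{
    const double tick = 0.01;   /* 100 buffers per second */

    for (int i = 0; i < 5; i++) {
        tick_buffer buf;
        memset(&buf, 0, sizeof buf);
        buf.timestamp = i * tick;
        buf.duration  = tick;

        /* Pretend a note-on arrives during the third tick only;
         * every other tick goes out as a filler buffer. */
        if (i == 2) {
            buf.events[0][0] = 0x90;   /* note-on, channel 1 */
            buf.events[0][1] = 60;     /* middle C           */
            buf.events[0][2] = 100;    /* velocity           */
            buf.n_events = 1;
        }

        push_downstream(&buf);
    }
    return 0;
}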
--
nixx at nixx.org.uk | amSynth lead developer
JabberID: nixx at jabber.org | http://amsynthe.sf.net