[gst-devel] Midi and GStreamer

Leif Johnson leif at ambient.2y.net
Mon Jul 14 12:58:02 CEST 2003

Hi all -

It seems like GStreamer could benefit greatly from a different subclass of
GstPad, something like GstControlPad. Pads of this type could contain
control data like parameters for oscillators/filters, MIDI events, text
information for subtitles, etc. The defining characteristic of this type of
data is that it operates at a much lower sample rate than the multimedia
data that GStreamer currently handles.

GstControlPad instances could also contain a default value like Wingo has
been pondering, so apps wouldn't need to connect actual data to the pads if
the default value sufficed. It also seems like there could be some sweet
integration with dparams.

Elements that have control pads could also have standard GstPads, and I'd
imagine there would need to be some scheduler modifications to take
advantage of the lower processing demands of control pads.

Unfortunately, as is probably obvious, I don't know enough of the GStreamer
core to tell if this is a good idea or not, but I'd really appreciate
comments. This would be cool if it worked out.


On Wed, 09 Jul 2003, nick wrote:

> Hi All
> The thing I am thinking about is how a gstreamer plugin would handle
> MIDI and audio at the same time... In my mind, this requires the midi
> and audio buffers to be processed on a 1-to-1 basis (so 1 buffer of
> audio and 1 buffer of midi cover the same duration of time).. Does what
> I'm saying make sense to you?
> (For me, I would want to be able to write amSynth as a plugin - this
> would require that when my process function is called, I have a midi
> buffer as input, containing however many midi events occurred in, say,
> 1/100 sec, and then I generate an audio buffer of the same
> time duration...)
> Any ideas? Maybe this will indicate the kind of problems to be faced.
> Nick
> On Tue, 2003-07-08 at 13:43, Benjamin Otte wrote:
> > On 8 Jul 2003, Christian Fredrik Kalager Schaller wrote:
> > 
> > > GStreamer works best with near-constant data flow so a midi stream
> > > would probably have to consist mostly of filler events, sent at a constant
> > > tick-rate.
> > >
> > I don't really know why it would be necessary to have a near-constant
> > dataflow.
> > Subtitling surely works with a very non-constant dataflow and the
> > presentation from Thomas at GUADEC showed that it works.
> > 
> > The only part of GStreamer that might need filler events would be realtime
> > processing, where you want to know that nothing happened for some time.
> > But it should not be necessary for midi file reading or anything else
> > described in that post.
> > 
> > 
> > Benjamin
> -- 
> nixx at nixx.org.uk          |     amSynth lead developer
> JabberID: nixx at jabber.org |     http://amsynthe.sf.net
> -------------------------------------------------------
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel

Leif Morgan Johnson : http://ambient.2y.net/leif/
