Dataflow visual programming software for live media

JS js at drone.ws
Tue Jan 22 12:44:13 PST 2013


I'm working on a dataflow interface for live media generation, similar to
tools like Pure Data, vvvv and Isadora (VJing, sound production,
interactivity, etc.). Currently our dataflow is synchronous, but we want
to make it asynchronous and multi-threaded.

Here is an example screenshot: http://ur1.ca/cl992

We would probably choose either a push or a pull model (not both),
although we're not sure which one yet. Also, our data is strongly
typed, and we support just a few types to keep things simple for the
user (e.g. one type of audio, one type of video). Finally, we provide a
high-level C++ interface for the effects (elements) that aims to make
it easy for inexperienced developers to contribute.

My question is: could I use GStreamer as the backend (i.e. for managing
the dataflow)?

Here are the characteristics/constraints of the software:
- Qt-based GUI
- allow for many data types:
--- audio (we'll support just one format through the flow, e.g. raw
44.1 kHz float audio)
--- video (again, we'll support just one type here as well, RGBA 32bit)
--- floats and integers
--- strings
--- triggers / "bangs"
--- data structures such as vectors and maps
--- vector graphics (SVG)
--- 3d scenes and textures
--- lambdas (i.e. elements/plugins can send a copy of themselves in a
buffer; see this example: http://www.cs.bgu.ac.il/~shemeas/juks/)
--- any other user-defined datatype should be easily implementable
through a high-level interface
- composite elements (abstractions)
- possibility of changing the graph in real time:
--- unplug, plug pads
--- remove elements from the pipeline
- multi-threaded
--- we're not sure how this will be implemented: either each element
will run on its own thread, or parts of the graph will
--- elements can be moved freely from one thread to another
- low latency, especially for audio, e.g. the possibility of
prioritizing audio threads over video threads
- eventually portable to mobile platforms

Example application:

A live performance with a dancer and a video projection. The movements
of the dancer are analyzed with OpenCV, influencing both audio and
video in real time. A hardware interface worn by the dancer sends
accelerometer data over Wi-Fi as OSC messages, also influencing the
audio and video. The video is generated by a mix of video effects
applied to a camera image. The resulting image is projected as a
texture onto a 3D object, which is then rendered to the projection
screen. Finally, sentences submitted through a webpage are also sent to
the application: the text is rendered as vector objects, which are then
modified with vector effects and projected on the floor. The
application is controlled live: the performer manipulates the graph in
real time, rewiring connections between effects, playing with sliders
and generating sound with a microphone.

Thanks,

Tats



More information about the gstreamer-devel mailing list