[gst-devel] How to fit in a GPU accelerated rendering plugin

Daniel Díaz mrchapp at gmail.com
Sat Dec 15 18:23:14 CET 2007

Hello, Wladimir.

On Dec 14, 2007 10:45 AM, Wladimir van der Laan <laanwj at gmail.com> wrote:
> I have created a GPU-accelerated plugin to playback Dirac video streams.
> This generates a lot of superfluous traffic on the bus. Is there
> infrastructure in place to pass the output of a plugin as a GL texture, for
> direct rendering? Or some other recommended way to do this?

I would be interested to hear the GStreamer developers' opinions on
this. It has been discussed in the past, but I don't think it has ever
been settled on the mailing list.

One thing you can do is create your own hardware-accelerated sink and
connect your two plug-ins directly, so the decoded data never passes
through the general-purpose processor. If you go that route, you could
emit fake buffers from the filter to the sink so that the GStreamer
pipeline keeps running. This, of course, defeats GStreamer's purpose
of controlling the data flow.
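As a rough illustration of that trick, the decoder's chain function
could push a tiny buffer that carries only a GL texture handle while
the pixels stay in GPU memory (a pseudocode sketch against the
GStreamer 0.10 API; the `push_texture_handle` helper and the private
texture-id convention between your two elements are assumptions, not
anything GStreamer defines):

```c
/* Pseudocode sketch, GStreamer 0.10 style. The decoder and the custom
 * GL sink privately agree that the buffer payload is a texture id. */
static GstFlowReturn
push_texture_handle (GstPad *srcpad, GLuint texture_id)
{
  /* Allocate a tiny buffer carrying only the handle, not the frame. */
  GstBuffer *buf = gst_buffer_new_and_alloc (sizeof (GLuint));

  memcpy (GST_BUFFER_DATA (buf), &texture_id, sizeof (GLuint));

  /* Downstream, the custom sink reads the id back out and renders
   * the texture directly; the decoded pixels never leave the GPU. */
  return gst_pad_push (srcpad, buf);
}
```

Anything else in the pipeline that tried to interpret those buffers as
video data would break, which is exactly the loss of control mentioned
above.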

> I realize I've probably written the first video rendering plugin for Linux
> that is accelerated on graphics hardware, so this might get interesting.

Indeed, this is interesting. You might want to consider OpenMAX IL
(look at the diagram here):

What you want to achieve sounds to me like an OpenMAX tunnel between
components (Dirac Decoder and GL Render). On the GStreamer side, this
might be of interest to you:


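Schematically, such a tunnel would be set up with the standard IL
calls, something like this (pseudocode; the component names are
hypothetical, and the port numbers depend on the actual components):

```c
/* Pseudocode sketch of an OpenMAX IL tunnel between a Dirac decoder
 * and a GL renderer. Component names here are made up for the example;
 * OMX_GetHandle and OMX_SetupTunnel are the standard IL entry points. */
OMX_HANDLETYPE decoder, renderer;

OMX_GetHandle (&decoder,  "OMX.example.video_decoder.dirac",
               NULL, &decoder_callbacks);
OMX_GetHandle (&renderer, "OMX.example.video_render.gl",
               NULL, &renderer_callbacks);

/* Connect the decoder's output port to the renderer's input port.
 * Buffers then flow between the two components inside the IL core,
 * without passing through the application (or the CPU, if both
 * components live on the graphics hardware). */
OMX_SetupTunnel (decoder,  /* output port */ 1,
                 renderer, /* input port  */ 0);
```

That tunnel is essentially the same idea as the custom-sink approach
above, but expressed through a standard API instead of a private
convention between two GStreamer elements.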
Daniel Díaz
yosoy at danieldiaz.org

> Greetings,
> Wladimir
