Streaming an opengl texture over a network

Edward Anon blakat360 at
Tue Jun 15 20:45:04 UTC 2021

Hi Yu,

I am doing some real-time computation on multiple synced video streams.

I am currently using OpenGL compute shaders to do this. Hence, the current
frame of each stream is bound to a texture and passed to the shader. The
shader also needs access to some extra information between invocations so I
need to swap textures in and out between calls.  I want to stream the
output of this shader to another machine which will load that information
into an OpenGL texture and render it to the screen. I can't stream directly
to the screen as I am using OpenGL to render a rudimentary GUI on top of
this. I'm open to alternative solutions on this end of the pipeline.
However, for other projects, I will need to stream into a texture and use
that later so I thought it would be a good thing to learn now.
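For the receiving end, my rough mental model (completely untested; the port, payload type, and decoder choice are just placeholders) is something like:

```shell
# Untested receiver sketch: depayload RTP, parse and decode H.264, then
# upload the frames into GL memory and render with glimagesink.
# Port number and payload type are placeholders.
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtph264depay ! h264parse ! avdec_h264 \
  ! glupload ! glcolorconvert ! glimagesink
```

If I need the decoded frame as a texture for my own drawing rather than a plain sink, I'm guessing glimagesink's client-draw signal (or an appsink negotiating GLMemory caps) is the place to hook in, but I may be wrong about that.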

I've looked at the GStreamer docs for GstGLShader and I'm not sure how I
would set something like that up. There don't seem to be any signals to
change the bound textures between invocations. Also, I'm a total noob with
GStreamer, so this would be my first foray into the OpenGL plugin. Do you
have anything closer to a user guide than the quick reference that the docs
and gst-inspect provide? I have a lot more questions and don't want to waste
your time.

If the above is not possible I can modify my code so that the shader does
not need to change the textures it has bound between invocations.
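For concreteness, here is the kind of sender pipeline I imagine Yu's suggestion below maps to. This is a sketch only, and I haven't tested it: gltestsrc stands in for my real source, $SHADER_SRC is a placeholder for the GLSL fragment source, and the host/port are made up.

```shell
# Untested sender sketch: run a fragment shader on frames that stay in
# GLMemory, convert, encode with NVEnc, and packetize as RTP over UDP.
# gltestsrc, $SHADER_SRC, and the host/port are all placeholders.
gst-launch-1.0 gltestsrc \
  ! glshader fragment="$SHADER_SRC" \
  ! glcolorconvert ! nvh264enc \
  ! h264parse ! rtph264pay \
  ! udpsink host=192.168.0.2 port=5000
```

My (possibly wrong) understanding is that nvh264enc can negotiate GLMemory caps directly, so nothing should be downloaded to system memory before encoding; please correct me if that's not how the negotiation works.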


On Tue, 15 Jun 2021, 20:43 Yu You, <youyu.youyu at> wrote:

> It is not clear what you are trying to do exactly. But you can use
> glshader for GLSL shading, keep the same memory type (no need to
> gldownload etc.) into a HW encoder like NVEnc (AVC or HEVC), and then
> packetize the encoded byte stream using the corresponding RTP payloaders.
> Regards,
> Yu
> On Mon, 14 Jun 2021, 15:40 Edward Anon via gstreamer-devel, <
> gstreamer-devel at> wrote:
>> Hi all,
>> I want to stream an OpenGL texture over a network.
>> I'm struggling with getting the texture into a GStreamer pipeline. I've
>> seen other users resort to using appsink and appsrc elements, but I want to
>> keep the texture in the GPU and do encoding and decoding there.
>> I've looked at the website and at the source (in gst-plugins-base) for the
>> OpenGL plugin documentation. However, there seems to be no documentation
>> beyond pads and signals, making it very difficult to learn how to use the
>> plugin.
>> Any pointers to relevant docs/resources would be much appreciated, and
>> working code would be too good to be true ;)
>> Thanks!
