[Libva] gst-vaapi, uploading to own textures

Dolevo Jay cmst at live.com
Wed Sep 30 10:30:02 PDT 2015


Hi,
I have decided to go for the glimagesink path, and I have got the basic pipeline up and running with GStreamer 1.6 on my platform (see the example pipeline after these questions). Here are my next questions:

1. We use triple buffering in our application. As far as I can tell from the gst-vaapi code, the textures are created in the gl_create_texture function in gstvaapiutils_glx.c. However, I couldn't find out where this is actually used in gstvaapidecode. Can anybody help?
2. I see in the gstglimagesink code that next_buffer and next_buffer2, or similarly stored_buffer[0] and stored_buffer[1], are used in many places. Are we using double buffering there?
3. Is my understanding correct that vaapidecode creates the textures and uploads them, and gstglimagesink draws them?
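
For reference, this is the kind of pipeline I now have running (a sketch assuming the same clip as in my first mail; only the sink differs from the vaapisink line below):

    gst-launch-1.0 filesrc location=/store/1.mp4 ! qtdemux ! vaapidecode ! glimagesink
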
Thanks,
Regards
> Date: Tue, 29 Sep 2015 18:11:47 +0200
> From: vjaquez at igalia.com
> To: libva at lists.freedesktop.org
> Subject: Re: [Libva] gst-vaapi, uploading to own textures
> 
> You can either use clutter-gst or glimagesink (if you are using GStreamer
> 1.6), or craft your own video sink.
> 
> The simplest way, if you use Clutter, is to embed a cluttervideosink actor in
> your code.
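> 
> With clutter-gst 3.0 that is roughly the following (a sketch from memory of
> the clutter-gst examples, using the playback convenience API rather than a
> raw sink; check the clutter-gst headers for the exact names):
> 
>     ClutterGstPlayback *player = clutter_gst_playback_new ();
>     ClutterActor *video = clutter_actor_new ();
> 
>     /* an aspect-ratio-preserving ClutterContent fed by the player */
>     clutter_actor_set_content (video,
>         g_object_new (CLUTTER_GST_TYPE_ASPECTRATIO, "player", player, NULL));
> 
>     clutter_gst_playback_set_uri (player, "file:///store/1.mp4");
>     clutter_gst_player_set_playing (CLUTTER_GST_PLAYER (player), TRUE);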
> 
> If you write your own video sink, you would handle the GstVideoGLTextureUpload
> meta:
> 
> http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-libs/html/gst-plugins-base-libs-gstvideometa.html#GstVideoGLTextureUpload
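> 
> In the sink's render function that looks roughly like this (a minimal sketch;
> 'my_texture' stands for a GL texture you allocated yourself, your GL context
> must be current, and error handling is omitted):
> 
>     GstVideoGLTextureUploadMeta *meta;
>     guint texture_id[4] = { my_texture, 0, 0, 0 };
> 
>     meta = gst_buffer_get_video_gl_texture_upload_meta (buffer);
>     if (meta != NULL) {
>       /* ask the upstream element to upload the frame into our texture */
>       if (!gst_video_gl_texture_upload_meta_upload (meta, texture_id))
>         return GST_FLOW_ERROR;
>     }
> 
> (Your sink also has to advertise GST_VIDEO_GL_TEXTURE_UPLOAD_META_API_TYPE in
> the ALLOCATION query so that upstream attaches the meta.)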
> 
> WebKitGTK+ does that with its own video sink (though it is moving to use
> glimagesink).
> 
> 
> In the case of using glimagesink, you can connect to the 'client-draw' signal
> and render the given texture:
> 
> http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/tests/examples/gl/sdl/sdlshare2.c?h=1.6
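> 
> A rough sketch of such a callback, condensed from that example (with GStreamer
> 1.6 the signal hands you a GstGLContext and a GstSample; error handling is
> trimmed):
> 
>     static gboolean
>     on_client_draw (GstElement * sink, GstGLContext * context,
>         GstSample * sample, gpointer user_data)
>     {
>       GstVideoFrame v_frame;
>       GstVideoInfo v_info;
>       GstBuffer *buf = gst_sample_get_buffer (sample);
>       GstCaps *caps = gst_sample_get_caps (sample);
> 
>       gst_video_info_from_caps (&v_info, caps);
>       if (gst_video_frame_map (&v_frame, &v_info, buf,
>               GST_MAP_READ | GST_MAP_GL)) {
>         guint texture = *(guint *) v_frame.data[0];
>         /* render 'texture' with your own GL code here */
>         gst_video_frame_unmap (&v_frame);
>       }
>       return TRUE;  /* TRUE: we did the drawing ourselves */
>     }
> 
>     g_signal_connect (sink, "client-draw", G_CALLBACK (on_client_draw), NULL);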
> 
> There's another option, juggling with the GL context:
> 
> http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/tests/examples/gl/sdl/sdlshare.c?h=1.6
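> 
> In essence that example answers the GST_MESSAGE_NEED_CONTEXT bus message with
> a context wrapping the application's own GL context. Condensed (assuming
> 'app_gl_context' is a GstGLContext you created around your native context with
> gst_gl_context_new_wrapped(); a similar reply is needed for
> GST_GL_DISPLAY_CONTEXT_TYPE):
> 
>     static GstBusSyncReply
>     bus_sync_handler (GstBus * bus, GstMessage * msg, gpointer user_data)
>     {
>       if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_NEED_CONTEXT) {
>         const gchar *context_type;
> 
>         gst_message_parse_context_type (msg, &context_type);
>         if (g_strcmp0 (context_type, "gst.gl.app_context") == 0) {
>           GstContext *ctx = gst_context_new ("gst.gl.app_context", TRUE);
>           GstStructure *s = gst_context_writable_structure (ctx);
> 
>           /* app_gl_context: the GstGLContext wrapping your native context */
>           gst_structure_set (s, "context", GST_GL_TYPE_CONTEXT,
>               app_gl_context, NULL);
>           gst_element_set_context (GST_ELEMENT (msg->src), ctx);
>           gst_context_unref (ctx);
>         }
>       }
>       return GST_BUS_PASS;
>     }
> 
> The handler is installed with gst_bus_set_sync_handler(), since the query
> arrives on a streaming thread.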
> 
> 
> vmjl
> 
> On 09/29/15 at 12:38pm, Dolevo Jay wrote:
> > Hi,
> > In my device, I receive video streams from different sources (local or
> > remote). After decoding them, I show them on the display using OpenGL. So
> > far, all decoding was done in software: I was receiving RGB frames from each
> > source and uploading them to certain textures to render them. The sources
> > are now decoded in hardware using gstreamer-vaapi. An example gst-launch
> > line is as follows:
> > 
> >     gst-launch-1.0 filesrc location=/store/1.mp4 ! qtdemux ! vaapidecode ! vaapisink display=2
> > 
> > This works great. However, as you might imagine, vaapisink creates its own
> > window and draws the decoded frames onto it. What I would like to do is feed
> > the textures that I created in my application to the vaapidecode or vaapisink
> > element, so that the rendering can happen in my canvas. I have been digging
> > into the vaapidecode and vaapisink elements to see where the textures are
> > uploaded, but couldn't spot the exact line to feed my texture info into.
> > Could anyone help me? A function name, a line number, or any hint would be
> > greatly appreciated.
> > Thanks,
> 
> _______________________________________________
> Libva mailing list
> Libva at lists.freedesktop.org
> http://lists.freedesktop.org/mailman/listinfo/libva