VAAPI + meta:GstVideoGLTextureUploadMeta

Víctor M. Jáquez L. vjaquez at igalia.com
Mon Jan 18 02:32:43 PST 2016


Hi Christophe

On 01/17/16 at 11:32am, Christophe Lafolet wrote:
> Hello,
> 
> I'm having trouble with VAAPI and OpenGL :) 
> 
> I'm developing a Java OpenGL application with dynamic streams and use
> DecodeBin + AppSink to be notified of buffer reception.
> 
> 
> No problem when I disable VAAPI plugin selection and do texture upload
> myself.
> 
> Now I want to use hardware decoding 
> 
> It's not clear to me which API I should use to share an OpenGL context with
> VAAPI.
> 
> - in this post :
> http://ystreet00.blogspot.com.au/2015/09/gstreamer-16-and-opengl-contexts.html
> the new way should be to set a context on GST_MESSAGE_NEED_CONTEXT reception,
> but the VAAPI plugin seems to only verify that the given context contains a
> VAAPI display and delegates to element_class->set_context(); this callback
> is unset for VaapiDecode, so no context is saved.
> Question: why do I need to add a VAAPI display in this context? DecodeBin
> is a black box for me and I'm surprised to have to specify a VAAPI display.

Chaining up the set_context() virtual method to the parent class is a
requirement since GStreamer 1.7:
https://bugzilla.gnome.org/show_bug.cgi?id=757598#c2

But you are using GStreamer 1.6, which is why that method is not resolved by
the GstElement class.
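For reference, the chain-up looks roughly like this in an element subclass.
This is only a sketch: the my_decoder_* names are illustrative, not taken from
gstreamer-vaapi.

```c
/* Sketch of a set_context() implementation that chains up to the parent
 * class, as required since GStreamer 1.7.  All my_decoder_* names are
 * hypothetical. */
static void
my_decoder_set_context (GstElement * element, GstContext * context)
{
  /* Inspect the context for whatever display type the element needs... */

  /* ...then always chain up so the base GstElement caches the context. */
  GST_ELEMENT_CLASS (my_decoder_parent_class)->set_context (element, context);
}
```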

gstvaapidecode negotiates the GstGLContext in the allocation query because it
needs to know which display backend (GLX or EGL) to use in the
GstVideoGLTextureUploadMeta callbacks. But it doesn't hold the context: the
meta callbacks are going to be called in the rendering context.

That's a good catch, though: perhaps vaapidecode should request a GstGLContext
through the GstContext mechanism, both to learn the display backend earlier and
to avoid the propose_allocation() function in the appsink. Nonetheless, that
would not solve your problem.
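If you do want to hand your wrapped GL context to the pipeline through the
GstContext mechanism, a bus sync handler is the usual pattern. Treat this as a
hedged sketch: the "gst.gl.app_context" type name and "context" structure field
follow the convention described in the blog post you linked, GST_GL_TYPE_CONTEXT
is the 1.6-era spelling of the type macro, and gl_display / app_glx_context are
assumed to already exist in your application.

```c
/* Sketch: answering GST_MESSAGE_NEED_CONTEXT with a GstGLContext wrapped
 * around the application's GLX context.  gl_display and app_glx_context
 * are assumed to exist; check the context-type and field names against
 * your GStreamer version. */
static GstBusSyncReply
bus_sync_handler (GstBus * bus, GstMessage * msg, gpointer user_data)
{
  const gchar *type;

  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_NEED_CONTEXT) {
    gst_message_parse_context_type (msg, &type);
    if (g_strcmp0 (type, "gst.gl.app_context") == 0) {
      GstGLContext *wrapped = gst_gl_context_new_wrapped (gl_display,
          (guintptr) app_glx_context, GST_GL_PLATFORM_GLX, GST_GL_API_OPENGL);
      GstContext *context = gst_context_new ("gst.gl.app_context", TRUE);
      GstStructure *s = gst_context_writable_structure (context);

      gst_structure_set (s, "context", GST_GL_TYPE_CONTEXT, wrapped, NULL);
      gst_element_set_context (GST_ELEMENT (GST_MESSAGE_SRC (msg)), context);
      gst_context_unref (context);
    }
  }
  return GST_BUS_PASS;
}
```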

The mechanism explained in Matthew's blog post applies when you use glimagesink
as a substitute for appsink: its 'client-draw' signal hands you a GstBuffer
with a texture to render.

This mechanism is used (experimentally) in WebKitGTK+:

https://github.com/WebKit/webkit/blob/master/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp#L727
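For illustration, hooking that signal looks roughly like this. Another hedged
sketch: the callback signature matches what GStreamer 1.6's glimagesink
documents, and the GST_MAP_GL mapping to read the texture id should be verified
against your version.

```c
/* Sketch: rendering from glimagesink's "client-draw" signal.  The signal
 * signature follows GStreamer 1.6; verify against your version's docs. */
static gboolean
on_client_draw (GstElement * sink, GstGLContext * context,
    GstSample * sample, gpointer user_data)
{
  GstBuffer *buffer = gst_sample_get_buffer (sample);
  GstVideoInfo info;
  GstVideoFrame frame;

  gst_video_info_from_caps (&info, gst_sample_get_caps (sample));
  if (gst_video_frame_map (&frame, &info, buffer, GST_MAP_READ | GST_MAP_GL)) {
    guint texture = *(guint *) frame.data[0];
    /* ... bind `texture` and issue your own GL draw calls here ... */
    gst_video_frame_unmap (&frame);
  }
  return TRUE;  /* TRUE: we handled the drawing ourselves. */
}

/* Connected with:
 *   g_signal_connect (sink, "client-draw", G_CALLBACK (on_client_draw), NULL);
 */
```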


> - the old way seems to use allocation query :
> AppSink does not provide a configurable "signal" to add a callback handling
> the allocation query: I update the AppSink.propose_allocation field directly
> from Java with my callback and it works...   yes
> So, I add GstVideoGLTextureUploadMeta to the allocation query:
> "gst.gl.GstGLContext" with a gl_context built with
> gst_gl_context_new_wrapped(); VAAPI detects my parameters
> (GST_GL_PLATFORM_GLX, GST_GL_API_OPENGL), 
> but a SIGSEGV is generated later.
> I receive samples containing no raw data (OK), a VideoGLTextureUploadMeta
> with one texture (OK) but can't upload

That's the way it should work.

The meta callbacks should be called under the application's GL context, and
they will upload the buffer into the texture:

GstVideoGLTextureUploadMeta *meta;

if ((meta = gst_buffer_get_video_gl_texture_upload_meta (buffer))) {
  if (meta->n_textures == 1) {  /* BGRx & BGRA formats use only one texture. */
    guint ids[4] = { texture_id, 0, 0, 0 };  /* texture_id: your GL texture */
    if (gst_video_gl_texture_upload_meta_upload (meta, ids))
      return;
  }
}

Which VAAPI backend are you using? Somehow I guess it is the VDPAU one.

Can you open a bug with the backtrace?

> Is there a example to use VAAPI + meta:GstVideoGLTextureUploadMeta ?

The element gstglupload does a good job:

http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/gst-libs/gst/gl/gstglupload.c

> 
> 
> My config : 
> Linux + X11 + GLX
> gstreamer 1.6.2-1
> gstreamer-vaapi 0.7.0
> gstopengl not used.


vmjl
