<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 TRANSITIONAL//EN">
<HTML>
<HEAD>
<META HTTP-EQUIV="Content-Type" CONTENT="text/html; CHARSET=UTF-8">
<META NAME="GENERATOR" CONTENT="GtkHTML/4.6.2">
</HEAD>
<BODY>
On Tuesday, 12 March 2013 at 00:13 -0700, Jorge wrote:
<BLOCKQUOTE TYPE=CITE>
<PRE>
Hi Nicolas, thanks for answering!
Could you clarify a little more about the roles of GstVideoContext and
GstSurface? I would like to know what is going on.
</PRE>
</BLOCKQUOTE>
<BR>
The GstVideoContext allows multiple threads to access and exchange buffers on the same display. The context is first created by a sink (for static pipelines) or by a video decoder (for dynamic pipelines like playbin2) and shared with neighbouring elements at run-time. Because of the nature of GLX, there is a mechanism to share an X11 display with the VA elements.<BR>
<BR>
The GstSurfaceBuffer is a subclass of GstBuffer (it was introduced before GstMeta and/or qdata existed). It provides extra information (like the context) and allows creating a converter.<BR>
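<BR>
As an illustration of the converter part, here is a rough sketch loosely modelled on what ClutterGstVideoSink does; the "opengl" converter type, the texture-id GValue and the function names around it are assumptions taken from that sink, not something this mail defines:<BR>
<PRE>
/* Sketch: create a converter that can later upload a GstSurfaceBuffer
 * into an existing GL texture (texture id passed as a G_TYPE_UINT GValue). */
#include &lt;gst/gst.h&gt;
#include &lt;gst/video/gstsurfacebuffer.h&gt;
#include &lt;gst/video/gstsurfaceconverter.h&gt;

static GstSurfaceConverter *
create_gl_converter (GstSurfaceBuffer *surface, guint gl_texture_id)
{
  GValue value = { 0, };

  g_value_init (&amp;value, G_TYPE_UINT);
  g_value_set_uint (&amp;value, gl_texture_id);

  /* The buffer itself knows how to build a converter of the requested type */
  return gst_surface_buffer_create_converter (surface, "opengl", &amp;value);
}

/* Then, for each decoded frame:
 *   gst_surface_converter_upload (converter, surface);
 */
</PRE>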
<BR>
<BLOCKQUOTE TYPE=CITE>
<PRE>
About the GstVideoContext: in my little X11 application, where I open the
display, I call 'gst_bus_set_sync_handler'
in order to call 'gst_video_context_set_context_pointer' and set the same
display. But is that needed in order to create
an OpenGL context on the vaapi side?
</PRE>
</BLOCKQUOTE>
<BR>
On the application side, you should watch for the <TT>"prepare-video-context"</TT> message on the sync bus and set the appropriate context. Depending on your application, you could set an <TT>"x11-display"</TT> (used for GLX) or a <TT>"va-display"</TT> if you already use VA in your application. It's needed with some OpenGL drivers (notably Intel) that do not allow DRI buffers to be shared between display instances (even if the display is the same).<BR>
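<BR>
To make that concrete, here is a minimal sketch of such a sync handler, assuming you already open the X11 display yourself; the handler and variable names are only for the example, the calls are the 0.10 videocontext API:<BR>
<PRE>
/* Sketch: sync bus handler that hands an already-open X11 Display to the
 * element asking for a video context. */
#include &lt;X11/Xlib.h&gt;
#include &lt;gst/gst.h&gt;
#include &lt;gst/video/videocontext.h&gt;

static GstBusSyncReply
bus_sync_handler (GstBus *bus, GstMessage *msg, gpointer user_data)
{
  Display *x11_display = user_data;   /* the display your app already opened */

  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_ELEMENT &amp;&amp;
      gst_message_has_name (msg, "prepare-video-context")) {
    /* The message source is the element implementing GstVideoContext */
    GstVideoContext *context = GST_VIDEO_CONTEXT (GST_MESSAGE_SRC (msg));

    /* Hand over the same X11 display so GLX/DRI buffers can be shared */
    gst_video_context_set_context_pointer (context, "x11-display",
        x11_display);

    gst_message_unref (msg);
    return GST_BUS_DROP;
  }

  return GST_BUS_PASS;
}

/* Installed once on the pipeline's bus:
 *   gst_bus_set_sync_handler (bus, bus_sync_handler, x11_display);
 */
</PRE>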
<BR>
<BLOCKQUOTE TYPE=CITE>
<PRE>
And about the GstSurface... are you talking about GstSurfaceBuffer? How does
this arrive at my application?
If I use a playbin2 pipeline, do I need to implement a video sink, like
ClutterGstVideoSink, in order to receive these
buffers, or is vaapisink able to export them?
</PRE>
</BLOCKQUOTE>
<BR>
Yes, it's GstSurfaceBuffer. You may want to use an appsink to get the buffers into your application. You'll have to set the appropriate caps (video/x-surface), otherwise the VA decoder will be skipped. If your intention is not to be 100% HW accelerated, your appsink should also support at least one raw video format. A buffer with "video/x-surface" caps can be cast to GstSurfaceBuffer.<BR>
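<BR>
A rough sketch of that appsink setup; the caps string, the I420 raw fallback and the function names are assumptions for the example, the appsink and caps calls are standard 0.10 API:<BR>
<PRE>
/* Sketch: appsink caps that accept the hardware surface caps plus a raw
 * fallback, and a handler that casts the buffer when it is a surface. */
#include &lt;gst/gst.h&gt;
#include &lt;gst/app/gstappsink.h&gt;
#include &lt;gst/video/gstsurfacebuffer.h&gt;

static void
setup_surface_appsink (GstElement *appsink)
{
  /* video/x-surface so the VA decoder is not skipped, raw video as fallback */
  GstCaps *caps = gst_caps_from_string (
      "video/x-surface; video/x-raw-yuv, format=(fourcc)I420");

  g_object_set (appsink, "caps", caps, NULL);
  gst_caps_unref (caps);
}

static void
handle_new_buffer (GstElement *appsink)
{
  GstBuffer *buffer = gst_app_sink_pull_buffer (GST_APP_SINK (appsink));
  GstStructure *s;

  if (buffer == NULL)
    return;   /* EOS */

  s = gst_caps_get_structure (GST_BUFFER_CAPS (buffer), 0);

  if (gst_structure_has_name (s, "video/x-surface")) {
    /* Hardware path: the buffer really is a GstSurfaceBuffer */
    GstSurfaceBuffer *surface = GST_SURFACE_BUFFER (buffer);
    /* ... create/use a GstSurfaceConverter here ... */
    (void) surface;
  } else {
    /* Raw video fallback path */
  }

  gst_buffer_unref (buffer);
}
</PRE>
With playbin2 you would typically set this appsink on the "video-sink" property.<BR>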
<BLOCKQUOTE TYPE=CITE>
<PRE>
And my last one: are the Clutter libraries needed? I would like to minimize
the number of libraries on my custom
system; I would like to avoid pango, cogl, clutter, etc., and keep just the
basic ones.
</PRE>
</BLOCKQUOTE>
<BR>
These interfaces have no dependencies of their own; any GL code is implemented in the VA elements. Finally, have a look at the documentation: <A HREF="http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/gst-libs/gst/video/videocontext.c?h=0.10">http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/gst-libs/gst/video/videocontext.c?h=0.10</A><BR>
<A HREF="http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/gst-libs/gst/video/gstsurfacebuffer.c?h=0.10">http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/gst-libs/gst/video/gstsurfacebuffer.c?h=0.10</A><BR>
<A HREF="http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/gst-libs/gst/video/gstsurfaceconverter.c?h=0.10">http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/gst-libs/gst/video/gstsurfaceconverter.c?h=0.10</A><BR>
<BR>
Nicolas
</BODY>
</HTML>