[Libva] Opengl surface to VaSurface

Gwenole Beauchesne gb.devel at gmail.com
Mon Apr 23 22:06:25 PDT 2012


Hi,

2012/1/5 Sergey Omelchenko <arieserg at gmail.com>:
> Is it possible to send an OpenGL pre-rendered surface to a VASurface
> for encoding without CPU and system memory usage (it's very slow)?
>
> Planned scheme: H.264 stream -> VAAPI decoder -> VASurface -> OpenGL
> surface -> OpenGL post-processing -> VASurface -> VAAPI encoder ->
> H.264 stream.

There is work in progress on this use case, through EGL extensions.
However, since it requires a few, though not extensive, changes to the
GL stack, it will only be implemented in the open-source Intel GenX
drivers. Basically, for your use case, you would create a pool of VA
surfaces for decoding plus spare VA surfaces for encoding, along the
lines of the sketch below. I have not considered importing GL textures
into VA surfaces yet. That's the next item, for the GL -> VA part at
least.
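As a minimal sketch of that allocation, assuming the vaCreateSurfaces()
prototype libva had at the time (width/height/format first; newer
releases use a different signature) and placeholder pool sizes:

#include <va/va.h>

/* Placeholder pool sizes; size them for your stream (DPB size,
   encoder lookahead, etc.). */
#define NUM_DECODE_SURFACES 8
#define NUM_ENCODE_SURFACES 4

static VAStatus
alloc_surfaces(VADisplay dpy, int width, int height,
               VASurfaceID *decode_pool, VASurfaceID *encode_pool)
{
    VAStatus status;

    /* Pool shared with the decoder (reference frames + output). */
    status = vaCreateSurfaces(dpy, width, height, VA_RT_FORMAT_YUV420,
                              NUM_DECODE_SURFACES, decode_pool);
    if (status != VA_STATUS_SUCCESS)
        return status;

    /* Spare surfaces that receive the GL post-processed frames and
       are then fed to the encoder. */
    return vaCreateSurfaces(dpy, width, height, VA_RT_FORMAT_YUV420,
                            NUM_ENCODE_SURFACES, encode_pool);
}

The missing piece is then getting those surfaces into GL, which the
proposed entry points below are meant to cover: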

struct VABufferEGL; /* opaque type, defined by each implementation */
vaGetSurfaceBufferEGL()
vaGetImageBufferEGL()

vaGetSurfaceBufferEGL() creates a buffer suitable for
eglCreateImageKHR(). In my current model, VABufferEGL is in fact the
EGLClientBuffer value you pass to that function; you then select the
desired plane with a dedicated EGL attribute. So, for NV12 surfaces
for example, while a single opaque buffer is the medium, you still
need two EGLImages, and thus two GL textures, to get the Y and UV
components.
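To make the plane handling concrete, here is a rough sketch of what
client code could look like. Only eglCreateImageKHR() and
glEGLImageTargetTexture2DOES() are existing extension entry points;
vaGetSurfaceBufferEGL() is the proposed API above, and the
EGL_VA_BUFFER_PLACEHOLDER / EGL_VA_PLANE_PLACEHOLDER tokens are
made-up names standing in for the not-yet-defined target and plane
attribute:

#define EGL_EGLEXT_PROTOTYPES
#define GL_GLEXT_PROTOTYPES
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

/* Hypothetical tokens; the real extension will define its own. */
#define EGL_VA_BUFFER_PLACEHOLDER 0x6000
#define EGL_VA_PLANE_PLACEHOLDER  0x6001

static EGLImageKHR
image_for_plane(EGLDisplay egl_dpy, EGLClientBuffer va_buf, EGLint plane)
{
    const EGLint attribs[] = { EGL_VA_PLANE_PLACEHOLDER, plane, EGL_NONE };

    return eglCreateImageKHR(egl_dpy, EGL_NO_CONTEXT,
                             EGL_VA_BUFFER_PLACEHOLDER, va_buf, attribs);
}

/* NV12: one EGLImage (hence one GL texture) for Y, another one for
   the interleaved UV plane. */
static void
bind_nv12_planes(EGLDisplay egl_dpy, EGLClientBuffer va_buf,
                 GLuint y_tex, GLuint uv_tex)
{
    EGLImageKHR y_image  = image_for_plane(egl_dpy, va_buf, 0);
    EGLImageKHR uv_image = image_for_plane(egl_dpy, va_buf, 1);

    glBindTexture(GL_TEXTURE_2D, y_tex);
    glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, (GLeglImageOES)y_image);

    glBindTexture(GL_TEXTURE_2D, uv_tex);
    glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, (GLeglImageOES)uv_image);
}

A fragment shader would then sample Y from one texture and UV from the
other to reconstruct the pixels.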

I am working on the Mesa bits. Once they are available upstream, the
libva side will be populated next.

Note: VA/GLX was designed as a means to get a VA surface into an RGBA
GL texture on all platforms, even those with proprietary drivers.
However, this came at a severe cost: rendering is limited to RGBA
textures, i.e. there is no real sharing of VA surface buffers with the
GL stack. VA/EGL will address this issue; your API can then be GLES2
or plain desktop GL. Downside: only Intel GenX is supported, since
that's the only driver I have sources for. :)
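For reference, the existing VA/GLX path looks roughly like this: a
minimal sketch using the entry points from <va/va_glx.h>, assuming a
current GLX context, a decoded VASurfaceID and an allocated GL
texture, with error handling omitted:

#include <GL/gl.h>
#include <va/va.h>
#include <va/va_glx.h>

/* Copy one decoded VA surface into an RGBA GL texture through VA/GLX.
   This is a copy plus color conversion, not buffer sharing. */
static void
copy_surface_to_texture(VADisplay va_dpy, VASurfaceID surface, GLuint texture)
{
    void *gl_surface;

    /* Associate the GL texture with a VA/GLX surface object. */
    vaCreateSurfaceGLX(va_dpy, GL_TEXTURE_2D, texture, &gl_surface);

    /* Wait for decoding to finish, then copy and convert to RGBA. */
    vaSyncSurface(va_dpy, surface);
    vaCopySurfaceGLX(va_dpy, gl_surface, surface, VA_FRAME_PICTURE);

    vaDestroySurfaceGLX(va_dpy, gl_surface);
}

That copy/conversion step is precisely the cost that VA/EGL buffer
sharing is meant to remove.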

Regards,
Gwenole.

