[Libva] Encode from texture source?

Gwenole Beauchesne gb.devel at gmail.com
Wed Jul 31 20:37:50 PDT 2013


Hi Pawel,

2013/8/1 Pawel Osciak <posciak at chromium.org>:

> Is there any way to do a zero-copy encode/video processing from a texture
> source? This would I guess involve turning a texture/X drawable into a
> VASurface? I think Gwenole mentioned some extensions he's been working on
> back in April 2012, but I haven't found any of this in the code or
> examples...

The Intel driver supports GEM buffer and dma_buf imports. So, if
you can expose a GEM buffer or a dma_buf from a texture, you can
create a VA surface from it. Then you can use VA/VPP to convert it
to tiled NV12 and kick off encoding from there. This is what we do
in Weston, for example, to encode from the EGLSurface.

I could dig up some ancient code at the office if you want, with no
guarantee it still works. :)

Note: the recommended way to encode is through some middleware
(libgstvaapi, OMX, Media SDK), unless you really want to spend time
tuning the encoding process with raw libva.

Regards,
Gwenole.
