[Libva] VAAPI issues with OpenGL 3.1 and above

Cyril Drouet cyril.drouet at immerex.com
Tue Mar 1 17:47:03 UTC 2016


Hello,

I successfully implemented hardware decoding with VAAPI via FFmpeg by copying the decoded data back to CPU memory; however, when I tried to use the data directly on the GPU (instead of copying it back) by using VA/GLX to convert the decoded VASurfaces to OpenGL textures, I ran into some issues. With the latest version of libva, vaCreateSurfaceGLX always fails when I create an OpenGL 3.1 (or later) context; with a 3.0 context it succeeds and everything works correctly.

I downloaded the libva sources and found that the failure happens in the GL extension check, which relies on glGetString(GL_EXTENSIONS); that query was deprecated in OpenGL 3.0 and removed from core profiles in 3.1, so the check fails on newer contexts. Is that something you could fix upstream so that VA/GLX works with OpenGL 3.1 and above in the official version? I have implemented a fix on my end (which I can send if you'd like), so it is not a big issue, but I'd rather use the official release.
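
For what it's worth, my workaround boils down to replacing the glGetString(GL_EXTENSIONS) lookup with a glGetStringi() loop. Below is a rough sketch of that idea; the helper name and the way it would be wired into libva are only illustrative, not the actual libva code:

#include <string.h>
#include <GL/gl.h>
#include <GL/glx.h>

#ifndef GL_NUM_EXTENSIONS
#define GL_NUM_EXTENSIONS 0x821D
#endif

typedef const GLubyte * (APIENTRY *GetStringiFn)(GLenum name, GLuint index);

static int has_gl_extension(const char *name)
{
    /* glGetStringi() is core since OpenGL 3.0 but, like any post-1.1 entry
       point, still has to be resolved at run time through GLX. */
    GetStringiFn get_stringi =
        (GetStringiFn)glXGetProcAddressARB((const GLubyte *)"glGetStringi");
    GLint i, num_extensions = 0;

    if (!get_stringi)
        return 0;

    /* Core profiles (3.1+) drop glGetString(GL_EXTENSIONS), so the list is
       walked one entry at a time instead. */
    glGetIntegerv(GL_NUM_EXTENSIONS, &num_extensions);
    for (i = 0; i < num_extensions; i++) {
        const char *ext = (const char *)get_stringi(GL_EXTENSIONS, (GLuint)i);
        if (ext && strcmp(ext, name) == 0)
            return 1;
    }
    return 0;
}

With a check along those lines, the extension test no longer depends on glGetString(GL_EXTENSIONS) and works the same on 3.0 and 3.1+ contexts.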

Thanks,
Cyril


