Hi Cyril,

Could you file a bug at freedesktop.org? You can attach your patch
there too, along with details on the issue.

https://bugs.freedesktop.org/enter_bug.cgi?product=libva


Thanks,

Sean

On Tue, 2016-03-01 at 09:47 -0800, Cyril Drouet wrote:
> Hello,
>
> I successfully implemented hardware decoding with VAAPI via FFmpeg by
> copying the decoded data back to CPU memory; however, when I tried to
> use the data directly on the GPU (instead of copying it back) by
> using VA/GLX to convert the decoded VASurfaces to OpenGL textures, I
> ran into an issue. With the latest version of libva,
> vaCreateSurfaceGLX always fails when I request an OpenGL 3.1 (or
> later) context; if I request 3.0, it succeeds and everything works
> correctly.
>
> I downloaded the libva sources: the call fails when checking for GL
> extensions, because it uses glGetString(GL_EXTENSIONS), which was
> deprecated in OpenGL 3.0 and removed from core profiles in 3.1. Is
> that something you could fix upstream? I have implemented a fix on my
> end (which I can send if you'd like), so it is not a big issue, but
> I'd rather use the official version.
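>
> For reference, the change I made is essentially the following kind of
> check (a sketch of the idea rather than my exact patch; in a 3.1+
> core context the extension list has to be enumerated entry by entry
> with glGetStringi instead of queried as one string with glGetString):
>
>     /* Core-profile-safe GL extension check (illustrative sketch,
>      * not the actual libva code). glGetString(GL_EXTENSIONS) is
>      * gone in GL 3.1 core, so extensions must be enumerated one by
>      * one with glGetStringi(). */
>     #include <string.h>
>     #include <GL/gl.h>
>     #include <GL/glext.h>
>     #include <GL/glx.h>
>
>     static int has_gl_extension(const char *name)
>     {
>         /* glGetStringi is GL 3.0+, so resolve it at runtime. */
>         PFNGLGETSTRINGIPROC get_string_i = (PFNGLGETSTRINGIPROC)
>             glXGetProcAddress((const GLubyte *)"glGetStringi");
>         GLint i, num_extensions = 0;
>
>         if (!get_string_i)
>             return 0;
>
>         glGetIntegerv(GL_NUM_EXTENSIONS, &num_extensions);
>         for (i = 0; i < num_extensions; i++) {
>             const char *ext =
>                 (const char *)get_string_i(GL_EXTENSIONS, (GLuint)i);
>             if (ext && strcmp(ext, name) == 0)
>                 return 1;
>         }
>         return 0;
>     }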
>
> Thanks,
> Cyril
> _______________________________________________
> Libva mailing list
> Libva@lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/libva