<div dir="ltr"><div><div><font><span style="font-size:10pt">I have the need to take the
output of a fullscreen application that uses the <span class="">OpenGL</span>
API and encode it to H.264
and stream it. I
see VAAPI with Ivy Bridge supports H.264 encode but I don't see any
explicit examples that demonstrate encoding the output of an
application. <br><br></span></font></div><div><font><span style="font-size:10pt">I'm working on an Ivy Bridge based i7 and will be using the embedded graphics for <span class="">OpenGL</span> rendering at 1280x720.<br>
I can use OpenGL or OpenGL ES, and would prefer not to need X since there is no need for windowing. It seems I may be able to use EGL through Mesa to manage rendering as well.
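For reference, this is roughly how I expect to bring up a headless context. It's only a sketch: it assumes Mesa exposes the EGL_MESA_platform_gbm and EGL_KHR_surfaceless_context extensions, that /dev/dri/card0 is the Intel device, and it omits error checking and the actual FBO rendering.

#include <fcntl.h>
#include <gbm.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>

int main(void)
{
    /* Open the DRM device directly; no X server involved. */
    int fd = open("/dev/dri/card0", O_RDWR);
    struct gbm_device *gbm = gbm_create_device(fd);

    /* EGL_MESA_platform_gbm: get an EGLDisplay from the GBM device. */
    PFNEGLGETPLATFORMDISPLAYEXTPROC get_platform_display =
        (PFNEGLGETPLATFORMDISPLAYEXTPROC)
            eglGetProcAddress("eglGetPlatformDisplayEXT");
    EGLDisplay dpy = get_platform_display(EGL_PLATFORM_GBM_MESA, gbm, NULL);
    eglInitialize(dpy, NULL, NULL);
    eglBindAPI(EGL_OPENGL_ES_API);

    static const EGLint cfg_attribs[] = {
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_NONE
    };
    EGLConfig cfg;
    EGLint n;
    eglChooseConfig(dpy, cfg_attribs, &cfg, 1, &n);

    static const EGLint ctx_attribs[] = {
        EGL_CONTEXT_CLIENT_VERSION, 2,
        EGL_NONE
    };
    EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, ctx_attribs);

    /* EGL_KHR_surfaceless_context: no window or pbuffer surface;
     * render into a 1280x720 FBO instead. */
    eglMakeCurrent(dpy, EGL_NO_SURFACE, EGL_NO_SURFACE, ctx);

    /* ... create an FBO at 1280x720 and draw ... */

    return 0;
}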
What I don't know, though, is how to get these rendered images directly over to libva for H.264 encode and subsequent streaming.
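Right now the only path I can see is a round trip through system memory, roughly like the sketch below. It assumes vaGetDisplayDRM() from va_drm.h, uses the attribute-taking vaCreateSurfaces() signature (older libva releases order those arguments differently), assumes the derived image is NV12, and the RGBA-to-NV12 routine is just a quick BT.601 approximation that ignores the vertical flip glReadPixels() gives. This copy is exactly what I would like to avoid.

#include <stdlib.h>
#include <GLES2/gl2.h>
#include <va/va.h>
#include <va/va_drm.h>

#define WIDTH  1280
#define HEIGHT 720

/* Rough integer BT.601 RGBA -> NV12 conversion on the CPU, just to make
 * the sketch complete; this is part of the cost I want to eliminate. */
static void rgba_to_nv12(const unsigned char *rgba, const VAImage *img,
                         unsigned char *dst)
{
    for (int y = 0; y < HEIGHT; y++) {
        for (int x = 0; x < WIDTH; x++) {
            const unsigned char *p = rgba + (y * WIDTH + x) * 4;
            dst[img->offsets[0] + y * img->pitches[0] + x] =
                (66 * p[0] + 129 * p[1] + 25 * p[2] + 128) / 256 + 16;
            if ((x & 1) == 0 && (y & 1) == 0) {
                unsigned char *uv = dst + img->offsets[1]
                                  + (y / 2) * img->pitches[1] + x;
                uv[0] = (-38 * p[0] - 74 * p[1] + 112 * p[2] + 128) / 256 + 128;
                uv[1] = (112 * p[0] - 94 * p[1] - 18 * p[2] + 128) / 256 + 128;
            }
        }
    }
}

void push_frame_to_vaapi(int drm_fd)
{
    /* Bring up VA-API on the same DRM device, no X needed.
     * (In a real program the init would of course happen once.) */
    VADisplay va_dpy = vaGetDisplayDRM(drm_fd);
    int major, minor;
    vaInitialize(va_dpy, &major, &minor);

    /* Read the rendered frame back from the current GL framebuffer/FBO. */
    unsigned char *rgba = malloc(WIDTH * HEIGHT * 4);
    glReadPixels(0, 0, WIDTH, HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE, rgba);

    /* Create a YUV420 surface for the encoder. */
    VASurfaceID surface;
    vaCreateSurfaces(va_dpy, VA_RT_FORMAT_YUV420, WIDTH, HEIGHT,
                     &surface, 1, NULL, 0);

    /* Map the surface and copy the converted pixels in. */
    VAImage image;
    unsigned char *dst;
    vaDeriveImage(va_dpy, surface, &image);
    vaMapBuffer(va_dpy, image.buf, (void **)&dst);
    rgba_to_nv12(rgba, &image, dst);
    vaUnmapBuffer(va_dpy, image.buf);
    vaDestroyImage(va_dpy, image.image_id);

    /* The surface would then go through the usual H.264 encode pipeline
     * (vaCreateContext / vaBeginPicture / vaRenderPicture / vaEndPicture). */
    free(rgba);
}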
I heard there might be an API in the staging branch needed to support this, which allows "creating a VA surface from an external buffer type".

What is this API called, and is it planned to be merged to trunk anytime soon?

Are there any plans to create an example test application that renders something in some flavor of OpenGL and runs it through an H.264 encode in libva?