<html>
<head>
<style><!--
.hmmessage P
{
margin:0px;
padding:0px
}
body.hmmessage
{
font-size: 12pt;
font-family:Calibri
}
--></style></head>
<body class='hmmessage'><div dir='ltr'>Hi,<div><br></div><div>In my device, I receive video streams from different sources (local or remote). After decoding them, I show them on the display using OpenGL. So far, all decoding was done in software: I received RGB frames from each source and uploaded them to dedicated textures for rendering. The sources are now decoded in hardware using gstreamer-vaapi. An example gst-launch line is as follows: gst-launch-1.0 filesrc location=/store/1.mp4 ! qtdemux ! vaapidecode ! vaapisink display=2</div><div>This works great. However, as you might imagine, vaapisink creates its own window and draws the decoded frames onto it. What I would like to do instead is pass the textures created in my application to the vaapidecode or vaapisink element, so that rendering happens on my own canvas. I have been digging through the vaapidecode and vaapisink sources to see where the textures are uploaded, but couldn't spot the exact line where I could feed in my texture info. Could anyone help me? A function name, a line number, or any hint would be greatly appreciated.</div><div><br></div><div>Thanks,</div><div><br></div> </div></body>
</html>