<html><head><meta http-equiv="Content-Type" content="text/html; charset=windows-1252"></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; -webkit-line-break: after-white-space;">Hi folks,<div><br></div>
<div>I’m building something based on GStreamer on the Raspberry Pi. I want to (a) display an overlay on top of the video feed and (b) split and possibly distort the incoming video for display on the Oculus Rift. I’ve been using eglglessink with GStreamer 1.2.3 so far, although I see that it seems to have been superseded recently by glimagesink — it’s a bit hard to keep up!</div><div><br></div>
<div>(a) seems quite doable: I’ve already built code that interacts with the dispmanx API to create display <a href="https://github.com/monsieurpod/raspifpv/blob/master/src/gstreamer_renderer.c">contexts for eglglessink</a> and <a href="https://github.com/monsieurpod/raspifpv/blob/master/src/egl_telemetry_renderer.c">my own overlay</a>, and it appears to work adequately.</div><div><br></div>
<div>I’m not sure how to go about (b), however, particularly given recent developments. It seems like glshader/glimagesink would be perfect for this (there’s even a <font color="#4787ff"><u>tutorial</u></font> on it!), but gst-plugins-bad 1.3.1’s source package isn’t in the Raspbian repository yet, and given that it’s only 8 days old, I wasn’t sure whether it would be fully operational on the RasPi anyway.</div><div><br></div>
<div>So: any chance of a pointer, to save me a bit of messing about? Should I start using glimagesink with a glshader? Is it likely to work with gst_video_overlay_set_window_handle (so I can provide my own graphical context) in a similar way to eglglessink, so I can still use an overlay?</div><div><br></div>
<div>Many thanks, and cheers for your ongoing work.</div>
<div>Michael</div></body></html>