Accelerated scaling on OMX RPi
Mart Raudsepp
leio at gentoo.org
Fri Apr 10 09:13:51 PDT 2015
On Fri, 2015-04-10 at 22:24 +1000, Jan Schmidt wrote:
> Hi,
> On 09/04/15 14:48, raphord wrote:
> > I've been searching for some time but can't figure this one out... I
> > have 8 threads running on a single Linux box generating 8 UDP streams
> > multicast on 8 different "channels". This much is working fine, but to
> > keep the CPU workload down I am using frame sizes of 960x540 @ 30
> > frames per second. I have a Raspberry Pi (not the absolute latest
> > gstreamer code, but from about 2 months ago, compiled on the Pi)
> > showing this stream using the following pipeline:
> >
> > gst-launch-1.0 udpsrc multicast-group=226.1.1.6 auto-multicast=true port=7001 \
> >     caps='application/x-rtp,encoding-name=H264' ! rtph264depay ! h264parse ! \
> >     omxh264dec ! glimagesink
> >
> > This works and shows the video fine with about 25% CPU utilization.
> > The screen resolution of the Pi is set to 1920x1080 and I want to
> > scale the image up to fill the screen. I can make it the right size
> > using videoscale, but I'm assuming this doesn't use the hardware,
> > because the CPU hits the maximum and complains of the computer being
> > too slow. What do I need to put into the pipeline to double the size
> > of the video in hardware? Alternatively, I don't mind hacking the code
> > to double the size if I can be directed where to put this... I don't
> > need a general solution, as the scaling will always be 2 times larger.
>
> You can use glcolorscale as a substitute for videoscale, which will do
> the scaling on the GPU. However, you're better off writing a small
> Python script or similar to embed the glimagesink window into a window
> you create yourself and making that window fullscreen - it will
> directly scale the output to fit the window, and avoid the extra
> texture copy you get from using glcolorscale.
>
> - Jan.
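For concreteness, the glcolorscale substitution applied to the original
pipeline might look something like this. This is an untested sketch: the
explicit glupload element and the exact GLMemory caps string are my
assumptions (glcolorscale operates on GL memory), so check what your
gst-gl/gst-omx build actually negotiates:

```shell
# Untested sketch: upscale to 1080p on the GPU with glcolorscale.
# glupload and the GLMemory capsfilter are assumptions, not verified
# against a real RPi setup.
gst-launch-1.0 udpsrc multicast-group=226.1.1.6 auto-multicast=true port=7001 \
    caps='application/x-rtp,encoding-name=H264' ! rtph264depay ! h264parse ! \
    omxh264dec ! glupload ! glcolorscale ! \
    'video/x-raw(memory:GLMemory),width=1920,height=1080' ! glimagesink
```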
That won't work with the dispmanx backend. Its
GstVideoOverlay::set_window_handle is a no-op, and the dispmanx plane it
creates is made to be the size of the video, centered on the screen.
A simple hack in the codebase can get it fullscreen, but to do it
properly we should at least make it implement set_render_rectangle,
acting relative to the whole screen for the time being (until
set_window_handle can take dispmanx plane handles, at least).
I'll need to work on that eventually, but it's on the back burner right
now. It would be nice if someone else picked it up.
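As an aside, the rectangle such a set_render_rectangle implementation
would be handed is cheap to compute on the caller's side. A stdlib-only
sketch (the helper name and behaviour are my own illustration, not any
GStreamer API):

```python
def fit_rectangle(video_w, video_h, screen_w, screen_h):
    """Scale a video to fill the screen while preserving aspect ratio,
    and centre it; returns (x, y, w, h) relative to the whole screen,
    matching the interim "relative to the whole screen" semantics
    suggested for set_render_rectangle above."""
    scale = min(screen_w / video_w, screen_h / video_h)
    out_w = round(video_w * scale)
    out_h = round(video_h * scale)
    return ((screen_w - out_w) // 2, (screen_h - out_h) // 2, out_w, out_h)

# The 960x540 stream from the original question doubles exactly to 1080p:
print(fit_rectangle(960, 540, 1920, 1080))  # -> (0, 0, 1920, 1080)
```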
Also, to my knowledge (and from experience adding the aforementioned
hack), glimagesink will scale itself when it is told to render at a
different size - it just isn't told to out of the box. glcolorscale
would perhaps be useful when you do something other than display on the
screen with glimagesink, e.g. feeding the video to omxh264enc to
re-encode at a different size. Though for the RPi there are also
unfinished patches floating around to make omxh264dec do any necessary
colorspace conversion and rescaling via the OMX resize component inside
omxh264dec (as there's no clear full tunneling integration approach
mapped out yet), which would avoid glcolorscale for those use cases too.
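A re-encode pipeline along those lines might look like the following.
Again an untested sketch: glupload/gldownload, the caps strings, the
half-size target, and the output addressing are all my assumptions, and
omxh264enc availability depends on your gst-omx build:

```shell
# Untested sketch: decode, scale on the GPU, re-encode at half size,
# and send it back out over RTP/UDP. Element choices past omxh264dec
# are assumptions, not a verified RPi pipeline.
gst-launch-1.0 udpsrc multicast-group=226.1.1.6 auto-multicast=true port=7001 \
    caps='application/x-rtp,encoding-name=H264' ! rtph264depay ! h264parse ! \
    omxh264dec ! glupload ! glcolorscale ! \
    'video/x-raw(memory:GLMemory),width=480,height=270' ! gldownload ! \
    omxh264enc ! h264parse ! rtph264pay ! udpsink host=226.1.1.7 port=7002
```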
These questions about the current status on the RPi keep coming up, so
I'm likely to bring my blog back from the dead over the weekend and
write a good summary of these and related GStreamer RPi things soon (it
should end up on the GNOME and Gentoo planets at that point).
Mart
More information about the gstreamer-devel
mailing list