[RFC] drm: add overlays as first class KMS objects

Rob Clark robdclark at gmail.com
Thu Apr 28 09:32:52 PDT 2011


On Tue, Apr 26, 2011 at 5:01 AM, Alan Cox <alan at lxorguk.ukuu.org.uk> wrote:
>> A lot of older hardware had one overlay that could be sourced to any
>> crtc, just not simultaneously.  The tricky part is the formats and
>> capabilities: alpha blending, color/chromakey, gamma correction, etc.
>> Even the current crtc gamma stuff is somewhat lacking in terms of
>> what hardware is capable of (PWL vs. LUT, user defined conversion
>> matrices, gamut remapping, etc.).
>
> Rather than re-inventing enough wheels to run a large truck, would it not
> make sense to make hardware-sourced overlays Video4Linux objects in their
> entirety, so you can just say "attach V4L object A as overlay B"?


One thing I'd like to be able to do is use a CRTC as either an overlay
or a primary framebuffer..

For example, on omap4 with one display plugged in, we have 3 pipes
that can be used as overlays.  If a second display is plugged in, one
of those overlay pipes gets repurposed as the primary gfx layer on the
2nd display.
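
As a purely illustrative sketch of what that could look like from
userspace, assuming overlays become enumerable plane-style objects
that can be attached to whichever CRTC needs them (the function and
type names here are made up for illustration, not a finished proposal):

#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* Illustrative sketch: attach the first idle plane to 'crtc_id',
 * scanning out 'fb_id' at full size.  Assumes a plane-object API
 * along these (hypothetical) lines. */
static int attach_overlay(int fd, uint32_t crtc_id, uint32_t fb_id,
                          uint32_t w, uint32_t h)
{
    drmModePlaneRes *res = drmModeGetPlaneResources(fd);
    int ret = -1;

    if (!res)
        return -1;

    for (uint32_t i = 0; i < res->count_planes && ret != 0; i++) {
        drmModePlane *p = drmModeGetPlane(fd, res->planes[i]);
        if (!p)
            continue;
        if (p->fb_id == 0)   /* idle: not already scanning out */
            /* src_* coords are 16.16 fixed point */
            ret = drmModeSetPlane(fd, p->plane_id, crtc_id, fb_id, 0,
                                  0, 0, w, h,
                                  0, 0, w << 16, h << 16);
        drmModeFreePlane(p);
    }

    drmModeFreePlaneResources(res);
    return ret;
}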

The ideal situation would let us share buffers with x11 via dri2
(although we need more than just front/back buffers.. for video
playback you might have > 16 buffers).  The xorg driver could then
decide whether to use an overlay or a GPU blit, depending on which hw
resources are not already in use.
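
The policy in the xorg driver could be as dumb as checking whether any
pipe is still free for the buffer's format.. something like this
sketch (all the types here are made up for illustration, nothing real):

#include <stdbool.h>
#include <stdint.h>

#define NUM_PIPES 3   /* omap4: three pipes usable as overlays */

/* Illustrative only -- these types don't exist in any real driver. */
struct pipe_state {
    uint32_t formats;   /* bitmask of scanout formats the pipe supports */
    bool     in_use;    /* already claimed (gfx layer on 2nd display, etc) */
};

/* Return a free pipe that can scan out 'format' directly, or -1 to
 * tell the caller to fall back to a GPU blit. */
static int pick_overlay_pipe(const struct pipe_state pipes[NUM_PIPES],
                             uint32_t format)
{
    for (int i = 0; i < NUM_PIPES; i++)
        if (!pipes[i].in_use && (pipes[i].formats & format))
            return i;
    return -1;   /* no free pipe: GPU blit path */
}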

So I'm not completely sure about the v4l2/drm coexistence for
overlays.  (Although for capture devices, it makes 100% sense to have
a way of sharing GEM buffers w/ v4l2.)  On the other hand, flipping
video buffers is kind of (if you squint slightly) just like flipping
buffers in a 3d gl app.
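
i.e. if the overlay ends up addressable like a crtc, queueing the next
decoded frame could look just like the pageflip path a gl app already
hits, only with a deeper buffer queue.. a sketch, assuming the overlay
is exposed as a crtc-like object that drmModePageFlip() will take:

#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

#define NUM_BUFS 16   /* video wants more than front/back */

/* Sketch: flip the overlay to the next decoded video buffer, the same
 * way a 3d app flips its back buffer.  Assumes the overlay shows up
 * as a crtc-like object that drmModePageFlip() accepts. */
static int flip_next_frame(int fd, uint32_t crtc_id,
                           const uint32_t fb_ids[NUM_BUFS], int *next)
{
    int ret = drmModePageFlip(fd, crtc_id, fb_ids[*next],
                              DRM_MODE_PAGE_FLIP_EVENT, NULL);
    if (ret == 0)
        *next = (*next + 1) % NUM_BUFS;
    return ret;
}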

BR,
-R

> That would provide format definitions, provide control interfaces for
> the surface (e.g. for overlays of cameras such as on some of the Intel
> embedded and non-PC devices), and give you an existing, well-understood API.
>
> For some hardware you are going to need this integration anyway, so that
> you can do things like move a GEM object that is currently a DMA target
> of a capture device (as well as fence it).
>
> For a software surface you could either expose it as a V4L object that
> is GEM- or fb-memory-backed, or at least use the same descriptions, so
> that the kernel has a consistent set of format descriptions and we don't
> have user libraries doing ad-hoc format translation crap.
>
> A lot of capture hardware would map very nicely onto GEM objects, I
> suspect, and if you want to merge live video into Wayland it seems a
> logical path?
>
> Alan
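
For the GEM-backed capture case Alan describes, the closest thing that
works today is probably handing v4l2 a userspace mapping of the bo via
USERPTR.. a minimal sketch (the caller is assumed to have mmap'd the
GEM bo through some driver-specific path; proper sharing would want
something first-class, plus fencing):

#include <stdint.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Sketch: queue a GEM-backed buffer as a v4l2 capture target using
 * V4L2_MEMORY_USERPTR.  'bo_ptr'/'bo_size' come from a driver-specific
 * mmap of the GEM bo -- real buffer sharing (and fencing) would need
 * first-class support instead of this. */
static int queue_gem_bo(int v4l2_fd, uint32_t index, void *bo_ptr,
                        size_t bo_size)
{
    struct v4l2_buffer buf;

    memset(&buf, 0, sizeof(buf));
    buf.type      = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf.memory    = V4L2_MEMORY_USERPTR;
    buf.index     = index;
    buf.m.userptr = (unsigned long)bo_ptr;
    buf.length    = bo_size;

    return ioctl(v4l2_fd, VIDIOC_QBUF, &buf);
}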

