[RFC v2] surface crop & scale protocol extension
ppaalanen at gmail.com
Sun Nov 10 01:02:09 PST 2013
On Fri, 8 Nov 2013 23:00:28 -0200
Rafael Antognolli <rafael.antognolli at intel.com> wrote:
> On Fri, Nov 08, 2013 at 10:59:07AM -0800, Bill Spitzak wrote:
> > Pekka Paalanen wrote:
> > >Hi all,
> > >
> > >this is the v2 of the crop and scale extension, as RFC.
> > I get the impression that the result of crop+scale is supposed to be exactly
> > the same as though the client made a second buffer of the scale size, scaled
> > the crop region from the first buffer to this second buffer, then attached
> > it with the normal wayland mechanism. Correct?
> From what I understood, the visual result might be the same, but that
> is not what should happen inside the renderer.
> > So that compositors are allowed to only keep the cropped area of a buffer,
> > there will have to be limitations on changing the crop+scale without also
> > doing another attach. Maybe it does not work unless there is an attach in
> > the same commit, or you might even require the attach to be after the
> > crop+scale and before the commit.
> IMHO the compositor would keep the entire buffer, just like it already
> does. So when a buffer is attached to a surface, in the case of the GL
> renderer, it would be uploaded entirely as a texture, and only at the
> end, when the texture is rendered to the screen, would just the
> cropped area be presented (scaled, if that's the case). This would
> allow changing the crop & scale parameters without the need for a new
> attach.
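The render-time cropping described above can be sketched as mapping the
source rectangle (in buffer pixels) to normalized texture coordinates,
while the whole buffer stays resident as one texture. The helper below is
purely illustrative; the names and types are not from the actual Weston
code.

```c
#include <assert.h>

/* Sketch: a GL-based compositor keeps the whole buffer as one texture
 * and crops only at draw time, by converting the source rectangle
 * (given in buffer pixels) into normalized texture coordinates.
 * Hypothetical helper, not the real renderer API. */

struct tex_coords {
	float s0, t0;	/* top-left of the crop, normalized */
	float s1, t1;	/* bottom-right of the crop, normalized */
};

static struct tex_coords
crop_to_tex_coords(int buf_width, int buf_height,
		   int src_x, int src_y, int src_w, int src_h)
{
	struct tex_coords tc;

	tc.s0 = (float)src_x / buf_width;
	tc.t0 = (float)src_y / buf_height;
	tc.s1 = (float)(src_x + src_w) / buf_width;
	tc.t1 = (float)(src_y + src_h) / buf_height;

	return tc;
}
```

Changing the crop then only means recomputing these coordinates for the
next repaint; the texture itself is untouched, so no re-attach or
re-upload is needed.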
Yes, that is my opinion about it, too.
The crop & scale extension is usually used with hardware
(EGL-based) wl_buffers, so there is no copy to begin with.
> Unless I'm wrong regarding the "buffer being entirely uploaded to the
> compositor", but that's how I was implementing it.
> > The big problem I see with this api is that if buffer_scale is not one, the
> > client is unable to specify the crop or scale rectangles in pixels. However
> > this is a general problem with buffer_scale everywhere. (actually you seem
> > to be using fixed, not integers, so it is possible if buffer_scale is a
> > power of 2). I would change the crop to be in pixels. The scale rectangle
> > requires fixing or replacing the buffer scale mechanism.
Clients will always specify surface content in blocks of
buffer_scale x buffer_scale pixels. That is how it was before, and
that is how the crop & scale extension uses it.
In other words, everything is still in surface coordinate units,
just like before.
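Concretely, with buffer_scale = 2 a rectangle given in surface
coordinate units covers twice as many buffer pixels in each dimension.
A minimal sketch of that mapping, using a hypothetical helper and plain
ints instead of the protocol's wl_fixed_t (24.8 fixed point) for
brevity:

```c
#include <assert.h>

/* Sketch of the surface-to-buffer coordinate mapping: everything the
 * client specifies stays in surface units, and the compositor scales
 * by buffer_scale to reach buffer pixels.  Illustrative only; real
 * protocol values are wl_fixed_t, not int. */

struct rect {
	int x, y, w, h;
};

static struct rect
surface_to_buffer_rect(struct rect src, int buffer_scale)
{
	struct rect out;

	out.x = src.x * buffer_scale;
	out.y = src.y * buffer_scale;
	out.w = src.w * buffer_scale;
	out.h = src.h * buffer_scale;

	return out;
}
```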
> > You need to define what happens if the crop region extends outside the
> > buffer. I think the edge pixels will have to be replicated to fill, since
> > using transparency here will turn an opaque buffer into one with
> > transparency, and produce an anti-aliased transparent edge, both of which
> > may defeat the ability to use hardware scaling.
> It's defined as "undefined/dirty" pixels, isn't it? I think that's good
> enough for now, at least.
Yes. The compositor is free to do whatever is easiest. A software
compositor might simply ignore the part that is outside of the
buffer, and a GL renderer may simply not care and produce whatever
the GL texture repeat mode happens to yield. Even "hall-of-mirrors"
rendering artifacts (a type of Doom rendering glitch) are allowed.
It is the client's fault for not providing proper content.
Not specifying anything in particular for the undefined pixels allows
compositors to use the overlay hardware in a simple way, without
jumping through hoops to realize e.g. a black fill, which would be
part of the surface but likely not realizable with the one hw
overlay unit.
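As a sketch of the "software compositing might just ignore the part
outside the buffer" option, a compositor could clamp the source
rectangle to the buffer bounds before sampling. This is a hypothetical
helper, not taken from any compositor:

```c
#include <assert.h>

/* Sketch: clamp a source rectangle to the buffer bounds so a software
 * renderer never reads outside the buffer; pixels the client placed
 * outside are simply dropped.  Purely illustrative. */

struct src_rect {
	int x, y, w, h;
};

static struct src_rect
clamp_src_to_buffer(struct src_rect src, int buf_w, int buf_h)
{
	struct src_rect out = src;

	if (out.x < 0) { out.w += out.x; out.x = 0; }
	if (out.y < 0) { out.h += out.y; out.y = 0; }
	if (out.x + out.w > buf_w)
		out.w = buf_w - out.x;
	if (out.y + out.h > buf_h)
		out.h = buf_h - out.y;
	if (out.w < 0) out.w = 0;
	if (out.h < 0) out.h = 0;

	return out;
}
```

A GL renderer gets a similar effect for free from a texture wrap mode
such as GL_CLAMP_TO_EDGE, which is one reason leaving the out-of-buffer
pixels undefined costs the compositor nothing.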
> > I think there may also have to be rules about whether filters are allowed to
> > sample outside the crop region (there are things other than box filters, you
> > know). For security reasons this must not be allowed outside the actual
> > buffer, so that adjacent memory does not leak into the displayed image, but
> > it could be left undefined for pixels between the crop and the buffer edge.
I think it is implied that it is ok to sample outside of the source
rectangle, as long as the samples stay inside the buffer. The Wayland
protocol always assumes that buffers are complete and fully filled.
More information about the wayland-devel mailing list