[RFC v2] surface crop & scale protocol extension
ppaalanen at gmail.com
Thu Nov 14 00:29:52 PST 2013
On Wed, 13 Nov 2013 11:30:19 +0100
Alexander Larsson <alexl at redhat.com> wrote:
> On ons, 2013-11-13 at 11:54 +0200, Pekka Paalanen wrote:
> > On Tue, 12 Nov 2013 16:14:36 -0800
> > Bill Spitzak <spitzak at gmail.com> wrote:
> > > Pekka Paalanen wrote:
> > >
> > > >> The source rectangle *must* be in buffer pixels. Putting it in
> > > >> buffer_scale units makes absolutely no sense, as the buffer_scale is in
> > > >> effect ignored for this surface, overridden entirely by the scaling.
> > > >
> > > > That means that dst_width and dst_height will be required to be in
> > > > multiples of buffer_scale.
> > >
> > > I am rather confused about this and possibly misunderstanding what
> > > Wayland is doing. Here is an example of what I think the current design is:
> > >
> > > Imagine there is an output_scale of 3 (chosen to not be a power of 2). A
> > > client is aware of this and wishes to draw a full-resolution display
> > > that is 2000x2000 pixels and make a subwindow that scales a 512x512
> > > picture to fill a 1000x1000 pixel square.
> > Is your whole issue based on the premise that the output resolution is
> > not a multiple of output_scale?
> > Just like you deduced, it won't work. It not working has nothing to do
> > with crop & scale state.
> > Alexander, did you have any ideas for the case when someone sets
> > output_scale so that the output resolution is not divisible by it?

> The output scale is a hint to the client about what kind of pre-scaling
> it should apply to the buffer to best match the real output. There is no
> real guarantee that this is an exact match, because the compositor may
> be projecting it on a 3d-rotated spherical monitor or whatever. It is
> *commonly* the case that we have a regular rectangular output where the
> output scale is an integer divisor of the real output resolution such
> that the buffer can be used unscaled in the scanout framebuffer.
> E.g. a 2560x1600 native display is typically run with an
> output scale of 2, which gives a maximized window a surface coordinate
> space of 1280x800.
> However, a compositor is also free to do something else, for instance
> like OS X it can allow using a 1440x900 global coordinate space, while
> saying the output scale is 2. This would result in the client sending
> 2880x1800 buffers for fullscreen windows that the compositor will have
> to downscale (by a non-integer scaling factor) to draw to the output.
Right, but I am specifically thinking about the case where directly
scanning out a client is (or should be) possible. The issue is the
following:
Assume an output resolution of 2000x2000, with output_scale=3. That means
that there is no surface size in integers that would result in exactly
2000x2000 output area. You'd either have surface size 666x666, which
becomes output 1998x1998, or 667x667 which becomes 2001x2001.
We require that the buffer size for a surface is a multiple of
buffer_scale. If buffer_scale is set to 3 as intended, the client
cannot create a legal buffer that would be the size of the output in
pixels.
In fact, regardless of what buffer_scale the client chooses, it cannot
cover the output exactly with the surface, since we have the surface
coordinate system in between and assume that surface sizes are always
integers.
Originally I had written more of my flow of thoughts here, but I need to
think about this some more; is it actually a problem, or is it possible
for a compositor to just do what is intended, rather than following the
coordinate specifications to the letter and nitpicking about a pixel or
two, when a client actually uses that 2000x2000 buffer, resulting in
a fractional surface size?