[RFC v2] surface crop & scale protocol extension
Bill Spitzak
spitzak at gmail.com
Wed Nov 13 12:26:41 PST 2013
Pekka Paalanen wrote:
> Is your whole issue based on the premise, that the output resolution is
> not a multiple of output_scale?
I think there is some serious confusion here. Everything here is a
multiple of everything else.
My client is attempting to obey the output_scale on the assumption that
the advertised output_scale will produce the best image. It sees an
output_scale of 3 and assumes the reason is that the pixels on that
output are 3x smaller in each direction (1/9 the area) than "standard"
pixels.
The client says "If I want to produce the best high-resolution image, I
need to use a buffer that is 3x larger in each direction and a
buffer_scale of 3 (i.e. each buffer pixel is 1/3 of a surface unit)".
Then the client says "I want to use a subsurface and the crop+scale api
to blow this 512x512 image up to cover exactly 1024x1024 pixels in my
main buffer" (on the assumption that this is also 1024x1024 pixels on
the output).
However, instead of sending a 1024x1024 square to the compositor for the
dst area, it has to send a 341.3333333 square (1024/3) using fixed-point
coordinates. This requires everybody to agree on rounding rules, and is
misleading because crop+scale will only work for integer numbers of
pixels.
It also has to set the buffer_scale of the subsurface to 1; otherwise it
is impossible to specify a 512x512 source rectangle, because 512/3 is
not exactly representable in fixed point.
Please explain if I have gotten this analysis wrong.
My recommendation: buffer_scale is ignored for the crop rectangle (which
should then be in the same space as the buffer width and height values,
which I think means "after" the buffer_transform, not before it). That
allows buffer_scale to be applied to the dst rectangle. The client would
then set the buffer_scale of the subsurface to 3 (i.e. 1/3 in the
buffer-to-surface direction) and can set the dst rectangle to 1024.
More information about the wayland-devel mailing list