[PATCH 2/2] protocol: Support scaled outputs and surfaces

Pekka Paalanen ppaalanen at gmail.com
Tue May 21 10:57:15 PDT 2013


On Tue, 21 May 2013 08:35:53 -0700
Bill Spitzak <spitzak at gmail.com> wrote:

> On 05/20/2013 11:46 PM, Pekka Paalanen wrote:
> 
> >> Let's say the output is 10,000dpi and the compositor has set its scale
> >> to 100. Can a client make a buffer that is 10,050 pixels wide appear 1:1
> >> on the pixels of this output? It looks to me like only multiples of 100
> >> are possible.
> >
> > As far as I understand, that is correct.
> >
> > But it does not matter. You cannot employ any widgets or widget parts
> > that would need a finer resolution than 100 px steps, because a) the
> > user cannot clearly see them, and b) the user cannot clearly poke them
> > with e.g. a pointer, since they are so small. So there is no need to
> > have window size in finer resolution either. Even a resize handle in a
> > window border would have to be at least 300 pixels thick to be usable.
> 
> This proposal does not actually restrict widget positions or line sizes, 
> since they are drawn by the client at buffer resolution. Although 

No, but I expect the toolkits may.

> annoying, the outside buffer size is not that limiting. The client can 
> just place a few transparent pixels along the edge to make it look like 
> it is any size.
> 
> However it does restrict the positions of widgets that use subsurfaces.
> 
> I see this as a serious problem and I'm not sure why you don't think it 
> is. It is an arbitrary artificial limit in the api that has nothing to 
> do with any hardware limits.

It is a design decision with the least negative impact, and it is
not serious. Sub-surfaces will not be that common, and they
certainly will not be used for common widgets like buttons.

> The reason you want to position widgets at finer positions is so they 
> can be positioned evenly, and so they can be moved smoothly, and so they 
> can be perfectly aligned with hi-resolution graphics.

But why? Do you have a real, compelling use case? Otherwise it
just complicates things.

Remember, sub-surfaces are not supposed to be just any widgets.
They are video and openGL canvases, and such.
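
For illustration, a rough C sketch of that intended use; only the
protocol requests are real, the function name, position and sync mode
are made up:

#include <wayland-client.h>

/* Illustrative sketch: embed a video canvas in a toplevel window as a
 * sub-surface.  Assumes 'compositor' and 'subcompositor' were already
 * bound from the registry and 'main_surface' is the toplevel
 * wl_surface. */
static struct wl_subsurface *
embed_video_canvas(struct wl_compositor *compositor,
                   struct wl_subcompositor *subcompositor,
                   struct wl_surface *main_surface,
                   struct wl_surface **video_surface_out)
{
        struct wl_surface *video_surface =
                wl_compositor_create_surface(compositor);
        struct wl_subsurface *sub =
                wl_subcompositor_get_subsurface(subcompositor, video_surface,
                                                main_surface);

        /* The position is in the parent's surface coordinates, as
         * integers. */
        wl_subsurface_set_position(sub, 32, 64);

        /* Let the video repaint independently of the parent. */
        wl_subsurface_set_desync(sub);

        *video_surface_out = video_surface;
        return sub;
}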

> > How can you say that? Where did you get the specification of how scaler
> > interacts with buffer_scale? We didn't write any yet.
> 
> It is pretty obvious that if the parent has a scale and the child has 
> one, these scales are multiplied to get the transform from the child to 
> the parent's parent.

A what? No way, buffer_scale is private to a surface, and does not
affect any other surface, not even sub-surfaces. It is not
inherited, that would be insane.

The same goes for the scaler proposal: it is private to a surface,
and not inherited. Both affect the contents, not the surface.
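
To make that concrete, a minimal sketch assuming the
wl_surface.set_buffer_scale request from this series; the function
name is made up:

#include <wayland-client.h>

/* Minimal sketch: buffer_scale is per-surface state.  Setting it on
 * the parent says nothing about the sub-surface's buffer, and vice
 * versa. */
static void
set_independent_scales(struct wl_surface *parent_surface,
                       struct wl_surface *sub_surface)
{
        /* The parent draws its buffer at 2x density. */
        wl_surface_set_buffer_scale(parent_surface, 2);

        /* The sub-surface (say, a video) can stay at scale 1;
         * nothing is inherited. */
        wl_surface_set_buffer_scale(sub_surface, 1);

        /* Each scale takes effect on that surface's next commit. */
        wl_surface_commit(sub_surface);
        wl_surface_commit(parent_surface);
}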

> It is true that the resulting scale if the hi-dpi and scaler are applied 
> to the *SAME* surface is not yet written.
> 
> > And what is this talk about parent surfaces?
> 
> The subsurfaces have a parent. For main surfaces the parent is the 
> compositor coordinate space.

There is no "compositor coordinate space" in the protocol. There
are only surface coordinates, and now to a small extent we are
getting buffer coordinates.

Still, this parent reference made no sense in the context where you used it.

> >> The input rectangle is in the same direction as the output rectangle
> >> even if the buffer is rotated 90 degrees by the buffer_transform.
> 
> Yes exactly. Thus it is a different space than the buffer pixels, as 
> there may be a 90 degree rotation / reflections, and translation to put 
> the origin in different corners of the buffer.

Glad to see you agree with yourself.

> > How could you ever arrive to non-integer buffer sizes in the earlier
> > proposal?
> 
> If the scale is 3/2 then specifying the surface size as 33 means the 
> buffer is 49.5 pixels wide. I guess this is a protocol error? Still 
> seems really strange to design the api so this is possible at all.

We have one scale factor, which is an integer. How can you come up with 3/2?

Even if you bring the scaler extension into play, it will only
produce integers, no matter at which point of the coordinate
transformations it is applied.
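
In other words, a rough sketch of the size relation; the struct and
function names are made up for illustration:

#include <stdint.h>

struct size {
        int32_t width;
        int32_t height;
};

/* The buffer is exactly 'buffer_scale' times the surface size, so the
 * buffer dimensions are always integers. */
static struct size
buffer_size_for(struct size surface_size, int32_t buffer_scale)
{
        struct size buf;

        /* e.g. a 33x20 surface at scale 2 needs a 66x40 buffer; there
         * is no way to end up with something like 49.5. */
        buf.width = surface_size.width * buffer_scale;
        buf.height = surface_size.height * buffer_scale;
        return buf;
}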

> > Aligning sub-surfaces is still possible if anyone cares about that, one
> > just has to take the scale into account. That's a drawing problem. If
> > you had a scale 1 output and buffers, you cannot align to fractional
> > pixels, anyway.
> 
> If there is a scale of 2 you cannot align to the odd pixels. And a
> scale of 3/2 means you *can* align to fractional pixels.
> 
> > Why would pre-compositing not be possible in some cases?
> 
> Because it would require rendering a fractional-pixel aligned version of 
> the subsurface and compositing that with the parent. This may make 
> unwanted graphics leak through the anti-aliased edge. The most obvious 
> example is if there are two subsurfaces and you try to make their edges 
> touch.

Umm, but since sub-surface positions and sizes are always integers
in the surface coordinate system, the edges will always align
perfectly, regardless of the individual buffer_scales.
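
A small sketch to illustrate; the handles and the width constant are
made up:

#include <wayland-client.h>

/* The left sub-surface is LEFT_WIDTH units wide in the parent's
 * surface coordinates, so the right one starts exactly where it ends,
 * whatever buffer_scale each of them uses for its own contents. */
#define LEFT_WIDTH 256

static void
abut_subsurfaces(struct wl_subsurface *left, struct wl_subsurface *right)
{
        wl_subsurface_set_position(left, 0, 0);
        wl_subsurface_set_position(right, LEFT_WIDTH, 0);
}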

> However both proposals have this problem if pre-compositing is not done, 
> and most practical shells I can figure out can't do pre-compositing 
> because that requires another buffer for every parent, so maybe this is 
> not a big deal.
> 
> > Urgh, so you specify input region in one coordinate system, and then
> > get events in a different coordinate system? Utter madness.
> >
> > Let's keep everything in the surface coordinates (including client
> > toolkit widget layout, AFAIU), except client rendering which needs to
> > happen in buffer coordinates, obviously.
> 
> Sounds like you have no problem with two coordinate spaces. I don't see 

Correct, it is the cleanest solution.

> any reason the size of windows and the positions of graphics should not 
> be done in the same coordinates drawings are done in.

We already went through that. The buffer coordinate system changes
dynamically, and it would expose all these fractional pixel
position/size issues you are so concerned about.
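
To spell that out with a sketch (made-up sizes, and assuming the
set_buffer_scale request from this series): everything the client
states about the surface, such as damage, stays in surface
coordinates; only the attached buffer itself is at buffer resolution.

#include <wayland-client.h>

/* A 400x300 surface backed by an 800x600 buffer. */
static void
commit_hidpi_frame(struct wl_surface *surface, struct wl_buffer *buffer_2x)
{
        wl_surface_set_buffer_scale(surface, 2);
        wl_surface_attach(surface, buffer_2x, 0, 0);

        /* Damage is expressed in surface coordinates, not in buffer
         * pixels. */
        wl_surface_damage(surface, 0, 0, 400, 300);
        wl_surface_commit(surface);
}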

> >>> The x,y do not
> >>> describe how the surface moves, they describe how pixel rows and
> >>> columns are added or removed on the edges.
> >
> > No, it is in the surface coordinate system, like written in the patch.
> 
> Then I would not describe it as "pixel rows and columns added or removed 
> on the edges". If the scaler is set to 70/50 than a delta of -1,0 is 
> adding 1.4 pixels to the left edge of the buffer. I agree that having it 
> in the parent coordinates works otherwise.

We use the units of "pixels" in the surface coordinate system, even
if they do not correspond exactly to any "real" pixels like
elements in a buffer or on screen.

- pq

