[PATCH] compositor: implement a stack of surface transformations
Pekka Paalanen
ppaalanen at gmail.com
Mon Jan 9 00:43:47 PST 2012
On Fri, 06 Jan 2012 10:55:22 -0800
Bill Spitzak <spitzak at gmail.com> wrote:
> This is why I would prefer that the clients do full screen by finding
> out the size of the monitor and directly setting the transform to how
> they fill the screen. Then the client knows exactly what transform is
> happening, and can concatenate any other transforms in its own code.
Clients do not know their absolute position, so they do not know which
output to pick when maximising. If you add a request for finding out
the output first, you add a roundtrip and a race.
If you add protocol to inform clients about their absolute position,
you add races. If you let clients solely mandate their position, you
cannot move frozen clients, and you add races against e.g. output
hotplugging. If clients have any knowledge of the display coordinate
system, you are forced to relay every possible compositing
transformation to the client so that clients can make any sense of it,
which is not feasible.
I guess this has been explained many times, that is just my
understanding of it.
That still doesn't exclude people inventing protocol extensions that
expose display coordinate systems for their special cases.
> Clients are going to need to know the actual transform anyway. They
> may very well be controlling hardware that is outside the
> compositor's control, such as other specialized pointing devices or
> projectors.
I would leave that case for a special compositor interface, perhaps
even for special compositors, or for a compositor plugin framework
that adds a "driver" for it.
> The compositor must limit the "fixing" of misbehaving windows to
> clipping and/or padding with solid rectangles, rather than altering
> the client's requested transform.
For maximised, we probably won't do anything. As you know, attaching a
buffer means resizing the surface. Misbehaving is misbehaving, no
matter what we do. The default semantic is pixel-for-pixel in a normal
state.
For fullscreen, the semantics are to cover a "full screen" or an
output, one way or another. We already had that conversation on IRC.
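(As a rough illustration of "one way or another": one option is a
scale-and-center transform that fits the surface to the output. The
sketch below is purely illustrative, every name in it is made up, it is
not Weston code.)

#include <stdint.h>

struct example_fullscreen_params {
        float scale;
        float tx, ty;           /* translation in global coordinates */
};

/* Fit a surf_w x surf_h surface into the output rectangle, preserving
 * aspect ratio and centering it (letterbox/pillarbox). */
static void
example_fullscreen_fit(int32_t surf_w, int32_t surf_h,
                       int32_t out_x, int32_t out_y,
                       int32_t out_w, int32_t out_h,
                       struct example_fullscreen_params *p)
{
        float sx = (float)out_w / surf_w;
        float sy = (float)out_h / surf_h;

        p->scale = sx < sy ? sx : sy;
        p->tx = out_x + (out_w - surf_w * p->scale) / 2.0f;
        p->ty = out_y + (out_h - surf_h * p->scale) / 2.0f;
}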
> As always, I very strongly believe the client *must* have final say
> over the position and contents of the windows, including the borders.
> This is the only way to avoid the latency problems that make X work
> so badly.
Content and decorations, yes. Position, no, according to the current
design. I don't see any roundtrips or latency problems in the current
design ideas for any of this.
> Pekka Paalanen wrote:
> > Having at most one transformation object attached to a surface is
> > not enough anymore. If we have a surface that needs to be scaled to
> > fullscreen, and then we have the zoom animation, we already need two
> > transformations combined.
>
> > Note: surface drawing should honour all kinds of transformations,
> > but neither the damage region code nor the input event translation
> > code takes transformations into account, AFAICT. Therefore anything
> > but translation will probably behave badly until they are fixed.
>
> The dirty bit is probably useful for deferring the inverted matrix
> calculation that is necessary for these.
>
> You will need to find the determinant even without any input/damage
> events, because non-invertible matrices should cause the surface to
> be unmapped (since at least one axis is scaled by zero).
Maybe, or maybe we pseudo-invert, or return just zeros as surface
coordinates. So far there is no way to even assign a non-invertible
transformation.
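To illustrate the deferred-inverse idea for the simple 2D affine case
(the struct and function names here are made up for this sketch, this
is not Weston's actual matrix code):

#include <math.h>
#include <stdbool.h>

struct example_transform {
        /* | a c tx |
         * | b d ty |  maps surface (x, y) to global (x', y'):
         * x' = a*x + c*y + tx,  y' = b*x + d*y + ty */
        float a, b, c, d, tx, ty;
};

/* Compute the inverse used for mapping input events back to
 * surface-local coordinates; returns false if the transform is not
 * invertible, i.e. at least one axis is scaled to zero. */
static bool
example_transform_invert(const struct example_transform *t,
                         struct example_transform *inv)
{
        float det = t->a * t->d - t->b * t->c;

        if (fabsf(det) < 1e-6f)
                return false;

        inv->a =  t->d / det;
        inv->b = -t->b / det;
        inv->c = -t->c / det;
        inv->d =  t->a / det;
        inv->tx = -(inv->a * t->tx + inv->c * t->ty);
        inv->ty = -(inv->b * t->tx + inv->d * t->ty);
        return true;
}

With a dirty bit on the surface this would only be recomputed when the
transform stack actually changes; on failure the compositor could treat
the surface as unmapped, or clamp the surface coordinates to zero, as
discussed above.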
Note that even 3D projective transformations are just a subset.
Nothing should prevent arbitrary non-linear transformations in the
compositor, if they make any sense. How about projection onto a sphere,
like in a planetarium? If someone wants to implement that in their
compositor, we should not make it impossible, and keeping that
possibility open in the core protocol is not a hindrance.
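For reference, the "stack of surface transformations" in the subject
boils down to roughly the sketch below: a list of 4x4 matrices that are
multiplied into one total transform whenever the stack changes. The
struct wl_list helpers are the real ones from wayland-util.h; every
other name is made up for the example, this is not the patch itself.

#include <wayland-util.h>

struct example_matrix {
        float d[16];                    /* column-major 4x4 */
};

struct example_transform {
        struct example_matrix matrix;
        struct wl_list link;            /* link in the surface's stack */
};

static void
example_matrix_init(struct example_matrix *m)
{
        static const struct example_matrix identity = {
                { 1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  0, 0, 0, 1 }
        };
        *m = identity;
}

/* m = n * m, i.e. apply n after whatever m already does */
static void
example_matrix_multiply(struct example_matrix *m,
                        const struct example_matrix *n)
{
        struct example_matrix tmp;
        int row, col, k;

        for (col = 0; col < 4; col++) {
                for (row = 0; row < 4; row++) {
                        float v = 0.0f;
                        for (k = 0; k < 4; k++)
                                v += n->d[k * 4 + row] * m->d[col * 4 + k];
                        tmp.d[col * 4 + row] = v;
                }
        }
        *m = tmp;
}

/* Collapse the whole stack; entries later in the list are applied
 * after earlier ones. */
static void
example_build_total_transform(struct wl_list *transform_list,
                              struct example_matrix *total)
{
        struct example_transform *tr;

        example_matrix_init(total);
        wl_list_for_each(tr, transform_list, link)
                example_matrix_multiply(total, &tr->matrix);
}

The list would start empty (wl_list_init()), and e.g. the fullscreen
scaling and the zoom animation would each insert one entry, so the two
cases quoted above combine without special-casing either.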
> I was under the impression that Wayland had no such thing as "damage
> regions". The surface belongs to the client, so it does not seem like
> there is any way for it to be "damaged" unless the client does it
> itself?
Damage regions are a Weston-internal thing for optimizing the composite
rendering. All that clients can do about it is to say "in this new
buffer, this is the rectangle that actually changed" (see
wl_buffer::damage in the protocol), and that is in the surface-local
coordinate system. Weston's internal damage tracking is mostly in the
absolute display coordinate system, AFAIU.
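To make the two coordinate systems concrete, here is a sketch of
turning a client's surface-local damage rectangle into global damage,
assuming a purely translational surface transform (which, per the note
above, is all that currently works). The pixman_region32 calls are real
pixman API; the surface struct and the function are made up:

#include <stdint.h>
#include <pixman.h>

struct example_surface {
        int32_t x, y;                   /* surface position, global */
        pixman_region32_t damage;       /* accumulated global damage */
};

/* Client said: "in this new buffer, this rectangle changed",
 * in surface-local coordinates. */
static void
example_surface_damage(struct example_surface *surface,
                       int32_t sx, int32_t sy,
                       int32_t width, int32_t height)
{
        pixman_region32_union_rect(&surface->damage, &surface->damage,
                                   surface->x + sx, surface->y + sy,
                                   width, height);
}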
- pq