[PATCH weston] xdg-shell: Make stable
Pekka Paalanen
ppaalanen at gmail.com
Tue Aug 26 00:24:49 PDT 2014
On Mon, 25 Aug 2014 21:51:57 -0700
Jason Ekstrand <jason at jlekstrand.net> wrote:
> Just a couple quick comments below.
>
> I can't find where this goes, so I'm putting it here: Why are we having
> compositors send an initial configure event again? Given that we have a
> serial, tiling compositors can just send a configure and wait until the
> client responds to the event. Non-tiling compositors will just send 0x0
> for "I don't care" right? I seem to recall something about saving a
> repaint on application launch in the tiling case but I don't remember that
> well. I'm not sure that I like the required roundtrip any more than the
> extra repaint. It's probably a wash.
There is more to configure than just the size. As for making the
initial configure event optional, well...
If clients are not required to wait for the initial configure event,
they will draw in whatever state they happen to choose. If the
compositor disagrees with that state, the first drawing is wasted:
the compositor will ask for a new one and simply not use the first,
because showing it would glitch.
But since configure is about more than just the size, the compositor
cannot even check whether it disagrees. There is no request
corresponding to the configure event that says "this is the state I
used to draw"; there is only ack_configure.
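Just to illustrate what I mean on the client side (the signatures
here are my guess from the proposed xdg_surface interface, so treat
this as a sketch, not the actual implementation):

/* Rough client-side sketch; event/request signatures are assumed from
 * the proposed stable xdg_surface interface. */
struct window {
	struct xdg_surface *xdg_surface;
	int32_t pending_width, pending_height;
	uint32_t pending_serial;
};

static void
handle_configure(void *data, struct xdg_surface *xdg_surface,
		 int32_t width, int32_t height,
		 struct wl_array *states, uint32_t serial)
{
	struct window *win = data;

	/* Remember only the latest advertised state; later configure
	 * events override earlier ones. */
	win->pending_width = width;
	win->pending_height = height;
	win->pending_serial = serial;
}

/* When committing the buffer drawn for that state, the client can only
 * say "I have seen configure N", not "this is the exact state I drew
 * with". */
static void
commit_frame(struct window *win, struct wl_surface *surface,
	     struct wl_buffer *buffer)
{
	xdg_surface_ack_configure(win->xdg_surface, win->pending_serial);
	wl_surface_attach(surface, buffer, 0, 0);
	wl_surface_damage(surface, 0, 0,
			  win->pending_width, win->pending_height);
	wl_surface_commit(surface);
}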
So if you want to make the initial configure event optional, you'll
need more changes to the protocol.
I do think an unconditional roundtrip at window creation is better
than an occasionally wasted drawing, even if the protocol were fixed
to not have the state problem.
We just have to make sure that any features which require feedback
during window creation can be conflated into one and the same
roundtrip: the client sends a series of requests with a
wl_display_sync at the end, the server replies with possibly multiple
(e.g. configure) events, each overriding the previous, and once the
sync callback fires, the latest configure event is the correct one.
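Something roughly like this on the client side, using
wl_display_roundtrip(), which is just wl_display_sync() plus
dispatching until the callback comes back (again, the xdg_surface
request names are assumed from the proposal, so this is only a
sketch):

/* Sketch only: conflate all window-creation feedback into one
 * roundtrip. */
static void
init_window_state(struct wl_display *display,
		  struct xdg_surface *xdg_surface)
{
	/* Send the whole initial window state in one burst... */
	xdg_surface_set_title(xdg_surface, "example");
	xdg_surface_set_app_id(xdg_surface, "org.example.app");
	xdg_surface_set_maximized(xdg_surface);

	/* ...then one sync.  wl_display_roundtrip() is wl_display_sync()
	 * plus dispatching until the callback fires.  Any configure
	 * events the compositor sent in between have been handled by
	 * now, and the latest one is the state to draw the first frame
	 * with. */
	wl_display_roundtrip(display);
}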
So yeah, we could make the initial configure event optional, but we
cannot remove the roundtrip, just in case some compositor actually
wants to do initial configuration efficiently.
That is still worth considering in the spec: do we require all
compositors to send the initial configure event, or do we only give
them the option, and say that clients really should do a
wl_display_sync once they have set up the xdg_surface state?
> On Aug 22, 2014 1:48 PM, "Jasper St. Pierre" <jstpierre at mecheye.net> wrote:
> >
> > On Wed, Aug 6, 2014 at 9:39 AM, Pekka Paalanen <ppaalanen at gmail.com> wrote:
> >>
> >> On Thu, 17 Jul 2014 17:57:45 -0400
> >> "Jasper St. Pierre" <jstpierre at mecheye.net> wrote:
> >>
> >> Oh btw. we still don't have the popup placement negotiation protocol
> >> (to avoid popup going off-screen), but the draft I read a long time ago
> >> assumed that the client knows in advance the size of the popup
> >> surface. We don't know the size here. I'm not sure if solving that
> >> would change something here.
> >
> >
> > The compositor can pivot the menu around a point, which is very likely
> > going to be the current cursor position. Specifying a box would be overall
> > more correct if the popup instead stemmed from a button (so it could appear
> > on all sides of the box) but I don't imagine that clients will ever use
> > this on a button. We could add it for completeness if you're really
> > concerned.
>
> Just thinking out loud here but why not just have the client send a list of
> locations in order of preference. That way the client can make its popups
> more interesting without breaking the world. Also, the client probably
> wants feedback on where the popup is going to be before it renders. I'm
> thinking about GTK's little popup boxes with the little pointer thing that
> points to whatever you clicked to get the menu. (I know they have a name, I
> just can't remember it right now. They're all over the place in GNOME
> shell.)
That brings us back to the original idea of the popup placement
protocol: after a probe or two, the client knows where the popup is
placed and can render accordingly.
> >> > <entry name="fullscreen" value="2" summary="the surface is fullscreen">
> >> > The surface is fullscreen. The window geometry specified in the configure
> >> > event must be obeyed by the client.
> >>
> >> Really? So, will we rely on wl_viewport for scaling low-res apps to
> >> fullscreen? No provision for automatic black borders in aspect ratio or
> >> size mismatch, even if the display hardware would be able to generate
> >> those for free while scanning out the client buffer, bypassing
> >> compositing?
> >
> >
> >> Since we have a big space for these states, I suppose we could do those
> >> mismatch cases in separate and explicit state variants of fullscreen,
> >> could we not?
> >
> >
> > I explicitly removed this feature from the first draft of the patch
> > simply to make my life easier as a compositor writer. We could add
> > additional states for this, or break fullscreen into multiple states:
> > "fullscreen + size_strict" or "fullscreen + size_loose" or something.
> >
> > I am not familiar enough with the sixteen different configurations of the
> > old fullscreen system to make an informed decision about what most clients
> > want. Your help and experience is very appreciated. I'm not keen to add
> > back the matrix of configuration options.
>
> Yeah, not sure what to do here. I like the idea of the compositor doing it
> for the client. Sure, you could use wl_viewport, subsurfaces, and a couple
> black surfaces for letterboxing. However that is going to be far more
> difficult for the compositor to translate into overlays/planes than just
> the one surface and some scaling instructions.
I don't think it would be that hard for a compositor to use overlays
even then. Have one surface with a 1x1 wl_buffer, scaled with
wl_viewport to fill the screen, and then have another surface on top
with the video wl_buffers being fed in, scaled with wl_viewport to
keep the aspect ratio. A compositor can easily put the video on an
overlay, and if the CRTC hardware supports it, it might even drop the
black surface and replace it with a background color.
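A rough sketch of that setup, assuming the wl_scaler/wl_viewport
interfaces (wl_viewport.set_destination is the version 2 request) and
a 1x1 black wl_buffer already attached to the main surface elsewhere:

static void
setup_fullscreen_video(struct wl_compositor *compositor,
		       struct wl_subcompositor *subcompositor,
		       struct wl_scaler *scaler,
		       struct wl_surface *black_surface,
		       struct wl_buffer *video_buffer,
		       int32_t output_w, int32_t output_h,
		       int32_t video_w, int32_t video_h)
{
	struct wl_viewport *bg_vp, *video_vp;
	struct wl_surface *video_surface;
	struct wl_subsurface *sub;
	int32_t dst_w, dst_h;

	/* Background: stretch the 1x1 black buffer over the whole output. */
	bg_vp = wl_scaler_get_viewport(scaler, black_surface);
	wl_viewport_set_destination(bg_vp, output_w, output_h);

	/* Video size on screen: fit inside the output, keep aspect ratio. */
	dst_w = output_w;
	dst_h = video_h * output_w / video_w;
	if (dst_h > output_h) {
		dst_h = output_h;
		dst_w = video_w * output_h / video_h;
	}

	/* Video: a centered sub-surface on top of the black one. */
	video_surface = wl_compositor_create_surface(compositor);
	sub = wl_subcompositor_get_subsurface(subcompositor,
					      video_surface, black_surface);
	wl_subsurface_set_position(sub, (output_w - dst_w) / 2,
				   (output_h - dst_h) / 2);

	video_vp = wl_scaler_get_viewport(scaler, video_surface);
	wl_viewport_set_destination(video_vp, dst_w, dst_h);

	wl_surface_attach(video_surface, video_buffer, 0, 0);
	wl_surface_commit(video_surface);
	/* Committing the parent applies the synced sub-surface state. */
	wl_surface_commit(black_surface);
}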
In my previous reply, I concluded that wl_viewport+wl_subsurface
would be enough (I surprised myself), and that we would not really
need yet another way via xdg_surface. Obviously, I forgot something
that last night's IRC discussion brought back to my mind.
The wl_shell fullscreening has three different cases for dealing with
a size mismatch between the output and the window:
- aspect-correct scaling
- centered, no scaling
- please, I would really like a mode switch
There is also a fourth case, which means the client does not care at
all what the compositor does. This protocol was designed before
sub-surfaces or wl_viewport were a thing.
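For reference, this is the existing wl_shell request that carries
those hints; the framerate is in mHz and is only meaningful with the
mode switch (driver) method:

/* The fullscreen methods map to the cases above:
 *   WL_SHELL_SURFACE_FULLSCREEN_METHOD_SCALE   - aspect-correct scaling
 *   WL_SHELL_SURFACE_FULLSCREEN_METHOD_FILL    - centered, no scaling
 *   WL_SHELL_SURFACE_FULLSCREEN_METHOD_DRIVER  - please switch the video mode
 *   WL_SHELL_SURFACE_FULLSCREEN_METHOD_DEFAULT - the client does not care
 */
static void
ask_for_mode_switch(struct wl_shell_surface *shsurf, struct wl_output *output)
{
	/* Prefer a mode switch, ideally to 60 Hz (60000 mHz); output
	 * may be NULL to let the compositor pick one. */
	wl_shell_surface_set_fullscreen(shsurf,
					WL_SHELL_SURFACE_FULLSCREEN_METHOD_DRIVER,
					60000, output);
}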
If we do not have that in xdg_shell, but instead rely on
wl_viewport+wl_subsurface, there are two consequences:
- scaling vs. centered is no longer a hint, but it is dictated by the
client
- the mode switch case is lost
(- all desktop compositors are required to implement both wl_scaler
and wl_subcompositor)
The first point is likely not an issue, but the second may very well
be. If xdg_surface requires fullscreen windows to be the exact output
size, and clients obey, there is no way to trigger the mode switch.
I suspect there are at least gamers out there who would not like this.
In fact, they would be pushed back to the way X11 works right now: if
you want a mode switch, use a helper app to permanently change the
output resolution (which screws up your whole desktop layout), then
launch the game, and afterwards switch back manually.
No, sorry, that would actually be *worse* than the situation on X11
right now.
Recalling that, I do think we need something to support the mode switch
case. A crucial part of a video mode for a gamer is the monitor refresh
rate, which is why wl_shell_surface.set_fullscreen includes the
framerate parameter.
Also remember that the mode switch is not meant to be mandatory just
because a client requested it. It is only a preference: when this app
is active (and top-most?), it would really like the video mode to be
changed to the nearest one compatible with the window. The compositor
is free to switch between the game's mode and the native desktop mode
at will. Minimize the game? Sure, switch back to the native desktop
video mode. Bring the game back - switch back to the game mode.
Alt+tab? Maybe switch only when something other than the game gets
activated.
The axiom here is that people (e.g. gamers) sometimes really want to
run an application in a different video mode than the normal desktop
one. It has been true in the past; can anyone claim it is not true
nowadays? Or can anyone claim these people's use cases do not matter
enough?
Giulio also had some reasons to prefer the wl_shell way that he
mentioned in IRC. Giulio, could you elaborate here?
Thanks,
pq