[PATCH wayland v3] protocol: Add minimize/maximize protocol
Pekka Paalanen
ppaalanen at gmail.com
Tue Mar 19 01:49:06 PDT 2013
Hi,
just a fly-by comment from the sickbed here (well, recovering already).
On Mon, 18 Mar 2013 14:24:17 -0600
Scott Moreau <oreaus at gmail.com> wrote:
> Yes and again, I would like to thank you for taking the time out to
> address this. I now have a couple of other outstanding cases I would
> like to introduce to the discussion. These are cases to consider when
> thinking about what descriptions should be used for the minimize
> protocol documentation.
>
> 1) We want live minimized surface previews. This means that we want to
> be able to display a minimized surface in several cases and have the
> client still rendering frames of whatever it draws. For example, a
> hover action on the panel's window list item could pop-up a scaled
> down preview of the surface. Another use case is for 'scale' (compiz)
> or 'expose' (osx) effect. We want to be able to view minimized surfaces
> in this mode as well. There are plenty of other use cases. The
> question is, how should the semantics work here? Should the client
> become unminimized for this case? Or should the client do nothing but
> just always keep rendering and attaching frames even in the minimized
> case?
>
> 2) Minimized surfaces do not receive frame events. The current
> implementation removes the surface from the compositor render list of
> surfaces to draw. This means that a) the surface has no assigned
> output and b) does not get sent frame events while minimized. I
> thought about this and ended up with a separate minimized surface list
> that contains the list of currently minimized surfaces. Should output
> repaint send frame events to minimized surfaces as well?
>
I don't think you should tie minimized state to the frame event
processing at all. Currently that is just a by-product of Weston
implementation details. To me, these two are completely orthogonal
concepts.
You will have protocol to set a surface minimized or normal. Minimized
probably means that the surface is not shown as it normally is. It
could be scaled down to a stamp somewhere, not shown at all, or
whatever. I didn't check how you defined what minimized means, or if
you even did.
Frame events are tied to the surface being visible. If a compositor
decides to show a surface, whether in minimized state or not, in the
normal way, scaled down, or in any other way that makes it actually
visible, it should emit frame events accordingly.
The question "should the client become unminimized?" in 1) becomes moot
with this definition, since minimized state would not exclude
visibility.
My answer to the question of 2) is "yes, as usual". If a surface is
visible, it is being repainted, hence frame events should fire.
By "visible" I really mean on the pixel level: if no pixels of a surface
have a chance to affect the compositor output, the surface is not
visible. It is not simply a question of whether a weston_surface is on
some list or has an output assigned; occlusions and transformations
count too. I don't recall how far this is implemented in Weston, but
this is how I understand the idea of frame events.
As such, clients should not make too many assumptions about when frame
events will or will not occur. This is especially important for the soft
real-time applications I discuss below.
> 3) Currently, almost all Wayland clients depend on frame events to
> drive their rendering. If the clients are not sent these frame events,
> the client rendering will stall. This causes problems for video players
> reliant on the frame event. Should we expect all clients to render
> without depending on the frame events? It is not clear from the frame
> event protocol description. If this is the case, we should add it to
> the protocol description and change the clients accordingly. As I
> understand, eglSwapBuffers() automatically blocks on each call to sync
> properly. So even if you called it at the 'wrong' time (i.e. outside
> of the frame event handler) it would still sync in the egl case. I'm
> not sure about how shm would work here.
I believe video players should be like any real-time presentation
application, including games, when they use the standard wl_surface
attach/damage/commit sequence to update the window. You send new frames
to the server only when new frame events come, unless you want to
minimize latency by triple-buffering in the client. While waiting for
frame events, you continue the real-time processing regardless: games
run their physics and controls etc., video players continue decoding
the video at the required framerate and produce audio. The
synchronization of e.g. video to audio must be adjusted continuously
as always, and if the application wants, it can try to predict the
video latency from the frame event timestamps.
Whether a stalling frame callback should pause the real-time processing
is application specific. Usually the event that pauses the real-time
processing is something else, like a key press or minimizing a video
player. But that is up to the application; it might want to continue
playback regardless, since the audio is probably heard anyway. Decoded
video frames are then just thrown away instead of sent to the server,
since it is hard to skip decoding. Games would preferably skip calling
their rendering routines.
Btw. eglSwapBuffers behaviour will become adjustable, if it is not
already, by implementing eglSwapInterval in the EGL Wayland platform.
That will also allow EGL apps to render continuously, regardless of
frame events, or rather, to explicitly sync to frame events themselves
when they want to.
Thanks,
pq