[PATCH wayland] protocol: define further the behavior of input on the presence of grabs

Peter Hutterer peter.hutterer at who-t.net
Tue Jul 14 19:35:23 PDT 2015

late to the party, but anyway:

On Tue, Jun 09, 2015 at 07:30:31PM +0200, Carlos Garnacho wrote:
> A good piece of this commit is actually defining as protocol
> documentation the de-facto behavior in weston and other compositors.
> Additionally, a wl_seat.release_input event is being added, and
> the cases where it should be emitted have been added in the relevant
> places. In short, the compositor should emit it on all surfaces
> where there was any input interaction that is being redirected
> away by any means, although some degree of sloppiness is allowed
> here, and clients are recommended to expect it even on surfaces
> where nothing is really going on.
> With this event in place, we can define further how the
> different input interfaces behave at either side of the input
> redirection induced by grabs. Taking this out of "undefined" or
> "compositor dependent" land is the final goal of this commit.

I'm a bit worried you're conflating multiple issues that have some overlap
but aren't the same.

all the touch issues sound like focus management issues to me.
there's one use-case that I thought about: you interact with an
application with one finger held down, trigger an action you _know_ will
cause a popup, then use another finger to ack the popup while holding
the original finger down, continuing the original action.

a rough idea:
that use-case would call for touch focus events.  a touch-focus-out signals
"touch still active but not yours". the compositor may discard any
events while out of focus (but doesn't have to) and can send the enter event
when the touch comes back. since usually we assume touch==focus you wouldn't
send the enter event by default, only if the touch started elsewhere or it's
a returning touch.

the client can keep the state until the touch is ended or cancelled, or
resume from that state once the touch is focused again.
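The rough idea above can be modeled client-side as a small per-touch
state machine. A minimal sketch in C, where the touch-focus-in/out
events are hypothetical (they stand in for the proposed protocol
additions, not anything in wayland.xml today):

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical per-touch-point state. touch-focus-out means
 * "touch still active but not yours"; touch-focus-in (enter)
 * means the touch came back to this surface. */
enum touch_state { TOUCH_NONE, TOUCH_ACTIVE, TOUCH_UNFOCUSED };

struct touch_point {
	enum touch_state state;
};

static void handle_down(struct touch_point *t)
{
	t->state = TOUCH_ACTIVE;
}

/* touch-focus-out: keep the local state, but stop acting on events */
static void handle_focus_out(struct touch_point *t)
{
	if (t->state == TOUCH_ACTIVE)
		t->state = TOUCH_UNFOCUSED;
}

/* touch-focus-in: resume from the state we kept */
static void handle_focus_in(struct touch_point *t)
{
	if (t->state == TOUCH_UNFOCUSED)
		t->state = TOUCH_ACTIVE;
}

static void handle_up_or_cancel(struct touch_point *t)
{
	t->state = TOUCH_NONE;
}

/* Motion is only acted upon while this surface has touch focus. */
static bool should_apply_motion(const struct touch_point *t)
{
	return t->state == TOUCH_ACTIVE;
}
```

The point of the sketch: the client never discards its state on
focus-out, only stops applying events, so resuming on focus-in is free.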

> Signed-off-by: Carlos Garnacho <carlosg at gnome.org>
> ---
>  protocol/wayland.xml | 77 +++++++++++++++++++++++++++++++++++++++++++++++++++-
>  1 file changed, 76 insertions(+), 1 deletion(-)
> diff --git a/protocol/wayland.xml b/protocol/wayland.xml
> index c3b8ae4..3017ff0 100644
> --- a/protocol/wayland.xml
> +++ b/protocol/wayland.xml
> @@ -668,6 +668,16 @@
>  	wl_surface is no longer used as the icon surface. When the use
>  	as an icon ends, the current and pending input regions become
>  	undefined, and the wl_surface is unmapped.
> +
> +	After this call, the compositor will consume events from all
> +	input capabilities; compositors are free to implement additional
> +	behavior with input other than the pointer/touch sequence driving
> +	the drag-and-drop operation.
> +
> +	This request should trigger the emission of seat.release_input
> +	events on, at least, the surfaces that are currently being
> +	interacted with; the surface passed in the origin argument
> +	is implicitly included there.
>        </description>
>        <arg name="source" type="object" interface="wl_data_source" allow-null="true"/>
>        <arg name="origin" type="object" interface="wl_surface"/>
> @@ -890,6 +900,9 @@
>  	This request must be used in response to a button press event.
>  	The server may ignore move requests depending on the state of
>  	the surface (e.g. fullscreen or maximized).
> +
> +	This request will trigger the emission of seat.release_input
> +	events on all interacted surfaces, including this one.
>        </description>
>        <arg name="seat" type="object" interface="wl_seat" summary="the wl_seat whose pointer is used"/>
>        <arg name="serial" type="uint" summary="serial of the implicit grab on the pointer"/>
> @@ -901,6 +914,9 @@
>  	is being dragged in a resize operation. The server may
>  	use this information to adapt its behavior, e.g. choose
>  	an appropriate cursor image.
> +
> +	This request will trigger the emission of seat.release_input
> +	events on all interacted surfaces, including this one.
>        </description>
>        <entry name="none" value="0"/>
>        <entry name="top" value="1"/>
> @@ -1478,7 +1494,7 @@
>      </request>
>     </interface>
> -  <interface name="wl_seat" version="4">
> +  <interface name="wl_seat" version="5">
>      <description summary="group of input devices">
>        A seat is a group of keyboards, pointer and touch devices. This
>        object is published as a global during start up, or when such a
> @@ -1549,6 +1565,34 @@
>        <arg name="name" type="string"/>
>      </event>
> +    <!-- Version 5 additions -->
> +    <event name="release_input">
> +      <description summary="release all input">
> +	This event notifies that the given surface will temporarily or
> +	permanently stop receiving input from the given capabilities, so
> +	it should prepare to undo any interaction with these.
> +
> +	The situations where this event may be emitted are varied; some
> +	examples are:
> +	- When a popup is shown by this or other client.
> +	- When a drag-and-drop operation is initiated from this or
> +	  any other surface.
> +
> +	The common denominator in these situations is that input is being
> +	redirected partly or entirely somewhere else, so this client
> +	should forget about any current interaction, for example:
> +	- Unset key repeat timeouts
> +	- Undo the effect of pressed pointer buttons
> +	- Cancel ongoing touch sequences
> +
> +	Smart compositors will only send this event to surfaces currently
> +	being focused/interacted with; nonetheless, clients should prepare
> +	for receiving this on surfaces that aren't currently being
> +	interacted with.
> +      </description>
> +      <arg name="surface" type="object" interface="wl_surface"/>
> +      <arg name="capabilities" type="uint"/>

honestly, I think this is too generic and a
one-size-doesnt-quite-fit-anybody :)

it already doesn't apply to wl_keyboard because that's where the focus
handling works fine apparently.
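for reference, here's roughly what the proposed event asks of clients
on receipt — a sketch assuming the wl_seat capability bits from the
existing protocol, with the per-surface state structure being entirely
hypothetical:

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

/* wl_seat.capability bits as defined in wayland.xml */
enum { SEAT_POINTER = 1, SEAT_KEYBOARD = 2, SEAT_TOUCH = 4 };

/* Hypothetical per-surface interaction state a client might keep. */
struct input_state {
	bool key_repeat_armed;
	bool button_pressed;
	int active_touches;
};

/* What seat.release_input asks: forget any ongoing interaction
 * for the released capabilities on this surface. */
static void handle_release_input(struct input_state *s, uint32_t caps)
{
	if (caps & SEAT_KEYBOARD)
		s->key_repeat_armed = false;   /* unset key repeat timeouts */
	if (caps & SEAT_POINTER)
		s->button_pressed = false;     /* undo pressed-button effects */
	if (caps & SEAT_TOUCH)
		s->active_touches = 0;         /* cancel ongoing sequences */
}
```

which illustrates the one-size problem: each capability already has (or
wants) its own focus/cancel mechanism, and the bitmask just multiplexes
them.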

> +    </event>
>    </interface>
>    <interface name="wl_pointer" version="3">
> @@ -1561,6 +1605,22 @@
>        events for the surfaces that the pointer is located over,
>        and button and axis events for button presses, button releases
>        and scrolling.
> +
> +      In the time between a button press and a button release, the
> +      pointer can be considered "implicitly grabbed" on the surface:
> +      motion events will be received even if the pointer moves out
> +      of the surface, and if the release happens outside the
> +      surface, the leave event can only come after the button release.
> +
> +      Input may be actively redirected somewhere else while such an
> +      implicit grab is in effect (popups are one such case). If this
> +      happens, a seat.release_input event is expected on the first
> +      surface, and the implicit grab is thereby considered broken.
> +
> +      The new grabbing surface will not be implicitly grabbed; it
> +      will nonetheless be able to receive pointer.enter, pointer.motion
> +      and the eventual pointer.button with "released" state.

what you need here is a button cancelled event (or state?).
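a cancelled state would let the client distinguish "undo" from "commit"
cleanly. a sketch, where BTN_CANCELLED is the hypothetical addition (the
released/pressed values match wl_pointer.button_state in wayland.xml):

```c
#include <assert.h>
#include <stdbool.h>

/* wl_pointer.button_state: released = 0, pressed = 1.
 * BTN_CANCELLED is the hypothetical third state discussed here. */
enum button_state { BTN_RELEASED = 0, BTN_PRESSED = 1, BTN_CANCELLED = 2 };

struct drag {
	bool in_progress;
	bool committed;   /* only a real release commits the action */
};

static void handle_button(struct drag *d, enum button_state state)
{
	switch (state) {
	case BTN_PRESSED:
		d->in_progress = true;
		break;
	case BTN_RELEASED:
		if (d->in_progress)
			d->committed = true;
		d->in_progress = false;
		break;
	case BTN_CANCELLED:
		/* implicit grab broken (e.g. a popup took over):
		 * undo, don't commit */
		d->in_progress = false;
		break;
	}
}
```

with only pressed/released, the client can't tell a cancelled press from
a completed one, which is exactly the ambiguity the quoted text works
around.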

>      </description>
>      <enum name="error">
> @@ -1665,6 +1725,14 @@
>  	enter event.
>          The time argument is a timestamp with millisecond
>          granularity, with an undefined base.
> +
> +        Clients should note that pressed/released events may not be
> +        paired, most commonly due to input being actively taken away
> +        or being redirected into this surface (e.g. popups); either
> +        state must be expected to be received separately. If no such
> +        input redirection happened in between, the same surface that
> +        received the "pressed" state is expected to receive the
> +        "released" state too.

I'm assuming this is the key event, the context is a bit tricky here :)
the focus handling is already working, so no need for the input_release.
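in practice the existing focus handling already forces clients to
tolerate unpaired key events — a sketch of a pressed-key set that does
(the key_state values match wl_keyboard.key_state; the structure itself
is just an illustration):

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

#define MAX_KEYS 256

/* Pressed-key set that tolerates unpaired events: a release for a
 * key we never saw pressed is harmless, and all held keys can be
 * dropped wholesale when keyboard focus leaves. */
struct key_set {
	bool pressed[MAX_KEYS];
};

enum { KEY_RELEASED = 0, KEY_PRESSED = 1 };  /* wl_keyboard.key_state */

static void handle_key(struct key_set *k, uint32_t key, uint32_t state)
{
	if (key >= MAX_KEYS)
		return;
	k->pressed[key] = (state == KEY_PRESSED);
}

/* On wl_keyboard.leave, forget all held keys (and key repeat). */
static void handle_leave(struct key_set *k)
{
	for (int i = 0; i < MAX_KEYS; i++)
		k->pressed[i] = false;
}
```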

>        </description>
>        <arg name="serial" type="uint"/>
> @@ -1844,6 +1912,13 @@
>        with a down event, followed by zero or more motion events,
>        and ending with an up event. Events relating to the same
>        contact point can be identified by the ID of the sequence.
> +
> +      For all client purposes, events for individual IDs are always
> +      expected to be received on the same surface. When input is
> +      redirected somewhere else, the affected touch contacts should
> +      be considered lost, even if the client owns the surface
> +      it is being redirected to (eg. a popup). This can be known
> +      through the seat.release_input event.

this is a focus issue, as mentioned above.

for tablets:
you need two different paths I think: one where a button is down and gets
cancelled on dnd, similar to the pointer button cancel above. the other one
where the pen touches the surface, which is closer to the touch focus issue.

for buttonsets:
no idea, actually :) since they're defined as "non-pointer moving devices"
it's a bit tricky what should happen here. I'd almost go for a keyboard-like
focus behaviour for buttons (press + focus leave may happen, deal with
it) and touch-focus-like semantics for the special axes.


>      </description>
>      <event name="down">
> -- 
> 2.4.2
> _______________________________________________
> wayland-devel mailing list
> wayland-devel at lists.freedesktop.org
> http://lists.freedesktop.org/mailman/listinfo/wayland-devel
