[PATCH] Add touch events to protocol.

Chase Douglas chase.douglas at canonical.com
Thu Jul 7 09:12:04 PDT 2011


On 07/07/2011 07:28 AM, Kristian Høgsberg wrote:
> On Tue, Jul 5, 2011 at 1:08 PM, Chase Douglas
> <chase.douglas at canonical.com> wrote:
>> Hi Laszlo,
>>
>> On 06/29/2011 07:54 AM, Laszlo Agocs wrote:
>>>
>>>  From f656362511e2622e3cde6062e156b59a83b50e03 Mon Sep 17 00:00:00 2001
>>> From: Laszlo Agocs <laszlo.p.agocs at nokia.com>
>>> Date: Wed, 29 Jun 2011 17:51:29 +0300
>>> Subject: [PATCH] Add touch events to protocol.
>>>
>>> ---
>>>   protocol/wayland.xml |   39 +++++++++++++++++++++++++++++++++++++++
>>>   1 files changed, 39 insertions(+), 0 deletions(-)
>>>
>>> diff --git a/protocol/wayland.xml b/protocol/wayland.xml
>>> index fd54245..874fd5f 100644
>>> --- a/protocol/wayland.xml
>>> +++ b/protocol/wayland.xml
>>> @@ -461,6 +461,45 @@
>>>         <arg name="surface" type="object" interface="wl_surface"/>
>>>         <arg name="keys" type="array"/>
>>>       </event>
>>> +
>>> +    <!-- A notification that is to be sent at least once to each
>>> +         client, defining the range of coordinates used by the touch
>>> +         device. -->
>>> +    <event name="touch_configure">
>>> +      <arg name="min_x" type="int" />
>>> +      <arg name="max_x" type="int" />
>>> +      <arg name="min_y" type="int" />
>>> +      <arg name="max_y" type="int" />
>>> +    </event>
>>> +
>>> +    <event name="touch_down">
>>> +      <arg name="time" type="uint"/>
>>> +      <arg name="id" type="int" />
>>> +      <arg name="x" type="int" />
>>> +      <arg name="y" type="int" />
>>> +    </event>
>>> +
>>> +    <event name="touch_up">
>>> +      <arg name="time" type="uint"/>
>>> +      <arg name="id" type="int" />
>>> +    </event>
>>> +
>>> +    <event name="touch_motion">
>>> +      <arg name="time" type="uint"/>
>>> +      <arg name="id" type="int" />
>>> +      <arg name="x" type="int" />
>>> +      <arg name="y" type="int" />
>>> +    </event>
>>
>> What about pressure or shape? I don't know the Wayland protocol yet, but
>> is it possible to send a map of properties and values sort of like
>> valuators?
>>
>>> +
>>> +    <!-- Indicates the end of a contact point list. -->
>>> +    <event name="touch_frame">
>>> +    </event>
>>> +
>>> +    <!-- Sent if the compositor decides the touch stream is a global
>>> +         gesture. No further events are sent to the clients from that
>>> +         particular gesture. -->
>>> +    <event name="touch_cancel">
>>> +    </event>
>>>     </interface>
>>
>> I never understood the concept behind "touch cancel". If I'm a client,
>> I'm supposed to sit around waiting for a touch cancel event at any
>> point? Is it bounded in time, or could I get a touch cancel event 20
>> seconds into a stream of motion events? I don't see a way around
>> conveying, implicitly or explicitly, whether a touch is "accepted" or
>> "rejected" at some point, which is what XInput 2.1 is aiming to do.
>> However, the exact mechanisms may be different in Wayland since we
>> don't have the old X cruft to deal with.
> 
> Yes, any stream of touch events, no matter how long, can be cancelled
> by the compositor at any time.  The use cases are when the compositor
> recognizes a global gesture (three finger pinch to go to home screen
> or so) or if an important system event happens (incoming phone call or
> such).
> 
> Whether and how to undo the effects of the touch events up until the
> cancel is an application policy decision.  If you're just scrolling a
> browser window, it probably doesn't make sense to undo the scrolling.
> Even if you're, say, painting in a paint app, it probably makes sense
> to treat cancel as end and commit the paint action, provided the user
> can undo that if necessary.

I think undoing automatically would be better, but I'm not a paint app
developer, and it's beside the point :).

> In general, undoing the touch stream is similar to undo, which most
> apps are already handling.  From an application point of view, if you
> have to handle unowned events (that's the XI 2.1 term, right?) for a
> small initial window (100 ms, 3 s, or a week), you end up with the same
> application-side complexity, and I don't see what apps can do
> differently once they know that the remaining event stream belongs to
> them.

Let's say the WM responds to a short flick gesture. The user performs
the entire gesture before the WM has decided whether it matches the
flick pattern. Now we've sent the touch stream in its entirety to the
Wayland client, and we have to cancel it after the fact.

This leaves the client in a precarious situation. It has received the
entire stream of touch events for a touch, but it doesn't yet know
whether the stream will be cancelled. The client may be buffering the
application state so it can undo changes if it receives a touch cancel.
We don't want to force it to buffer state changes forever, so we must
give developers some guidance on how to handle this situation. Do we
say that after 1 second the client can assume it owns the touch stream?
That's hackish and may produce annoying side effects if the machine is
swapping and goes "dead" for a few seconds.
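
Concretely, the client-side handling might look something like the
sketch below. This is illustrative only: the handler names are invented
to mirror the proposed touch_* events, and the "application state" is
reduced to a single scroll offset so the example is self-contained.

  /* Hypothetical client-side handling of an uncertain touch stream. */
  #include <stdbool.h>

  static int  scroll_offset;  /* live application state              */
  static int  snapshot;       /* state captured at touch_down        */
  static bool stream_live;    /* true while a cancel could roll back */

  static void on_touch_down(int id, int x, int y)
  {
      (void)id; (void)x;
      snapshot = scroll_offset;  /* remember state for a possible undo */
      stream_live = true;
      scroll_offset = y;         /* act on the event optimistically    */
  }

  static void on_touch_motion(int id, int x, int y)
  {
      (void)id; (void)x;
      scroll_offset = y;         /* keep scrolling optimistically */
  }

  static void on_touch_up(int id)
  {
      (void)id;
      /* The contact has ended, but touch_cancel may still arrive.
       * Nothing tells us when stream_live can be cleared and the
       * snapshot forgotten -- that is the timeout problem above. */
  }

  static void on_touch_cancel(void)
  {
      if (stream_live)
          scroll_offset = snapshot;  /* roll the scrolling back */
      stream_live = false;
  }

The hole is in on_touch_up(): without an ownership signal, there is no
point at which the client can safely drop the snapshot.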

With XI 2.1 we get around this issue by leaving touches as unowned until
a client explicitly accepts the events. An entire touch stream from
beginning to end may exist in the unowned state; once a client accepts
or rejects it, the remaining clients are notified so they can determine
what to do.
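
For comparison, the explicit step might look roughly like the sketch
below. Note this is written against the draft multitouch API, so the
names here (XIAllowTouchEvents(), XIAcceptTouch, XIRejectTouch) may not
match whatever finally ships, and gesture_matches() is a stand-in for
real recognition logic.

  #include <stdbool.h>
  #include <X11/extensions/XInput2.h>

  static bool gesture_matches(const XIDeviceEvent *ev)
  {
      (void)ev;
      return false;  /* placeholder: real code inspects the stream */
  }

  static void handle_touch(Display *dpy, XIDeviceEvent *ev)
  {
      if (gesture_matches(ev)) {
          /* Claim the stream: other clients are told to discard it,
           * and no cancel can arrive for it afterwards. */
          XIAllowTouchEvents(dpy, ev->deviceid, ev->detail,
                             ev->event, XIAcceptTouch);
      } else if (ev->evtype == XI_TouchEnd) {
          /* We never matched: pass the stream to the next client
           * rather than leaving it unowned indefinitely. */
          XIAllowTouchEvents(dpy, ev->deviceid, ev->detail,
                             ev->event, XIRejectTouch);
      }
  }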

-- Chase

