[PATCH] Add touch events to protocol.

laszlo.p.agocs at nokia.com
Fri Jul 8 05:29:08 PDT 2011


Hi Chase,

True, some additional properties, like pressure and area, are missing. However, this is intentional: the aim of the patch was to introduce the basics of touch event handling with a minimal set of events and parameters.

The problem with the extra touch properties is that they may not be available, or may not be used, on some systems, and forcing the transmission of e.g. a fake pressure value for every single point feels like a waste of bandwidth in those cases.

These properties should somehow be optional in the touch point event (e.g. via an optional map of values, as you mentioned), but I am not quite sure how that could be achieved. Suggestions are welcome.
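For illustration only, here is one hypothetical shape for that: carrying optional per-point data as an array argument, so devices without pressure or area simply send an empty array. The "properties" argument and its encoding are assumptions, not part of the patch under discussion.

```xml
<!-- Hypothetical variant of touch_down: optional per-point properties
     carried as an array of (property id, value) pairs. An empty array
     means no extra data, so devices lacking pressure/area sensing pay
     no bandwidth cost. -->
<event name="touch_down">
  <arg name="time" type="uint"/>
  <arg name="id" type="int"/>
  <arg name="x" type="int"/>
  <arg name="y" type="int"/>
  <arg name="properties" type="array"/>
</event>
```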

Regards,
Laszlo

-----Original Message-----
From: ext Chase Douglas
Sent:  05/07/2011, 19:08
To: Agocs Laszlo.P (Nokia-MP-Qt/Tampere)
Cc: wayland-devel at lists.freedesktop.org
Subject: Re: [PATCH] Add touch events to protocol.

Hi Laszlo,

On 06/29/2011 07:54 AM, Laszlo Agocs wrote:
>
>  From f656362511e2622e3cde6062e156b59a83b50e03 Mon Sep 17 00:00:00 2001
> From: Laszlo Agocs <laszlo.p.agocs at nokia.com>
> Date: Wed, 29 Jun 2011 17:51:29 +0300
> Subject: [PATCH] Add touch events to protocol.
>
> ---
>   protocol/wayland.xml |   39 +++++++++++++++++++++++++++++++++++++++
>   1 files changed, 39 insertions(+), 0 deletions(-)
>
> diff --git a/protocol/wayland.xml b/protocol/wayland.xml
> index fd54245..874fd5f 100644
> --- a/protocol/wayland.xml
> +++ b/protocol/wayland.xml
> @@ -461,6 +461,45 @@
>         <arg name="surface" type="object" interface="wl_surface"/>
>         <arg name="keys" type="array"/>
>       </event>
> +
> +    <!-- A notification that is to be sent at least once to each
> +         client, defining the range of coordinates used by the touch
> +         device. -->
> +    <event name="touch_configure">
> +      <arg name="min_x" type="int" />
> +      <arg name="max_x" type="int" />
> +      <arg name="min_y" type="int" />
> +      <arg name="max_y" type="int" />
> +    </event>
> +
> +    <event name="touch_down">
> +      <arg name="time" type="uint"/>
> +      <arg name="id" type="int" />
> +      <arg name="x" type="int" />
> +      <arg name="y" type="int" />
> +    </event>
> +
> +    <event name="touch_up">
> +      <arg name="time" type="uint"/>
> +      <arg name="id" type="int" />
> +    </event>
> +
> +    <event name="touch_motion">
> +      <arg name="time" type="uint"/>
> +      <arg name="id" type="int" />
> +      <arg name="x" type="int" />
> +      <arg name="y" type="int" />
> +    </event>

What about pressure or shape? I don't know the wayland protocol yet, but
is it possible to send a map of properties and values sort of like
valuators?

> +
> +    <!-- Indicates the end of a contact point list. -->
> +    <event name="touch_frame">
> +    </event>
> +
> +    <!-- Sent if the compositor decides the touch stream is a global
> +         gesture. No further events are sent to the clients from that
> +         particular gesture. -->
> +    <event name="touch_cancel">
> +    </event>
>     </interface>

I never understood the concept behind "touch cancel". If I'm a client,
am I supposed to sit around waiting for a touch cancel event at any
point? Is it bounded in time, or could I get a touch cancel event 20
seconds into a stream of motion events? I don't see a way to get around
conveying whether a touch is "accepted" or "rejected"
(implicitly or explicitly) at some point, which is what XInput 2.1 is
aiming to do. However, the exact mechanisms may be different in Wayland
since we don't have the old X cruft to deal with.

Thanks!

-- Chase
