[PATCH libinput] Add an API for touchpad gesture events
Hans de Goede
hdegoede at redhat.com
Wed Feb 18 04:24:22 PST 2015
Hi,
On 17-02-15 22:55, Peter Hutterer wrote:
> On Thu, Feb 12, 2015 at 12:27:49PM +0100, Hans de Goede wrote:
>> For touchscreens we always send raw touch events to the compositor, and the
>> compositor or application toolkits do gesture recognition. This makes sense
>> because on a touchscreen, knowing which window / widget the touches are over
>> is important context for interpreting gestures.
>>
>> On touchpads, however, we never send raw events: a touchpad is an absolute
>> device whose primary function is to produce pointer motion deltas, so we
>> always need to do processing (and a lot of it) on the raw events.
>>
>> Moreover, there is nothing underneath the finger which influences how to
>> interpret gestures, and a lot of touchpad- and libinput-configuration-specific
>> context is necessary for gesture recognition. E.g. is this a clickpad,
>> and if so, are softbuttons or clickfinger used? What is the size of the
>> softbuttons? Is this a true multi-touch touchpad or a semi-multi-touch
>> touchpad which only gives us a bounding box enclosing the fingers? Etc.
>>
>> So for touchpads it is better to do gesture processing in libinput. This
>> commit adds an initial implementation of a gesture event API which only
>> supports swipe gestures; other gestures will be added later following the
>> same model wrt having clear start and stop events and the number of fingers
>> involved being fixed once a gesture sequence starts.
>>
>> Signed-off-by: Hans de Goede <hdegoede at redhat.com>
>
> looks good to me, one minor change below:
>
>> + * Gesture sequences always start with a LIBINPUT_EVENT_GESTURE_FOO_START
>> + * event. All following gesture events will be of the
>> + * LIBINPUT_EVENT_GESTURE_FOO type until a LIBINPUT_EVENT_GESTURE_FOO_END is
>
> LIBINPUT_EVENT_GESTURE_FOO -> LIBINPUT_EVENT_GESTURE_FOO_UPDATE
Good one, fixed in the version which I'm about to send out.
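
For reference, a compositor-side consumer of the swipe API would then look
roughly like the minimal sketch below. The event type names follow this patch
revision (with the _UPDATE suffix as per your comment), and the accessor names
(libinput_event_get_gesture_event(), _get_finger_count(), _get_dx()/_get_dy())
are my assumption of what the patch exposes, modelled on the existing pointer
event API; treat them as illustrative, not final.

/* Minimal sketch: handle swipe gesture events from this patch.
 * Event type and accessor names are assumptions based on this
 * patch revision, not necessarily the final API. */
#include <libinput.h>
#include <stdio.h>

static void
handle_event(struct libinput_event *event)
{
        struct libinput_event_gesture *gesture;

        switch (libinput_event_get_type(event)) {
        case LIBINPUT_EVENT_GESTURE_SWIPE_START:
                /* the finger count is fixed for the whole sequence */
                gesture = libinput_event_get_gesture_event(event);
                printf("swipe start, %d fingers\n",
                       libinput_event_gesture_get_finger_count(gesture));
                break;
        case LIBINPUT_EVENT_GESTURE_SWIPE_UPDATE:
                gesture = libinput_event_get_gesture_event(event);
                printf("swipe delta %.2f/%.2f\n",
                       libinput_event_gesture_get_dx(gesture),
                       libinput_event_gesture_get_dy(gesture));
                break;
        case LIBINPUT_EVENT_GESTURE_SWIPE_END:
                printf("swipe end\n");
                break;
        default:
                break;
        }
}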
> one other comment: I think we should make gestures a capability. the API is
> separate from the pointer interface and it's not clear which devices can send
> gestures just by looking at the pointer interface. the simple change to make
> this a capability resolves that issue.
Good idea, I've added a patch for this to the patch-set which I'm about to
send out.
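
For completeness, the client-side check then becomes a one-liner; a sketch,
assuming the new capability ends up named LIBINPUT_DEVICE_CAP_GESTURE (the
exact enum name is of course up to that patch):

/* Sketch of the capability check; LIBINPUT_DEVICE_CAP_GESTURE is an
 * assumed name for the capability added by the follow-up patch. */
#include <libinput.h>

static int
device_supports_gestures(struct libinput_device *device)
{
        return libinput_device_has_capability(device,
                                              LIBINPUT_DEVICE_CAP_GESTURE);
}

A compositor can then route gesture events only for devices that actually
report the capability, instead of guessing from the pointer interface.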
Regards,
Hans