[PATCH libinput 01/11] Add an API for touchpad gesture events
Bill Spitzak
spitzak at gmail.com
Wed Feb 18 11:36:18 PST 2015
On 02/18/2015 04:26 AM, Hans de Goede wrote:
> For touchscreens we always send raw touch events to the compositor, and the
> compositor or application toolkits do gesture recognition. This makes sense
> because on a touchscreen, knowing which window / widget the touches are over
> is important context for interpreting gestures.
>
> On touchpads, however, we never send raw events, since a touchpad is an absolute
> device whose primary function is to send pointer motion deltas, so we always
> need to do processing (and a lot of it) on the raw events.
I don't quite understand this distinction. I can certainly imagine
touchpad gestures that depend on where the cursor currently is.
Conversely, I can imagine users wanting full-screen gestures on their
touchscreen; for instance, a fast swipe should not be sent to the widget
under the touch-down event, but to wherever the cursor was before the
swipe started.
I think what is needed is for gesture recognition to be done in libinput
while the raw events are still sent. Then a program can ignore the gesture
if it wants, and new gestures can be added without making older programs
freeze when you attempt them. (For an implementation, I would add a field
to events indicating whether they are the last in a set.)
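To make that idea concrete, here is a rough C sketch. All names in it
(sketch_touch_event, sketch_gesture_type, last_in_set, and so on) are
made up purely for illustration and are not part of the libinput API;
it only shows how a client that understands a gesture could consume it,
while a client that does not would simply keep reading the raw events.

/*
 * Hypothetical sketch only: illustrates raw events carrying a gesture
 * tag plus a "last in set" flag. None of these names exist in libinput.
 */
#include <stdbool.h>
#include <stddef.h>
#include <stdio.h>

enum sketch_gesture_type {
	SKETCH_GESTURE_NONE,    /* plain touch event, no gesture recognized */
	SKETCH_GESTURE_SWIPE,
	SKETCH_GESTURE_PINCH,
};

struct sketch_touch_event {
	double x, y;                      /* raw touch position */
	enum sketch_gesture_type gesture; /* gesture this event belongs to */
	bool last_in_set;                 /* proposed flag: last event of the set */
};

/* A client that understands swipes consumes them; anything else falls
 * back to the raw events, so unknown gestures never "freeze" old code. */
static void handle_event(const struct sketch_touch_event *ev)
{
	switch (ev->gesture) {
	case SKETCH_GESTURE_SWIPE:
		printf("swipe update at %.1f,%.1f%s\n",
		       ev->x, ev->y, ev->last_in_set ? " (done)" : "");
		break;
	default:
		/* Unrecognized or no gesture: treat as an ordinary touch. */
		printf("raw touch at %.1f,%.1f\n", ev->x, ev->y);
		break;
	}
}

int main(void)
{
	struct sketch_touch_event evs[] = {
		{ 10, 10, SKETCH_GESTURE_SWIPE, false },
		{ 40, 12, SKETCH_GESTURE_SWIPE, false },
		{ 90, 15, SKETCH_GESTURE_SWIPE, true  },
	};

	for (size_t i = 0; i < sizeof(evs) / sizeof(evs[0]); i++)
		handle_event(&evs[i]);

	return 0;
}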
The only unavoidable effect of gesture recognition is that the focus may
not match the location of the cursor, in that the raw events are sent to
the same target as the gesture. I think this is necessary to keep the
gestures consistent.