[PATCH libinput 01/11] Add an API for touchpad gesture events
Peter Hutterer
peter.hutterer at who-t.net
Thu Feb 19 16:49:03 PST 2015
On Thu, Feb 19, 2015 at 01:15:31PM -0800, Bill Spitzak wrote:
> I think I'm not explaining my question right.
>
> I fully agree that it is correct for libinput to do gesture recognition.
>
> My question is why you think this should not be done for touch screens.
>
> I think it should be done for them, and for every other input device in the
> world (including mundane things like detecting a double-click on a mouse,
> differentiating a click from a drag, signalling when the user "hovers", or
> repeating keyboard keys).
>
> Maybe I misread your text, but it basically sounded like "gestures on a
> touch screen must be done by clients". That seems wrong since you have
> already concluded this is not right for other touch devices.
On a direct-touch device you need context to detect what a specific gesture
is. One example: you have two touches on your screen, and the two touches
move diagonally towards each other for some distance. Is this:
a) a pinch-to-zoom gesture
b) moving two objects on a widget towards each other (e.g. two waypoints on
a map)
c) moving two windows of different clients towards each other
Unless you have the context you cannot know, and the only thing that has
that context is the client. Sure, you can make all sorts of exceptions ("but
double-tap should always be double-tap"), but that just changes the caliber
you're going to shoot yourself in the foot with.
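To make that concrete, here is a rough sketch of the client's half of that
decision. Every name in it is invented for illustration (this is not a real
toolkit API); the point is only that telling a) from b) requires a hit-test
against state that only the client has:

#include <math.h>
#include <stdbool.h>
#include <stdio.h>

struct touch_point { double x, y; };

/* Invented helper: is there a draggable object (e.g. a map waypoint)
 * under this touch point? Only the client can answer this. */
static bool
hit_draggable_object(const struct touch_point *p)
{
        (void)p;
        return false; /* stub for the sketch */
}

static double
touch_distance(const struct touch_point *a, const struct touch_point *b)
{
        return hypot(a->x - b->x, a->y - b->y);
}

/* Called when both touches have moved; prev/cur hold the previous and
 * current positions of the two touch points. */
static void
handle_two_touch_motion(const struct touch_point prev[2],
                        const struct touch_point cur[2])
{
        if (hit_draggable_object(&cur[0]) && hit_draggable_object(&cur[1])) {
                /* case b): move the two objects individually */
                printf("drag two objects\n");
                return;
        }

        /* case a): converging/diverging touches are a pinch-to-zoom */
        printf("pinch, scale factor %.2f\n",
               touch_distance(&cur[0], &cur[1]) /
               touch_distance(&prev[0], &prev[1]));
}

int
main(void)
{
        struct touch_point prev[2] = { { 0, 0 }, { 100, 100 } };
        struct touch_point cur[2] = { { 20, 20 }, { 80, 80 } };

        handle_two_touch_motion(prev, cur);
        return 0;
}

Case c) is worse still: no single client can even see both touches, so the
same ambiguity lands on the compositor.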
> On 02/19/2015 12:22 AM, Peter Hutterer wrote:
>
> >raw events (I'm going to assume you mean touch events) have no meaning...
>
> That was probably the wrong term. I don't mean raw data from a device; what
> I meant was that the client gets the exact same events during a gesture
> that it would get if the gesture was not recognized, such as
> press/move/release type events. It is true they would be deferred until the
> gesture is recognized, and the compositor would deliver them to the client
> that gets the gesture, so they are not 100% identical. But they would still
> be useful when new gestures are added, since older clients would keep
> working.
No, we tried that in X and it's an unmaintainable mess with corner cases
that are most likely unsolvable. Sending two event streams for the same
event is a nightmare.
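For contrast, here is a minimal sketch of what the single event stream from
this series looks like on the compositor side. The names below follow the
gesture API as it eventually shipped in libinput and may differ from the
patch under review, but the principle is what matters: while a pinch is in
progress the compositor receives pinch events only, never a parallel
pointer event stream describing the same finger motion.

#include <libinput.h>
#include <stdio.h>

static void
handle_gesture_event(struct libinput_event *event)
{
        struct libinput_event_gesture *gesture;

        switch (libinput_event_get_type(event)) {
        case LIBINPUT_EVENT_GESTURE_PINCH_BEGIN:
                gesture = libinput_event_get_gesture_event(event);
                printf("pinch begin, %d fingers\n",
                       libinput_event_gesture_get_finger_count(gesture));
                break;
        case LIBINPUT_EVENT_GESTURE_PINCH_UPDATE:
                gesture = libinput_event_get_gesture_event(event);
                printf("pinch: scale %.2f, rotation delta %.2f deg\n",
                       libinput_event_gesture_get_scale(gesture),
                       libinput_event_gesture_get_angle_delta(gesture));
                break;
        case LIBINPUT_EVENT_GESTURE_PINCH_END:
                gesture = libinput_event_get_gesture_event(event);
                /* e.g. a third finger came down mid-gesture */
                if (libinput_event_gesture_get_cancelled(gesture))
                        printf("pinch cancelled\n");
                else
                        printf("pinch end\n");
                break;
        default:
                break;
        }
}

The compositor then maps these onto whatever it exposes to its clients;
nobody ever has to reconcile two descriptions of the same physical motion.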
Cheers,
Peter
> >We've had ubiquitous gestures for years now and we've learned one thing:
> >there is no use case for gestures beyond the few that we'll provide anyway
> >(tap, swipe, pinch, rotate). Providing the framework to support more than
> >that is pointless. And we're trying to provide a useful input stack, not
> >one that requires you to choose which way your touchpad won't work.
>
> That could be an explanation: if you really believe the set of gestures
> will never expand, then raw events are unnecessary, since all Wayland
> clients will handle all possible gestures.
>
> >libinput is not in the business of focus handling; that's the compositor's
> >job.
>
> Yes, I was writing this based on that assumption.