[PATCH libinput 01/11] Add an API for touchpad gesture events
spitzak at gmail.com
Fri Feb 20 12:08:14 PST 2015
On 02/19/2015 04:49 PM, Peter Hutterer wrote:
> unless you have the context you cannot know. and the only thing to have
> that context is the client. sure you can make all sorts of exceptions ("but
> double-tap should always be doubletap") but that just changes the caliber
> you're going to shoot yourself in the foot with.
Okay, but I am still not seeing a clear answer to "why are touch pads
and touch screens different?" Perhaps the client has to do gestures, but
it seems to me that the answer is going to be the same whether the
controlling surface is on the screen surface or not.
The fact that you can point to something on the screen does not seem
enough of a reason. All touch pads I have ever seen have a cursor on the
screen, and when the user presses down they expect to be pressing where
that cursor is, thus providing exactly the same level of "context" as a
touch screen.
>> That was probably the wrong term. I don't mean raw data from a device, what
>> I meant was that the client gets the exact same events during a gesture that
>> it would get if the gesture was not recognized, such as press/move/raise
>> type events. It is true they would be deferred until the gesture is
>> recognized, and the compositor will deliver them to the client that gets the
>> gesture, so they are not 100% identical, but they will be useful if new
>> gestures are added so older clients still work.
> no, we tried that in X and it's an unmaintainable mess that has corner cases
> that are most likely unsolvable. sending two event streams for the same
> event is a nightmare.
But in Wayland you don't have to keep backwards compatibility. The
events would be clearly marked as belonging together, so there is no
longer any ambiguity about which events describe the same interaction.
In addition, there would be events like "this is part of a foo-gesture
but should be ignored if you are interpreting the foo-gesture".