[PATCH libinput 01/11] Add an API for touchpad gesture events

Kyle Evans kevans at android-x86.org
Sat Feb 21 05:29:54 PST 2015


> Okay, but I am still not seeing a clear answer to "why are touch pads
> and touch screens different?" Perhaps the client has to do gestures, but
> it seems to me that the answer is going to be the same whether the
> controlling surface is on the screen surface or not.
>
> The fact that you can point to something on the screen does not seem
> enough of a reason. All touch pads I have ever seen have a cursor on the
> screen and when the user presses down they expect to be pressing where
> that cursor is, thus providing exactly the same level of "context" as a
> touch screen.

Actually, all touch pads you have ever seen were just configured that
way by convention. I've been playing around with Android on my tablet PC,
and the touchpad can be used in three different modes of operation:
pointer, touchScreen, and touchPad. Pointer is the conventional method;
touchScreen treats the device like a touchscreen (a cursor can be
enabled, but you don't know where it will land until you touch); and
touchPad seems to be an unfinished gesture mode (I'm guessing it's meant
for things like zoom and scroll, perhaps like touchScreen but without
the "click" gesture).
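For the curious, and going from memory rather than from anything in this
patch set, those three modes look like they map onto Android's input
device configuration (.idc) files, where a touch.deviceType property
decides how a touch device is interpreted. Something along these lines
selects the mode; the file name here is just an illustrative placeholder:

  # /system/usr/idc/Vendor_0001_Product_0001.idc  (hypothetical device)
  #
  # Treat the built-in touchpad as if it were a direct touchscreen:
  # touches land at the absolute position on the panel, not at a cursor.
  touch.deviceType = touchScreen

  # Other values I have seen: pointer (conventional cursor mode),
  # touchPad, and default (let Android guess from the device capabilities).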

I use touchScreen because it makes navigation so much nicer and keeps
the screen clean. If I want to be precise, I can use the actual
touchscreen.

My two cents,
Kyle
