gsoc 2013 idea - Customizable gestures
peter.hutterer at who-t.net
Sun Apr 14 03:30:16 PDT 2013
On 13/04/13 01:34, Michal Suchanek wrote:
> On 12 April 2013 13:50, Alexander E. Patrakov <patrakov at gmail.com> wrote:
>> 2013/4/12 Peter Hutterer <peter.hutterer at who-t.net>:
>>> Hi guys,
>>> Unfortunately, the entry for gesture recognition in the synaptics driver
>>> should have not been on the list. synaptics is the wrong place in the stack
>>> to do gesture recognition. we support a minimal set of gestures and they
>>> already give us more headache than benefit. full gesture recognition in the
>>> synaptics driver would be an unmaintainable nightmare. for that reason, even
>>> if you could get it to work in a proof-of-concept I would not merge the
>>> result into the upstream driver.
>> I can understand this position. However, this also poses a question:
>> what counts as a gesture and what doesn't. E.g., on a clickpad, one
>> can click in the bottom right part of the pad in order to get this
>> recognized as a "right button click". Or, one can swipe along the
>> right edge in order to scroll. Are these two examples gestures, or
> They are synaptics-specific gestures. There is no reason why any other
> absolute input device could not make such gestures available.
In theory that is correct, but we don't have the infrastructure in place to
share these. That's largely for historical reasons, and because it too
would make maintenance harder - another API to track. There are additional
subtleties such as differing device capabilities, backwards compatibility
with existing gestures, etc. We don't have a decent way to configure these
either; properties help, but they are a fairly rough way to control them.
On top of that, the interaction between the gestures is complicated.
None of this is impossible to fix, it's just hard.
The gestures we do have also (mostly) translate into other pointer events:
scrolling and tapping both translate to mouse button events. True gestures
aren't that simple - a pinch gesture may be a zoom, have a rotation
component, etc. So true gestures are difficult in the driver, where we
have no context.
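To illustrate why a pinch is harder than a tap or an edge scroll: it doesn't reduce to a button event, it carries continuous zoom and rotation components that only the client can interpret. The sketch below is purely illustrative (not synaptics code) and decomposes the motion of two touch points into those components:

```python
import math

def pinch_components(a_start, b_start, a_end, b_end):
    """Decompose two-finger motion into (zoom_factor, rotation_radians).

    Each argument is an (x, y) touch position; 'a' and 'b' are the two
    fingers, sampled at the start and end of the gesture. Illustrative
    only - a real recognizer would also filter noise and track state.
    """
    # Vector from finger a to finger b at start and end of the gesture.
    vx0, vy0 = b_start[0] - a_start[0], b_start[1] - a_start[1]
    vx1, vy1 = b_end[0] - a_end[0], b_end[1] - a_end[1]

    # Zoom: ratio of finger separation; rotation: change of vector angle.
    zoom = math.hypot(vx1, vy1) / math.hypot(vx0, vy0)
    rotation = math.atan2(vy1, vx1) - math.atan2(vy0, vx0)
    return zoom, rotation
```

For example, fingers moving from (0,0)/(2,0) to (0,0)/(0,4) yield a zoom of 2.0 with a quarter-turn rotation - two independent outputs that have no sensible encoding as mouse button events.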
> I would
> gladly turn off multitouch gestures and replace them with these more
> usable synaptics gestures on my wacom tablet.
> It's true that gestures are usually understood in a relative sense - e.g.
> a left-to-right two-finger swipe in any part of the touch surface. But
> absolute gestures that are performed on a particular part of the touch
> surface are required to support devices with touch buttons (some
> iPad-like tablets) and legacy synaptics behaviour.
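The absolute, region-bound behaviour described above amounts to hit-testing a touch position against configured zones of the surface - for instance a clickpad's soft-button areas. A minimal sketch, assuming hypothetical zone fractions and the core X button numbering (1 = left, 3 = right):

```python
# Illustrative only - not actual driver code. Zones are expressed as
# fractions of the touch surface, so the same table works for any
# absolute device regardless of its physical resolution.
BUTTON_ZONES = [
    # (x_min, y_min, x_max, y_max, logical_button), coords in 0..1
    (0.50, 0.82, 1.00, 1.00, 3),  # bottom-right area -> right button
    (0.00, 0.82, 0.50, 1.00, 1),  # bottom-left area  -> left button
]

def resolve_press(x_frac, y_frac):
    """Map an absolute press position to a logical button, or None."""
    for x0, y0, x1, y1, button in BUTTON_ZONES:
        if x0 <= x_frac <= x1 and y0 <= y_frac <= y1:
            return button
    return None
```

A press anywhere outside the configured zones falls through to None, which a driver would treat as an ordinary left click or ignore, depending on policy.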