[RFC libinput] Add an API for touchpad gesture events

Hans de Goede hdegoede at redhat.com
Thu Jan 29 00:17:27 PST 2015


Hi,

On 29-01-15 07:11, Peter Hutterer wrote:
> On Wed, Jan 28, 2015 at 04:02:20PM +0100, Hans de Goede wrote:
>> Hi,
>>
>> On 28-01-15 14:02, Carlos Garnacho wrote:
>>> Hey Hans,
>>>
>>> On Wed, 2015-01-28 at 08:38 +0100, Hans de Goede wrote:
>>
>> <snip>
>>
>>>>> Ok, I'll change this to mm for v3 then (just send v2 addressing the other
>>>>> review comments and adding a core implementation).
>>>>
>>>> So I've been thinking more about this and I'm still not fully convinced, let
>>>> me explain.
>>>>
>>>> The problem is that the edges where a swipe begins and ends may vary based
>>>> on config settings, and even per swipe.
>>>>
>>>> e.g. If a clickpad is using softbutton areas on the bottom, and the user
>>>> swipes up, the swipe gesture will not start until the fingers have left the
>>>> bottom button area, because before that they are seen as fingers resting to
>>>> click the buttons, so for that gesture the bottom edge would be the top edge
>>>> of the button area.
>>>>
>>>> But if the user swipes down, they can use the bottom button area, because
>>>> fingers there are only excluded from the gesture when they started there.
>>>>
>>>> Now we could always only count the non soft-button area as the area to which
>>>> absolute gesture coordinates should be compared to see where on the touchpad
>>>> the gesture is, but what if clickfinger is enabled, then there is no soft
>>>> button area?  Then what if the user changes the clickfinger setting ?
>>>>
>>>> A solution to this would be putting the relevant area in the gesture event,
>>>> and documenting that it may differ per event and that users should always
>>>> compare absolute coordinates to the area relevant for that device. But this
>>>> does not sound like a very nice API to me, and also makes me worry about the
>>>> wayland protocol.
>>>
>>> I'm maybe missing something, but how is the area relevant for anything
>>> else than finding out whether a touch is eligible to start a gesture?
>>> After that all that should matter is the relative position of the
>>> involved touches.
>>
>> Quoting from the actual patch you're replying to:
>>
>> "* Return the absolute x coordinate of the current center position of a
>>   * swipe gesture. This can be used e.g. to determine if a swipe is starting
>>   * close to a touchpad edge, or to synchronize an animation with how many
>>   * percent of the width of a touchpad a swipe gesture has traveled."
>>
>> AFAIK mac os x does something like this (animation based on how far along
>> the touchpad a swipe gesture has moved) for some transitions; I want to
>> keep this use case open in the API.
>
> to clarify those two, from what I can tell on OS X 10.9.5:
> - there is a "swipe left from the right edge with two fingers" gesture that
> can be selected to show the Notification Center. That's the only one that
> has some spatial arrangement in it.

Interesting, so this does bring us back to 2fg swipes. I guess that we
could eventually differentiate 2fg events like this:

1) started with the fingers close to each other near an edge, moving away
from the edge at a 90 degree angle: 2fg swipe
2 & 3) else do stuff to differentiate between a scroll and a zoom/rotate

> - the animations aren't tied to the width of the touchpad, the one where the
> animation follows the finger merely uses 2-3 cm of the touchpad while
> pinching. afaict nothing uses the actual width of the touchpad directly.

Ah, good, so let's drop the whole absolute coords thingie then.

> sorry if these weren't clear in the original emails where we discussed this
> note that at least in the system preferences the selection of gestures
> available is relatively small and contained, it's not a free-for-all.
>
> There is one gesture for three-finger dragging but that seems to be the
> equivalent of button-down, pointer motion rather than being a direct mapping
> of the touchpad to the screen area or somesuch.
>
>>>> All in all this sounds very much NOT KISS, and I'm a big fan of KISS, so I
>>>> really believe that the initially proposed 0.0 - 1.0 scale is much better,
>>>
>>> But we get back here to the stretched rectangle case :(. IMO there
>>> should be a way to know about the proper w/h ratios at least.
>>
>> This is highly unlikely to be used in a mode where we're interested in
>> anything other than purely vertical or horizontal movement.
>>
>> Note this is only about swipes, later on a separate gesture event type will
>> be added for zoom/rotate, and there only the deltas matter, and those use
>> a normalized resolution, so that a square gesture on the touchpad results
>> in identical dx and dy events / scales.
>>
>>>> it already gives apps way more info than the current gtk and mac os x APIs
>>>> do, as those both only give deltas, while turning all the above into an
>>>> implementation detail, rather than something which we need to explain in API
>>>> docs. The above is exactly why we want to do the gesture handling inside
>>>> libinput in the first place; exporting absolute event data in mm unhides
>>>> all the nitty gritty we're trying to abstract away in the first place.
>>>
>>> I don't think it's incompatible, given that libinput determines first
>>> whether the touch is part of a gesture before letting know of it.
>>
>> Let's say the compositor will see an upward vertical swipe as being relevant
>> if it starts below a line 10% of the touchpad height from the bottom, but we
>> never send events for that because we do not start seeing the fingers as
>> active until they leave the bottom soft button area, which is 12% high...
>>
>> As said, doing this in mm means that we need to send, per gesture, the
>> bounding box within which the gesture sequence can send coordinates, as it
>> can change based on the direction of the gesture (upward vs downward) and
>> on config settings (clickfinger vs softbuttons). This is possible, but does
>> not seem like a nice API to me, also because it is counter-intuitive:
>> users may very well get the bounding box once and then store it, as they
>> will expect it to stay the same.
>>
>> The only real argument I've heard against the 0.0 - 1.0 scale, on the other
>> hand, is that it loses aspect ratio info, which tends to be utterly
>> irrelevant for swipe gestures anyway; and if it is relevant, then the
>> application can use the delta events, which are aspect-ratio correct.
>
> I'm beginning to think that maybe having a set of tags may be better here.
> e.g. left edge, right edge, corner, etc.
> This absolves us from any requirements for precision but still allows some
> use-cases. The only one that won't work is to make things dependent on the
> touchpad width.
>
> Adding an edge flag is easy to do retroactively too, so if the gesture
> starts at the bottom and we don't recognise it as gesture until it goes
> outside of the button area - that still allows us to apply the edge tag
> without messing around with coordinate scales.

I was myself also starting to think that we should not mess with absolute
coordinates at all, since it is a mess :)

I like the flags idea: we could add a started_near_edge bitfield to the
start (*) event, with bits for each of the 4 edges; this field may of
course be all 0 for gestures not started near an edge.
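As a rough sketch of what such a bitfield could look like (the enum names, the normalized coordinates, and the margin are all invented for illustration, not proposed libinput API):

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical edge flags for a started_near_edge bitfield;
 * names are illustrative only. */
enum gesture_edge {
	GESTURE_EDGE_NONE	= 0,
	GESTURE_EDGE_LEFT	= 1 << 0,
	GESTURE_EDGE_RIGHT	= 1 << 1,
	GESTURE_EDGE_TOP	= 1 << 2,
	GESTURE_EDGE_BOTTOM	= 1 << 3,
};

/* Classify a gesture start position (normalized 0.0 - 1.0, y growing
 * downward) against an edge margin; a corner start sets two bits. */
static uint32_t
gesture_edge_flags(double x, double y, double margin)
{
	uint32_t flags = GESTURE_EDGE_NONE;

	if (x < margin)
		flags |= GESTURE_EDGE_LEFT;
	if (x > 1.0 - margin)
		flags |= GESTURE_EDGE_RIGHT;
	if (y < margin)
		flags |= GESTURE_EDGE_TOP;
	if (y > 1.0 - margin)
		flags |= GESTURE_EDGE_BOTTOM;

	return flags;
}
```

Since the flags can be computed at (or even shortly after) gesture start, this fits the point above about applying the edge tag retroactively, without exposing any coordinate scale to the caller.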

Regards,

Hans



*) Maybe rename start to begin to better match end ?

