[PATCH libinput v2 0/4] touchpad gestures support

Carlos Garnacho carlosg at gnome.org
Fri Apr 24 05:39:07 PDT 2015


On Fri, Apr 24, 2015 at 1:49 PM, Hans de Goede <hdegoede at redhat.com> wrote:
> Hi,
>
> On 24-04-15 13:39, Carlos Garnacho wrote:
>>
>> Hey Hans :),
>>
>> On Thu, 2015-03-26 at 10:04 +0100, Hans de Goede wrote:
>>>
>>> Hi All,
>>>
>>> Here is v2 of my touchpad gestures support patch series, changes
>>> since v1:
>>> - Merge the gesture capability flag and event-debug patches into
>>>    the "touchpad: Add an API for touchpad gesture events"
>>> - Update the swipe and pinch/rotate implementations to work with
>>>    the new typesafe device and normalized coordinates (rebased on
>>>    latest master), this results in a nice cleanup
>>
>>
>> Over the last few days I played with these latest patches in Clutter,
>> and found the gestures to work nicely. I gathered some observations though:
>>
>>        * Higher-level gesture implementations usually have a "cancel"
>>          signal/vfunc, so it feels kind of weird not being able to
>>          fully map the behaviors. I thought about a "this gesture
>>          maybe continues into something else" flag in *_END events,
>>          although I see this working better on touchpoints being
>>          added than lifted, given how things seem to work. Anyway,
>>          just a remark; I see cancellation being more important in
>>          the protocol additions this would bring, once client vs.
>>          compositor handling is involved.
>
>
> As discussed at Fosdem, adding a flag to indicate whether the gesture
> ended normally (all fingers lifted) or by transitioning into something
> else is easy code-wise. If you see a need for that I can do so for v3
> of the patch-set.

That would be great; I guess my concern is mostly staying consistent
with wl_touch.cancel, and being able to undo ongoing actions in
clients if the compositor is taking over.
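
To illustrate, this is roughly what I'd expect to write on the client
side; the getter name is just my guess at what v3 might call it, so
treat it as a sketch:

/* Sketch only: assumes a hypothetical v3 getter along the lines of
 * libinput_event_gesture_get_cancelled(), returning non-zero when
 * the gesture ended by transitioning into something else rather
 * than by all fingers lifting. */
#include <stdio.h>
#include <libinput.h>

static void
handle_pinch_end(struct libinput_event_gesture *gesture)
{
        if (libinput_event_gesture_get_cancelled(gesture)) {
                /* Mirrors wl_touch.cancel: undo the ongoing action,
                 * e.g. snap the zoom back to its starting value. */
                printf("pinch cancelled, reverting\n");
        } else {
                printf("pinch ended normally, committing\n");
        }
}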

>
>>        * I think it is a bit unfortunate that distance in pinch
>>          gestures is relative. Higher-level equivalent "zoom"
>>          gestures usually offer absolute scale values (taking the
>>          initial distance as 1x). We do lack some information there
>>          to translate from one to the other; I guess either knowing
>>          the initial distance (does pointer acceleration apply
>>          here?) or offering similar info right away would be nice.
>
>
> I've been ping-ponging between giving deltas and an absolute distance
> in mm. I can change the patch-set to report a distance in mm instead
> of a delta from the previous distance, with the caveat that on some
> touchpads we do not know the resolution, so the unit there will not
> be mm but an unknown step size.

Hmm, that's indeed a drawback. I guess having an additional
libinput_event_gesture_get_scale() is not as stateless as you'd want?
If we have to choose between deltas and absolute distance, I feel the
latter is the more useful though...
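
For reference, this is the kind of bookkeeping a toolkit ends up doing
to turn relative deltas into an absolute scale; both helpers here are
hypothetical stand-ins for whatever the patches end up exposing, and
the initial distance is exactly the piece we're missing today:

/* Sketch of deriving an absolute zoom factor (initial finger
 * distance == 1x) from per-event distance deltas. Neither value is
 * available as such from the v2 patches; if the initial distance has
 * to be reported anyway, an absolute
 * libinput_event_gesture_get_scale() seems the simpler interface. */
struct pinch_state {
        double initial;    /* finger distance at GESTURE_PINCH_BEGIN */
        double current;    /* accumulated from per-event deltas */
};

static void
pinch_begin(struct pinch_state *s, double initial_distance)
{
        s->initial = s->current = initial_distance;
}

static double
pinch_update(struct pinch_state *s, double distance_delta)
{
        s->current += distance_delta;
        return s->current / s->initial;    /* absolute scale, start == 1x */
}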

>
>>        * From trying things out, I found that if I pinch the fingers
>>          too close (X1 Carbon 1st gen here, 2fg touchpad), and then
>>          move them together around, the pinch gesture reports
>>          distance jumps (self-cancelling overall, though). In
>>          evemu-record traces I see 2 separate tracking IDs, and I
>>          didn't see such wild coordinate jumps there, so I wonder if
>>          we can do something about it; I can investigate further if
>>          you want.
>
>
> The distance events are pretty raw (unfiltered); what you're seeing
> can happen if we get touch1 moves, sync, touch2 moves, sync. Are you
> seeing that in the evemu-record traces?

Yeah indeed, I see that happening sometimes. I thought the last
state/values for the unchanged touchpoint would apply then.

> I can try to add some averaging to the
> distance like we do with pointer move events, or I can keep giving you
> the raw events and you can filter yourself ...
>
> Downside of pushing the filtering into the toolkit is that each toolkit
> needs to redo it, so I guess it would be better to do it at the
> libinput level.

I'm quite undecided myself in that regard too. Raw touchpoints would
map much better to the way existing gesture implementations already
work; I'm finding myself adding internal interfaces and doing
touch-tracking and gesture-event handling in parallel in those, which
is feasible but kind of awkward. I guess distance averaging is less
intrusive to the way you designed this.
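
For what it's worth, what I've been experimenting with in the toolkit
is nothing fancier than an exponential moving average over the raw
deltas, along these lines (the factor is arbitrary, and this is not
the filter libinput applies to pointer motion):

struct delta_filter {
        double filtered;    /* previous filtered value, reset to 0 at
                             * GESTURE_PINCH_BEGIN */
};

static double
smooth_distance_delta(struct delta_filter *f, double raw_delta)
{
        /* Exponential moving average: 0.4 weighs the new sample;
         * lower values smooth more at the cost of latency. */
        f->filtered = 0.4 * raw_delta + 0.6 * f->filtered;
        return f->filtered;
}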

Cheers,
  Carlos

