Weston multitouch support?

Shawn Rutledge shawn.t.rutledge at gmail.com
Tue Jun 3 03:25:16 PDT 2014


On 3 June 2014 01:25, Peter Hutterer <peter.hutterer at who-t.net> wrote:
> On Mon, Jun 02, 2014 at 12:45:51PM +0100, José Expósito wrote:
>> Hi Peter,
>>
>> I have checked the libinput implementation and, correct me if I'm wrong, I
>> have seen that a two-finger click is interpreted as a right click, a
>> three-finger click is interpreted as a middle click, and there are some
>> special rules for specific trackpads, like corner clicks.
>
> there are some special rules for clickpads, specifically a click with a
> finger resting on one of the software-button areas will produce a right
> or middle click.
>
>> Does that mean that the other MT events are not sent to the clients?  Would
>> it be possible to get a two-finger pinch gesture from a QML client, for
>> example?
>
> not from a touchpad, not at this point. There are some rough plans but we've
> pretty much deferred them until we had the basics sorted with libinput.

Qt Quick was designed to take touch points directly and do its own
gesture interpretation.  But we know that we also need to support
gesture events, for OS X.  So it will be OK if pinching in Wayland is
a gesture event rather than two touch points, but we really do need to
have one or the other approach working.  It's unfortunate if a lot of
time goes by in which neither way works.  (Caveat: I've had a lot of
trouble getting a qtwayland compositor working well enough to use as
my main environment, although I'd really like to, so I'm not
up-to-date on what works and what doesn't at the moment.)
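
For illustration, here is a minimal QML sketch of the two delivery
models being discussed: a PinchArea consumes an already-recognized
pinch (roughly what a compositor-side gesture event would map to),
while a MultiPointTouchArea needs the individual touch points so that
Qt Quick can do its own interpretation.  The image source and sizes
are placeholders, and in a real UI you would pick one approach rather
than stacking both:

    import QtQuick 2.0

    Rectangle {
        width: 400; height: 400

        Image {
            id: photo
            anchors.centerIn: parent
            source: "photo.png"        // placeholder asset
        }

        // Alternative 1: a recognized pinch gesture scales the target item.
        PinchArea {
            anchors.fill: parent
            pinch.target: photo
            pinch.minimumScale: 0.5
            pinch.maximumScale: 4.0
        }

        // Alternative 2: raw touch points, so Qt Quick (or the application)
        // can interpret the gesture itself.
        MultiPointTouchArea {
            anchors.fill: parent
            minimumTouchPoints: 2
            maximumTouchPoints: 2
            touchPoints: [
                TouchPoint { id: p1 },
                TouchPoint { id: p2 }
            ]
            onUpdated: console.log("contacts:", p1.x, p1.y, "/", p2.x, p2.y)
        }
    }

Either of these works on a touchscreen today; the question in this
thread is which of them (if either) can be fed from a touchpad under
Wayland.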

Also, in X11 I do not have multi-touch interaction with the trackpad
on my ThinkPad Helix.  I suppose that's because the synaptics driver
does not provide touch events; it can only interpret a fixed set of
gestures.  The upside is that I can flick even in rxvt; the downside
is that I can't do pinch gestures anywhere, because X11 protocol
definition is such a slow process that seven years after the iPhone
introduced pinching, we still don't have a pinch event.  At some point
I was testing Qt Quick with the plain evdev driver and an Apple
Bluetooth touchpad, which did provide the actual touch points.  It was
a better experience for Qt Quick and a worse one for everything else.

We do need to have a good strategy for how this stuff is going to work
better in the future.  That's one purpose for the touch & gestures
session at the upcoming Qt Contributors Summit:
https://qt-project.org/groups/qt-contributors-summit-2014/wiki/Program
although I would be glad to delve deeper into X11 and Wayland
specifics beyond that session.  It would be good if any of you who
know the details could attend.

Flicking is a weird case because Qt Quick does its own physics: the
flicking continues after you release your finger, and there is a
bounce-back at the end.  On Apple platforms the Qt Quick behavior
doesn't match the native one, so there are discussions about how to
fix that.  Are you thinking that on Wayland the flicking should be
driven by extra events beyond the actual finger release, which keep
driving the UI to the end and then send reversed events to generate
the bounce-back?

I think the main reason for having a flick gesture at all is to enable
flicking in legacy applications which were designed to handle the
mouse wheel.  The trouble is that there then has to be a mechanism to
tell the event source where the "end" is, for non-legacy applications
which actually want to have the "bounce" or some other end-of-flick
behavior.  IMO that's an unfortunate break in encapsulation; but if
applications instead do their own flick physics, they are free to do
it differently and inconsistently.
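
For reference, here is a minimal QML sketch of the client-side
approach Qt Quick takes today: the Flickable item simulates the
post-release deceleration and the overshoot/bounce-back itself, purely
from the press/move/release events it receives, with no help from the
compositor.  The sizes and content are placeholders:

    import QtQuick 2.0

    Flickable {
        width: 320; height: 480
        contentWidth: width; contentHeight: 2000   // placeholder content size

        // Allow dragging/flicking past the edges and animate the bounce-back;
        // the deceleration after release is computed by Flickable itself.
        boundsBehavior: Flickable.DragAndOvershootBounds
        flickDeceleration: 1500

        Column {
            width: parent.width
            Repeater {
                model: 50
                Text { text: "item " + index; height: 40 }
            }
        }
    }

If the compositor instead drove the flick with synthesized
post-release events, the client would need some way to report where
its content ends so the bounce could happen on the right side; that is
the encapsulation break mentioned above.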
The same goes for other gestures.  It would be nice to put the gesture
recognition and related behavioral stuff into a library, so that it's
modular, optional, and replaceable with an alternative, and yet
consistent as long as the same library is used everywhere.  Putting
this stuff at too low a level (like inside the synaptics driver) tends
to mean that the gestures will be a fixed set, whereas it would be
nice to be able to invent new ones.  (Not that there is any framework
which makes that easy, yet...)

I think it's unfortunate if there is no way to get the actual touch
points.  It would be an acceptable compromise if the shared gesture
library can get them, and applications can get them only by explicitly
asking for them and bypassing the gesture library.  Then at least
everyone knows of a couple of accessible places to do the hacking to
add new gestures or tweak the existing ones, rather than having to
hack the things that are fixed for most users, such as device drivers
and compositors.
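
None of this exists yet, but to make the idea concrete, here is a
purely hypothetical sketch of what such a shared, replaceable gesture
layer might look like from QML.  Every name in it (GestureLibrary,
PinchRecognizer, RawTouchSource, and their properties and signals) is
invented for illustration only:

    // Hypothetical API: none of these types exist today.
    import QtQuick 2.0
    import GestureLibrary 1.0   // hypothetical shared gesture module

    Item {
        width: 400; height: 400

        // The library consumes the raw touch points and emits recognized
        // gestures; swapping the module swaps the recognition behavior
        // everywhere it is used.
        PinchRecognizer {
            anchors.fill: parent
            onPinchUpdated: parent.scale = pinchScale   // hypothetical signal/property
        }

        // An application that wants the raw points must opt in explicitly,
        // bypassing the gesture library.
        RawTouchSource {
            anchors.fill: parent
            onPointsChanged: console.log(points.length, "raw contacts")
        }
    }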

Wayland (and Qt on Wayland) should end up being more hackable than
Cocoa, and offer the same or better feature set, not limp along the
way X11 has.

