[RFC] weston: Sony clickpad support

Alexander E. Patrakov patrakov at gmail.com
Sat Aug 17 06:24:27 PDT 2013


2013/8/17 Peter Hutterer <peter.hutterer at who-t.net>:

<all points where we agree are snipped>

> on-screen keyboards have a one-to-one mapping. you press the button, you
> generate a key event. you don't have to worry about speed of pointer
> movement, direction of pointer movement, relative position of touchpoints (if
> multiple are present, etc.). the complexity to get swipe, pinch and rotate
> is higher than what you need for an on-screen keyboard.

<off-topic for this thread, also please note that this section is
strictly about UI and applies in theory to any windowing system>

Indeed, there is a class of on-screen keyboards (let's call them
simple on-screen keyboards) where there is a one-to-one mapping
between touch/release and keydown/keyup events, with exactly the same
timing and with the possibility of several keys being pressed at the
same time. They are useful: in theory, they can be used to enter text
directly, to control other input methods that expect keydown/keyup
events and produce text, or to play Quake. However, there is another
class of programs that position themselves as on-screen keyboards but
don't have this one-to-one mapping. I think we have to take them into
account too, and possibly separate them from the simple ones - which
makes sense, because they can't be used to play Quake.
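
To make "one-to-one mapping" concrete, here is a minimal standalone C
sketch with made-up names (key_button, send_key and so on - nothing
from any existing toolkit or protocol): every touch-down becomes a
keydown, every touch-up a keyup, with the touch's own timestamp, and
several keys can be held at once. The two programs below deviate from
exactly this.

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define MAX_TOUCHES 10  /* assume touch ids are 0..MAX_TOUCHES-1 */

struct key_button {
    int x, y, w, h;    /* on-screen rectangle of the key */
    uint32_t keycode;  /* e.g. an evdev keycode */
};

/* Stand-in for whatever forwards the key events to the focused client. */
static void send_key(uint32_t keycode, bool pressed, uint32_t time_msec)
{
    printf("key %u %s at %u ms\n", (unsigned)keycode,
           pressed ? "down" : "up", (unsigned)time_msec);
}

static int touch_to_key[MAX_TOUCHES];  /* per touch id: held key index, -1 = none */

static int key_under(const struct key_button *keys, int n, int x, int y)
{
    for (int i = 0; i < n; i++)
        if (x >= keys[i].x && x < keys[i].x + keys[i].w &&
            y >= keys[i].y && y < keys[i].y + keys[i].h)
            return i;
    return -1;
}

static void touch_down(const struct key_button *keys, int n, int id,
                       int x, int y, uint32_t time)
{
    touch_to_key[id] = key_under(keys, n, x, y);
    if (touch_to_key[id] >= 0)
        send_key(keys[touch_to_key[id]].keycode, true, time);
}

static void touch_up(const struct key_button *keys, int id, uint32_t time)
{
    if (touch_to_key[id] >= 0)
        send_key(keys[touch_to_key[id]].keycode, false, time);
    touch_to_key[id] = -1;
}

int main(void)
{
    /* Two 50x50 keys side by side; 30 and 31 are KEY_A and KEY_S in evdev terms. */
    struct key_button keys[] = { { 0, 0, 50, 50, 30 },
                                 { 50, 0, 50, 50, 31 } };

    for (int i = 0; i < MAX_TOUCHES; i++)
        touch_to_key[i] = -1;

    /* Two fingers held at the same time, released in a different order:
     * the key events simply mirror the touches and keep their timing. */
    touch_down(keys, 2, 0, 10, 10, 1000);
    touch_down(keys, 2, 1, 60, 10, 1050);
    touch_up(keys, 0, 1200);
    touch_up(keys, 1, 1300);
    return 0;
}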

1. Caribou (part of GNOME): if you touch "e", it won't immediately
generate the letter "e". Instead, it starts a short timer. If the
touch is released before the timer expires, it generates the letter
"e". Otherwise, it shows a popup window with various accented forms of
"e", ignores the release event, and accepts a touch-and-release in the
popup window to mean one of these accented forms.

2. The MessagEase keyboard (which you can try for free on an Android
phone) has nine main screen areas that react to touches and swipes.
E.g., the central area produces an "o" when touched and released, but
a "u" if swiped from the bottom up and a "b" if swiped from left to
right.

Of course, neither Caribou nor MessagEase can be used to play Quake,
because they do not preserve the key timing.
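
To make the Caribou behaviour concrete, here is a rough standalone C
sketch of that hold-timer logic. The function names, the printf
stand-ins and the idea of a fixed threshold are all made up for this
example (this is not Caribou's actual code); the point is that a short
tap and a long press take different paths, and that what comes out is
committed text, not key events with their original timing.

#include <stdio.h>

/* Stand-ins for the real plumbing: a toolkit timer, the popup widget and
 * the input-method connection that receives the resulting text. */
static void show_accent_popup(void) { printf("popup shown: è é ê ë\n"); }
static void hide_accent_popup(void) { printf("popup hidden\n"); }
static void commit_text(const char *utf8) { printf("commit \"%s\"\n", utf8); }

enum e_key_state { E_IDLE, E_WAITING_FOR_TIMER, E_POPUP_SHOWN };
static enum e_key_state state = E_IDLE;

/* Touch lands on the "e" key: arm a short hold timer (here the timer is
 * only simulated by calling hold_timer_fired() by hand). */
static void e_touch_down(void)
{
    state = E_WAITING_FOR_TIMER;
}

/* The timer fired before the finger was lifted: show the accent popup;
 * from now on the release of the original touch is ignored. */
static void hold_timer_fired(void)
{
    if (state == E_WAITING_FOR_TIMER) {
        state = E_POPUP_SHOWN;
        show_accent_popup();
    }
}

/* Finger lifted. Only a short tap produces the plain letter; a real
 * implementation would also cancel the pending timer here. */
static void e_touch_up(void)
{
    if (state == E_WAITING_FOR_TIMER) {
        commit_text("e");
        state = E_IDLE;
    }
    /* In E_POPUP_SHOWN the release is deliberately swallowed. */
}

/* A later touch-and-release inside the popup picks an accented form. */
static void popup_selected(const char *accented)
{
    hide_accent_popup();
    commit_text(accented);
    state = E_IDLE;
}

int main(void)
{
    /* Short tap: down, then up before the timer -> plain "e". */
    e_touch_down();
    e_touch_up();

    /* Long press: down, timer fires, the up is ignored, "é" picked in popup. */
    e_touch_down();
    hold_timer_fired();
    e_touch_up();
    popup_selected("é");
    return 0;
}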

So we need to take into account both use cases - producing
keyup/keydown events in response to touches for "simple" on-screen
keyboards and producing text for complex ones.

<end of off-topic>

> interestingly enough, i don't think this particular use-case is much
> different from the 'normal' desktop use-cases. once you see the new area on
> the touchpad as auxiliary screen  (that the pointer may not be able to
> reach) you'll find that the interaction methods will be largely the same -
> buttons to click, areas to move within, etc.

Yes, this is an interesting observation that can simplify writing
"virtual mouse" software for touchpads using existing toolkits.

>
>> 3. Compatibility.
>>
>> There is already a lot of software (e.g. weston-terminal and all gtk+
>> apps) that expects a pointer-like interface when operated with a
>> touchpad. We can't just break it.

> there's levels of breakage, one being "this won't work anymore" and the
> other one being "this won't work if not actively supported". wayland has an
> advantage over X because it says "if you don't support touch, you don't get
> touch". In X, we emulate a pointer because we can't break the existing
> use-cases. look at the code, the various corner cases, the cases where we
> found we can't do the right thing, you'll find some interesting things.
>
> short summary: you can't win, but you can pick how badly you lose.

For touchscreens, I agree with the above. For touchpads - maybe,
maybe not; it depends on how you address the need to show the pointer
in the compositor (see below). We already have a pointer interface
(and it is definitely used in the weston compositor itself), so this
is not a clean start.

>
>> 4. The need to show a pointer.
>>
>> There is a notion of the global pointer position that needs to be
>> maintained in the compositor that is used with a touchpad. Due to the
>> need to show this pointer, one can't entirely punt touchpad support to
>> applications. And we can't "punt only gestures and quirks to
>> applications", as on Sony touchpads there is an area where finger
>> movements should be ignored for the purpose of moving the pointer, and
>> the pointer is a thing that belongs to a compositor.
>
> but it's not really supposed to be ignored, is it? I don't have that sony
> touchpad but on the lenovo x220 there's a similar clickpad with painted on
> buttons. And I regularly use the button area for moving the pointer (because
> the bloody touchpad is so small). the only time I want the buttons to work
> as such is if I actually physically click. and even then, I might do a move
> to button area, press, move around to drag.

The Windows driver, at its default settings, does ignore finger
movements in the button area on my laptop. And that's a good thing for
me, because without it pixel-perfect clicks are not possible: I move
the pointer to the desired point, put another finger in the button
area and press, and that press makes the new contact area enlarge and
shift its center, resulting in a misplaced click unless movements in
the button area are ignored. Your arguments for the opposite are also
valid, which is why a user-configurable preference is a must. The real
question is about the defaults.
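
For illustration, here is roughly the shape of that policy as a
standalone C sketch. The names (struct clickpad,
ignore_motion_in_buttons and so on) are invented for this example and
are not the actual structures in evdev-touchpad.c or in my patches; it
also assumes the decision is keyed off where the touch began, which is
itself one of the design choices to settle.

#include <stdbool.h>
#include <stdio.h>

struct clickpad {
    int button_area_top_y;          /* device Y where the button strip starts */
    bool ignore_motion_in_buttons;  /* the user-configurable preference */
};

struct touch {
    bool began_in_button_area;      /* remembered at touch-down */
    int last_x, last_y;
};

static void touch_down(const struct clickpad *pad, struct touch *t, int x, int y)
{
    t->began_in_button_area = (y >= pad->button_area_top_y);
    t->last_x = x;
    t->last_y = y;
}

/* Returns true if the pointer should move by (*dx, *dy) for this event. */
static bool touch_motion(const struct clickpad *pad, struct touch *t,
                         int x, int y, int *dx, int *dy)
{
    *dx = x - t->last_x;
    *dy = y - t->last_y;
    t->last_x = x;
    t->last_y = y;

    /* A finger that was put down inside the button strip (typically the
     * second finger, there only to click) never nudges the pointer. */
    if (pad->ignore_motion_in_buttons && t->began_in_button_area)
        return false;
    return true;
}

int main(void)
{
    struct clickpad pad = { 4000, true };  /* 4000 is a made-up coordinate */
    struct touch finger = { false, 0, 0 };
    int dx, dy;

    /* Second finger placed on the button strip just to click: its small
     * accidental movement is detected but does not move the pointer. */
    touch_down(&pad, &finger, 1000, 4200);
    if (!touch_motion(&pad, &finger, 1010, 4190, &dx, &dy))
        printf("motion (%d,%d) ignored for pointer movement\n", dx, dy);
    return 0;
}

A touch that starts above the strip and only later drifts into it
keeps moving the pointer under this rule, and turning the preference
off gives the behaviour you describe; the mechanism is simple either
way, and it is the default that needs deciding.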

I think that, where possible, the default behaviour of native-platform
(Windows, Chrome OS, Mac OS X) touchpad drivers should be copied, so
that people who dual-boot don't have to learn two interaction models
(e.g. "by default, Windows ignores movement in this area, Linux
doesn't" is always a bug in Linux, even if the user is not happy with
the Windows default). And this is where the X11 synaptics driver
fails. Just for the record, a big "please": don't use its defaults
(which are not sufficiently device-specific anyway) in discussions of
what should be done by default.

I won't be surprised if the Windows driver for your touchpad also
ignores movements in the button area - please retest if that's not too
much of a burden for you. If it doesn't ignore movements in the button
area, then we need yet another interaction model in evdev-touchpad.c
that matches your default; if that is indeed the case, please notify
me and send /proc/bus/input/devices.

> don't get me wrong, I'm not saying that your approach is wrong. what I am
> arguing for is that we need some clear plan of how we are going to support
> all the quirky features that we need to eventually support.
> I am trying to at least build a list of these, but it'll take me a few more
> days I think.

Yes, there surely is a need for a clear plan of how normal touchpads,
weird touchpads and on-screen keyboards are going to be supported in
the future. I have no problem carrying the patches on my laptop.
However, I also think that there is no reason to delay support for
Sony clickpads solely because such a plan is needed, even if my
approach is wrong (but please do look at the code). We can always move
the code or reimplement the functionality elsewhere later. I will have
no problem with my code being ripped out later because of the new
plan.

From any wayland application's viewpoint, right now, any touchpad is
just a mouse. My patches don't change this and don't introduce new
interfaces, so there is no way that an application can get a
dependency on them, and no need to support them forever. If any
application would have to be rewritten due to the new plan, then this
is equally true both with and without my patches. And the amount of
work needed to implement the new plan is the same: write the plan,
implement the desired touches-to-everything-else conversion logic in
the new place, rip out the entire evdev-touchpad.c file.

-- 
Alexander E. Patrakov

