[RFC] weston: Sony clickpad support
Peter Hutterer
peter.hutterer at who-t.net
Sat Aug 17 00:50:02 PDT 2013
On Thu, Aug 15, 2013 at 07:59:29PM +0600, Alexander E. Patrakov wrote:
> 2013/8/15 Peter Hutterer <peter.hutterer at who-t.net>:
>
> > one of the things that should be done is to figure out _where_ features such
> > as this are going to be handled. In the compositor, the compositor's input
> > module, on the client side, ... ? I'm trying to figure out how to handle
> > this correctly, but don't have much to show here just yet.
>
> > For example, wl_pointer only has a button event, which means that a client
> > cannot distinguish between a tap and a button click. No doubt this should
> > be in a piece of shared code, but right now it's not quite clear what can
> > be shared where yet.
>
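to make that concrete (and this is just a sketch, not code from weston or
anything I'm proposing): whether the compositor synthesised the press from a
tap or the user physically clicked, the client's button handler sees the
exact same event, so it has nothing to branch on:

#include <linux/input.h>     /* BTN_LEFT */
#include <wayland-client.h>

static void
pointer_button(void *data, struct wl_pointer *pointer, uint32_t serial,
               uint32_t time, uint32_t button, uint32_t state)
{
    /* a tap-to-click synthesised from the touchpad and a physical
     * button (or clickpad) press both arrive here as BTN_LEFT with
     * WL_POINTER_BUTTON_STATE_PRESSED - the protocol carries no
     * "this was a tap" information the client could inspect */
    if (button == BTN_LEFT && state == WL_POINTER_BUTTON_STATE_PRESSED)
        ; /* can't tell tap from click at this point */
}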
> FWIW, I don't have a self-consistent opinion on this, but I have a
> clickpad that needs my patch or something similar :) It may sound too
> trivial, but here are the conflicting arguments to consider.
>
> 1. Consistency.
>
> I find the situation quite similar to that with on-screen keyboards
> (or maybe input methods). I mention them separately because at GUADEC
> I expressed an opinion, which Keith Packard confirmed, that on-screen
> keyboards are not necessarily input methods. Indeed, an on-screen
> keyboard converts a sequence of touches or pointer clicks into a
> series of correctly-timed key up/down events that you can (ideally)
> use to play Quake or enter text, while traditional input methods only
> produce text. OTOH Caribou, with its long-press-to-get-accents mode,
> can't get the timings right. And here and now we have a mechanism that
> converts touchpad touches into a series of pointer events and "button
> clicks" - i.e. another case of an event converter. It would look wrong
> if these two use cases for event converters had inconsistent designs.
on-screen keyboards have a one-to-one mapping. you press the button, you
generate a key event. you don't have to worry about the speed of pointer
movement, the direction of pointer movement, the relative position of
touchpoints (if multiple are present), etc. the complexity needed for swipe,
pinch and rotate is higher than what you need for an on-screen keyboard.
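a purely hypothetical sketch of that difference (made-up names, nothing from
any existing code): the keyboard case is a stateless hit-test, while even
the simplest pinch needs per-touch history and the relationship between
touchpoints over time:

#include <math.h>

/* on-screen keyboard: stateless - each touch-down hit-tests the drawn
 * layout and yields exactly one key event, no history required */

/* gestures: even a minimal pinch needs state carried across events,
 * for more than one touchpoint at a time */
struct touchpoint {
    int active;
    double x, y;              /* current position */
    double start_x, start_y;  /* position at touch-down */
};

static double
pinch_scale(const struct touchpoint *t0, const struct touchpoint *t1)
{
    double d0 = hypot(t1->start_x - t0->start_x, t1->start_y - t0->start_y);
    double d  = hypot(t1->x - t0->x, t1->y - t0->y);

    return d0 > 0.0 ? d / d0 : 1.0;   /* > 1 means spread, < 1 means pinch */
}

add direction and velocity and you quickly end up with a real recogniser.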
> 2. Complexity.
>
> Maybe off-topic: yesterday in a shop I also saw a bluetooth keyboard
> with a built-in touchpad (from a local manufacturer; it looks quite
> similar to the Rapoo E9080 but has bluetooth), and bought it for use
> as a remote control for the media center. The peculiar thing is that
> the touchpad doubles as a numpad, i.e. it has painted markings on it
> and a touch-based virtual switch that puts it either in touchpad mode
> or in numpad mode. This particular device does the necessary
> interpretation of touches in hardware, i.e. the kernel emits relative
> pointer events and button presses or key events without any special
> drivers. But I won't be surprised if new devices with painted markings
> around virtual keys appear on the market that will also need to be
> decoded in software.
there are such devices on the market, and there will be more. some touchpads
already have a "toggle on/off" button that must be handled in software.
I think some graphics tablets provide software buttons as well.
> So the software that converts touches on
> specially-marked surfaces into their device-specific intended meaning
> will likely get more complex and weird in the future. Just like input
> methods.
interestingly enough, I don't think this particular use-case is much
different from the 'normal' desktop use-cases. once you see the new area on
the touchpad as an auxiliary screen (one that the pointer may not be able to
reach) you'll find that the interaction methods are largely the same -
buttons to click, areas to move within, etc.
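in code terms that view is nothing exotic - roughly something like this
(hypothetical structures, not a proposal for any particular API): the marked
area is just a list of rectangles to hit-test, the same as buttons on a real
screen:

#include <stddef.h>
#include <stdint.h>

/* hypothetical description of one painted region on the pad */
struct pad_region {
    double x1, y1, x2, y2;   /* bounds in device coordinates */
    uint32_t keycode;        /* what the region emits when activated */
};

/* return the region under (x, y), or NULL -> treat as normal pointer
 * motion */
static const struct pad_region *
region_at(const struct pad_region *regions, size_t n, double x, double y)
{
    for (size_t i = 0; i < n; i++) {
        const struct pad_region *r = &regions[i];

        if (x >= r->x1 && x <= r->x2 && y >= r->y1 && y <= r->y2)
            return r;
    }

    return NULL;
}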
> 3. Compatibility.
>
> There is already a lot of software (e.g. weston-terminal and all gtk+
> apps) that expects a pointer-like interface when operated with a
> touchpad. We can't just break it.
this argument is the reason why we have a lot of options in the xorg input
modules. and with very mixed success. putting kinetic scrolling into the
synaptics driver was a bad idea. yes, it works everywhere, including xterm,
but it's hard for clients to detect and use properly.
two-finger scrolling suffers largely from the same problem: because we
convert it to button events, it's hard to do proper kinetics in the client.
in fact, that's why we introduced smooth scrolling, despite the need for
clients to update to handle it. now clients can do it properly, but only
some do. so we're inconsistent.
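for comparison, this is roughly what the smooth-scrolling path looks like
from a client's point of view (a sketch only, with a made-up velocity
estimate): the axis event delivers a continuous delta, so the client can run
the kinetics itself - the very thing that's hard when all it gets is button
4/5 clicks:

#include <wayland-client.h>

static double scroll_velocity;    /* device units per millisecond */
static uint32_t last_axis_time;

static void
pointer_axis(void *data, struct wl_pointer *pointer,
             uint32_t time, uint32_t axis, wl_fixed_t value)
{
    double delta;
    uint32_t dt;

    if (axis != WL_POINTER_AXIS_VERTICAL_SCROLL)
        return;

    delta = wl_fixed_to_double(value);
    dt = time - last_axis_time;
    last_axis_time = time;

    /* continuous deltas let the client estimate a velocity and keep
     * the scroll animation running after the fingers lift; discrete
     * button 4/5 events carry no magnitude to work with */
    if (dt > 0)
        scroll_velocity = delta / dt;

    /* apply delta to the view here, decay scroll_velocity on a timer */
}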
there are levels of breakage, one being "this won't work anymore" and the
other being "this won't work if not actively supported". Wayland has an
advantage over X because it says "if you don't support touch, you don't get
touch". In X, we emulate a pointer because we can't break the existing
use-cases. look at the code - the various corner cases, the places where we
found we can't do the right thing - and you'll find some interesting things.
short summary: you can't win, but you can pick how badly you lose.
> 4. The need to show a pointer.
>
> There is a notion of a global pointer position, used with a touchpad,
> that needs to be maintained in the compositor. Due to the need to show
> this pointer, one can't entirely punt touchpad support to
> applications. And we can't "punt only gestures and quirks to
> applications", as on Sony touchpads there is an area where finger
> movements should be ignored for the purpose of moving the pointer, and
> the pointer is a thing that belongs to the compositor.
but it's not really supposed to be ignored, is it? I don't have that Sony
touchpad, but on the Lenovo X220 there's a similar clickpad with painted-on
buttons. And I regularly use the button area for moving the pointer (because
the bloody touchpad is so small). the only time I want the buttons to work
as buttons is when I actually physically click. and even then, I might move
into the button area, press, then move around to drag.
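to spell out the behaviour I mean (a hypothetical sketch with made-up
coordinates and helpers, not a patch): motion always moves the pointer, only
the physical click consults the painted areas, and the choice is latched at
press time so press-then-drag keeps working:

#include <linux/input.h>   /* BTN_LEFT, BTN_RIGHT, BTN_MIDDLE */
#include <stdint.h>

/* hypothetical per-device state */
struct clickpad {
    double x, y;              /* current finger position, normalised 0..1 */
    uint32_t pressed_button;  /* latched at physical click time */
};

static uint32_t
button_for_position(double x, double y)
{
    /* hypothetical painted areas along the bottom edge of the pad */
    if (y < 0.85)
        return BTN_LEFT;          /* above the button strip */
    if (x > 0.55)
        return BTN_RIGHT;
    if (x > 0.45)
        return BTN_MIDDLE;
    return BTN_LEFT;
}

static void
handle_physical_click(struct clickpad *pad, int is_press)
{
    if (is_press) {
        /* decide once, from where the finger is right now, then latch
         * it so that dragging into another area doesn't change which
         * button we eventually release */
        pad->pressed_button = button_for_position(pad->x, pad->y);
        /* ...emit pad->pressed_button press... */
    } else {
        /* ...emit pad->pressed_button release... */
        pad->pressed_button = 0;
    }
}

/* finger motion always moves the pointer, even inside the button strip */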
don't get me wrong, I'm not saying that your approach is wrong. what I am
arguing for is that we need a clear plan for how we are going to support all
the quirky features we will eventually have to support.
I am trying to at least build a list of these, but it'll take me a few more
days, I think.
> 5. Artificial limitations.
>
> Like it or not, there will be software in the future that wants to get
> the exact finger positions and not the interpreted events. E.g.
> software that is used to configure and calibrate the touchpad, games,
> or software that wants to implement custom app-specific gestures.
> That's exactly like games that want raw keypresses and not text
> constructed by the input method.
>
> However, in my opinion, people want their hardware supported now, and
> will not wait for an architecturally-correct solution (that can be
> added later). For now, anyway, a touchpad is just something that sends
> wl_pointer events, and if it continues to be that way, applications
> won't need to be changed.
IMO this is the same argument as 3, and it has validity - until you realise
that "be backwards compatible forever" is actually a long, long time.
again, see pointer emulation for multitouch in X. Once you really commit to
something, you can't easily back out of it anymore.
and again, this is not a criticism of your code (which, tbh, I haven't
looked at in detail). it's more a "hold your horses" to avoid pushing stuff
where it will get difficult to support. and I am sorry that right now all I
have is a stopping argument; I will come up with something more useful.
Cheers,
Peter