[PATCH weston 0/5] Touchpad support

Bill Spitzak spitzak at gmail.com
Thu May 10 12:20:56 PDT 2012


Though I usually want the client in charge of everything, I think 
Wayland can, and should, do a lot of input processing.

The only requirement is that any such processing be done in a way that 
lets the client ignore it, get the raw input events, and do its own.

The primary reason is that this behavior is often subject to a lot of 
user configuration, and it is really annoying when simple, obvious 
configuration settings don't work in some programs because those 
programs have not been updated.

A secondary reason is that instead of every client setting lots of 
timers, only the compositor has to.

A simple example would be keyboard repeat. Wayland could just pass 
through the key-down and key-up events and expect clients to fake key 
repeats if the key-up does not arrive soon enough. Instead, Wayland 
sends explicit key-repeat events. Clients can identify these exactly 
and ignore them, in case a client really wants only the raw key-down 
and key-up.
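To make that concrete, here is a minimal sketch of the client side. It 
assumes a hypothetical extra key state the compositor would use to tag 
the repeats it generates; nothing like this exists in the protocol 
today, so all the names beyond the stock wl_keyboard ones are made up:

#include <stdint.h>
#include <wayland-client.h>

/* Hypothetical value a compositor could use to tag the repeats it
 * generates itself; NOT part of the real protocol. */
#define KEY_STATE_REPEATED 2

struct kbd_app {
    int wants_raw_keys;    /* this client does its own repeat */
};

static void
handle_key(struct kbd_app *app, uint32_t key, uint32_t state)
{
    /* application-specific key handling goes here */
    (void)app; (void)key; (void)state;
}

static void
keyboard_key(void *data, struct wl_keyboard *kbd, uint32_t serial,
             uint32_t time, uint32_t key, uint32_t state)
{
    struct kbd_app *app = data;
    (void)kbd; (void)serial; (void)time;

    if (state == KEY_STATE_REPEATED) {
        if (app->wants_raw_keys)
            return;    /* drop the compositor-generated repeats */
        state = WL_KEYBOARD_KEY_STATE_PRESSED;    /* treat as a press */
    }
    handle_key(app, key, state);
}

The point is that the opt-out is one early return: a client that never 
looks at the repeat tag gets sensible behavior for free.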

Even something as simple as key repeat requires a lot of configuration: 
a delay before the first repeat, then a repeat rate; you may even want 
the rate to change after the key has been held for some time, and there 
may be keys that should not repeat at all. Imagine the annoyance when 
some programs obey only a subset of these settings, or when simple demo 
Wayland programs "don't work" because handling all of this is too 
complex to put in a demo.

I don't know enough about gesture support, but it does seem like the 
client could get the touch and movement events, and then eventually get 
a "this is gesture X" event. The client can ignore that event if it 
wants to interpret touch itself, or it can act on it and ignore the 
remaining touch events.
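As a sketch of what I mean, with entirely hypothetical event names and 
types (no such gesture protocol exists):

#include <stdbool.h>
#include <stdint.h>

enum gesture_type { GESTURE_TAP, GESTURE_SWIPE, GESTURE_PINCH };

struct touch_app {
    bool interprets_touch_itself; /* does its own gesture recognition */
    bool in_gesture;              /* compositor says a gesture is live */
};

static void
touch_motion(struct touch_app *app, int32_t id, double x, double y)
{
    if (app->in_gesture)
        return;    /* consuming the compositor's interpretation instead */
    (void)id; (void)x; (void)y;
    /* raw touch handling ... */
}

static void
gesture_begin(struct touch_app *app, enum gesture_type type)
{
    if (app->interprets_touch_itself)
        return;    /* ignore; keep reading the raw touch events */
    app->in_gesture = true;
    (void)type;
    /* act on the recognized gesture ... */
}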

I think Wayland can even do i18n input methods. The client would still 
get all the keystrokes, but it can ignore them and instead await input 
method events that carry the text that has been entered (an event could 
also replace the previous result, allowing incremental input methods). 
The client can suppress input method popups by sending an "I used this 
key event" message, which would also let clients override global 
shortcuts, instead of the X/Windows approach of requiring the user to 
hold down multiple modifier keys for global shortcuts.
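A rough sketch of both halves of that flow, again with made-up names 
since no such protocol exists. The replace-previous-result behavior is 
what makes incremental input methods work:

#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

struct text_app {
    char text[256];
    size_t committed;    /* length of text that is final so far */
};

/* Compositor -> client (hypothetical event): the input method produced
 * some text.  Each call replaces the previous provisional result. */
static void
im_commit(struct text_app *app, const char *result, bool is_final)
{
    app->text[app->committed] = '\0';    /* roll back the last result */
    strncat(app->text, result, sizeof(app->text) - app->committed - 1);
    if (is_final)
        app->committed = strlen(app->text);    /* lock it in */
}

/* Client -> compositor (hypothetical request): "I used this key
 * event", suppressing the input method for that key; the same message
 * could override a global shortcut. */
void seat_consume_key(uint32_t serial);

static void
raw_key(struct text_app *app, uint32_t serial, uint32_t key,
        bool is_shortcut)
{
    (void)app; (void)key;
    if (is_shortcut)
        seat_consume_key(serial);    /* tell the compositor we took it */
    /* otherwise ignore the raw key and wait for im_commit() */
}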

> On 10 May 2012 17:45, Christopher James Halse Rogers
> <christopher.halse.rogers at canonical.com> wrote:
>> Is Weston going to essentially fold in all the interesting bits from all
>> the input DDXs to the core (and require this to be duplicated in all
>> other compositors), or should these conceptually be modules, eventually
>> crystallising as an input-module ABI compositors could share?
> 
> Kinda.  You really want to share stuff like touchpad processing (like
> how xf86-input-synaptics is a generic evdev processor for touchpads,
> but not half as terrible), but a lot of it involves gesture processing
> and similar, to the point where you'd almost have to build another
> input API.  And that's not really something I want to do.
> 
> Food for thought though.  I'd definitely like to see the input stuff
> in Weston kept as separate as we can make it without bending over
> backwards and causing mass annoyance to everyone, to see what falls
> out.  The long-term goal of sharing this kind of thing with other
> compositors (and even X) is definitely worthwhile.
> 
> Cheers,
> Daniel