[PATCH weston 4/5] evdev: Improve touchpad support and add motion filters

Jonas Ådahl jadahl at gmail.com
Tue May 15 02:00:43 PDT 2012


On Mon, May 14, 2012 at 7:31 PM, Chase Douglas
<chase.douglas at canonical.com> wrote:
> On 05/09/2012 02:31 PM, Jonas Ådahl wrote:
>> Touchpad related code has been rewritten and moved to its own file
>> accessed by evdev via the dispatch interface.
>>
>> The functionality implemented includes anti-jitter filtering (so the
>> pointer doesn't jump around), motion smoothing, touch detection,
>> pointer acceleration and more.
>>
>> Pointer acceleration is implemented as one generic part, and one touch
>> specific part (a profile).
>>
>> Some ideas and magic numbers come from the xserver and
>> xf86-input-synaptics.
>
> Can we move the acceleration code to a library? The reason I ask is that
> we will want acceleration to feel the same whether the display server or
> a client wants to move objects.
>
> In Unity, we have a three finger gesture to move a window around.
> Unfortunately, we can't really replicate the X server's motion because
> of how it is completely intertwined in the X server implementation.
> Thus, our window motion feels completely different to the X server's
> cursor motion. Benjamin Otte has had similar frustrations as he has been
> trying to implement gestures in GTK+.
>
> My suggestion would be to create a separate library that implements the
> pointer acceleration. It would be configurable by querying settings from
> a system location. The user of the library would merely pass in raw
> motion locations and timestamps, and maybe other data like pressure if
> needed, and the library would spit out accelerated locations.
>
> -- Chase

As far as I know, moving windows is not done by the client directly.
The client itself has no idea where on the screen it is. What the
client does is initiate the move; it is then the job of the shell to
do the actual moving, by "attaching" the surface to an input device (a
future wl_pointer?) and moving it with that device's coordinates.
Given that, I don't see the point of splitting the pointer
acceleration code out into a separate library. What is actually
required to implement these gestures? Wouldn't it be better to make
the compositor's input system versatile and powerful enough that one
never needs to emulate pointer acceleration client-side?

Jonas


More information about the wayland-devel mailing list