[PATCH weston 4/5] evdev: Improve touchpad support and add motion filters

Chase Douglas chase.douglas at canonical.com
Tue May 15 08:20:47 PDT 2012


On 05/15/2012 02:00 AM, Jonas Ådahl wrote:
> On Mon, May 14, 2012 at 7:31 PM, Chase Douglas
> <chase.douglas at canonical.com> wrote:
>> On 05/09/2012 02:31 PM, Jonas Ådahl wrote:
>>> Touchpad related code has been rewritten and moved to its own file
>>> accessed by evdev via the dispatch interface.
>>>
>>> The functionality implemented includes anti-flicker (no jumping
>>> around), smoother motion, touch detection, pointer acceleration, and
>>> more.
>>>
>>> Pointer acceleration is implemented as one generic part, and one touch
>>> specific part (a profile).
>>>
>>> Some ideas and magic numbers come from the xserver and
>>> xf86-input-synaptics.
>>
>> Can we move the acceleration code to a library? The reason I ask is that
>> we will want acceleration to feel the same whether the display server or
>> a client wants to move objects.
>>
>> In Unity, we have a three finger gesture to move a window around.
>> Unfortunately, we can't really replicate the X server's motion because
>> of how it is completely intertwined in the X server implementation.
>> Thus, our window motion feels completely different to the X server's
>> cursor motion. Benjamin Otte has had similar frustrations as he has been
>> trying to implement gestures in GTK+.
>>
>> My suggestion would be to create a separate library that implements the
>> pointer acceleration. It would be configurable by querying settings from
>> a system location. The user of the library would merely pass in raw
>> motion locations and timestamps, and maybe other data like pressure if
>> needed, and the library would spit out accelerated locations.
>>
>> -- Chase
> 
> As far as I know, moving windows is not done by the client directly.
> The client itself has no idea where on the screen it is. What the
> client does is initiate the move; it is then the job of the shell to do
> the actual moving by "attaching" the window to an input device (a
> future wl_pointer?) and moving it with the device's coordinates. Seen
> this way, there is no point in splitting the pointer acceleration code
> out into a separate library. What is it that's actually required to
> implement these gestures? Wouldn't it be better to make the input
> system of the compositor versatile and powerful enough that one won't
> need to do things like emulate pointer acceleration client side?

Moving windows around was just an example. This could be handled within
the compositor, which *may* have easy access to the acceleration routines.

The real issue arises when you want object manipulation within an
application. Take smooth scrolling with a trackpad as an example: one
approach would be to use the same acceleration curve for scrolling as
for cursor motion. Since scrolling should be performed by the client
application/toolkit, the client needs access to the acceleration
algorithm.

Further, there's a good argument to be made for splitting it out since
there will not be one single "Wayland" compositor. Why not provide a
library so that all compositors can benefit without reinventing the wheel?

-- Chase
