[PATCH weston 4/5] evdev: Improve touchpad support and add motion filters
chase.douglas at canonical.com
Tue May 15 10:58:05 PDT 2012
On 05/15/2012 10:43 AM, Bill Spitzak wrote:
> It really sounds like the compositor should do this. IMHO it should
> produce *lots* of input events, so these gestures can be recognized.
> It would produce scroll events or whatever at the correct speed (or
> with a 24_8 increment on each) so that the acceleration is exactly the
> same if the client obeys these. Note I don't know much about gestures
> so don't take this as an actual suggestion, just an indication of what
> I am thinking would work.
> The main rule however is that it is trivial for a client to filter
> these events out and get only the raw events. This will allow them to
> do their own gestures or any other input processing. The compositor
> must not "eat" events.
> Even for very old input devices Wayland should produce useful events:
> repeat events for keyboard keys, repeat for mouse buttons (so holding
> down a button widget repeats at the same rate for all apps), a "hover"
> event (or maybe several) so all clients agree when to popup a tooltip.
> An indication on mouse movement and button release as to whether it is
> a "click" or a "drag" so that all clients agree on mouse-shake
> removal. A clear indication of double-click on the second mouse-down
> so all clients agree on double-click speed.
> And there is text input methods: keys would be marked as being eaten
> by a text input method, so the client could ignore them, and then text
> produced by the text input method would be sent as UTF-8 strings,
> along with an indicator if this replaces the previous text (to allow
> composition in-place without the horrible overlays that X tried to
> use). (a problem is that clients need to control whether the input
> method is active and to be able to cancel it for any key, though I
> hope these can be async messages).
> This would make writing simple clients a lot easier, and would make
> all clients agree on responses.
> The problem is that it would require additions to the Wayland protocol
> as new ideas in input processing are invented. However I think as long
> as raw events are available clients can do these themselves until the
> ideas are worked out and it is added to Wayland.
Moving complex input behavior inside the compositor sounds good in
theory. The problem is that there really is no consensus in any window
server implementation, open or closed source, on how to provide a
complete input stack. We're still figuring out the best ways to
implement 2D gestures on a planar surface. Someone is going to want to
send Kinect input events, which may require a completely different way
of interpreting input. It's just not feasible to be forward-looking
enough to cater for all future needs within a stable window server ABI.
Thus, we should push as much as possible from the window server to the
client. Basically, for touch and mouse devices this means anything that
does not involve cursor motion, since the client telling the server when
and where to move the cursor would be bad for latency and performance.
The window server should simply demultiplex "raw" events, and then we
provide standard libraries for clients to use for handling scrolling,
gestures, and other higher-level input behavior.
To give an example of how keeping this stuff in the server can be
problematic, try using momentum scrolling in X.org or OS X Quartz. In X,
if you scroll and alt-tab, the new window will receive the scroll
events. In OS X, if you scroll with a magic mouse and then move the
pointer, the scrolling will be sent to the windows you move over. With a
better protocol, or with some hacks, we could probably fix X, and OS X
could likely be fixed with similar hacks. But this behavior really should
be handled in the client toolkit using a standard implementation.