Some of my thoughts on input for wayland
martyj19 at comcast.net
Mon Jan 24 02:36:46 PST 2011
On 01/23/2011 09:03 PM, Chase Douglas wrote:
> Hi all,
> I haven't been involved in wayland development at all yet :), but I have
> been working on XInput 2.1 multitouch additions and gesture work in
> Ubuntu. I have a few thoughts on how a new input system for wayland
> might work.
> To go along with that, I have no idea if these ideas have been discussed
> before or not, nor whether the wayland architecture would allow them.
> These are just some generic thoughts I've had on an input service.
> First I'd like to address what I think we can learn from X. X11 has a
> core protocol and an XInput extension with two major versions. To
> develop additions to the input system in X you must meet three obligations:
> 1. Develop alongside all the other work going on in X
> 2. Be backwards compatible with the previous input systems
> 3. Be integrated into the same display server source code
> I think we could take a different approach with Wayland: separate input
> from display. What does the input system need from the rest of X?
> Nothing really other than window regions and hierarchy on the screen.
> My proposal would be to create a new input system project (inland?) and
> define a standard of access between wayland and the new input system.
> The access could be provided through shared memory or some other low
> latency IPC. This would allow mixing and matching of display servers and
> input servers, and separate out the development practices and timelines
> for greater flexibility.
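The display/input split proposed above can be pictured as a fixed wire format crossing a low-latency channel. The sketch below is purely illustrative: the field layout, event-kind constants, and helper names are invented, not any real wayland or "inland" protocol.

```python
import struct

# Hypothetical wire format for events crossing the display/input boundary:
# device id, event kind, timestamp, and two 32-bit payload values.
EVENT_FORMAT = "<IIQii"  # little-endian: u32, u32, u64, i32, i32
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)

EV_POINTER_MOTION = 1
EV_TOUCH_DOWN = 2

def pack_event(device_id, ev_kind, timestamp_ns, a, b):
    """Serialize one input event into the fixed-size wire format."""
    return struct.pack(EVENT_FORMAT, device_id, ev_kind, timestamp_ns, a, b)

def unpack_event(buf):
    """Deserialize one event; the display-server side would do this."""
    return struct.unpack(EVENT_FORMAT, buf)

# A shared-memory ring or Unix socket would carry these fixed-size records,
# so either side can be swapped out as long as the format is honored.
wire = pack_event(3, EV_TOUCH_DOWN, 123456789, 640, 480)
print(unpack_event(wire))  # (3, 2, 123456789, 640, 480)
```

The point of the fixed record size is that either end can be replaced independently, which is exactly the mix-and-match property argued for above.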
> Now, this may seem crazy at first. Once a new input system is defined,
> wouldn't we want to standardise upon it and not change it? Hopefully
> yes. But think of wayland itself. It was conceived mostly due to issues
> with X graphics. Imagine if 10 years from now the graphics stack needed
> another rewrite, but the input stack was perfectly fine. We could
> transition to a new graphics server without modifying any application
> input code if the input system was easily transferable.
> There's another issue I've noticed as we've gone through XI 2.1 and
> gesture work: We are building multiple serial event delivery mechanisms.
> Here's an example:
> 1. Touchscreen input comes into X from Linux evdev interface
> 2. XI 2.1 touch events are generated
> 3. uTouch gesture recognizer receives events through passive grab on
> root window
> 4. Gesture recognizer recognizes a gesture
> 5. No client is subscribed for the gesture
> 6. Gesture recognizer relinquishes touch grabs
> 7. Touches propagate through X server
> 8. No other clients are found for touches
> 9. One of the touches is turned into pointer emulation
> 10. Pointer events propagate through X server once more
> We see there are three event delivery mechanisms: gestures, touches, and
> pointer emulation. In each case, we are potentially transforming the raw
> input data into a different format and looking for clients who selected
> for appropriate event types. There's also a defined precedence ordering:
> gestures -> touches -> pointer emulation.
> One could argue whether this is a proper precedence ordering or not, but
> the point is that there is a precedence ordering. In fact, I wonder if a
> better ordering might be:
> gesture grabs -> touch grabs -> pointer grabs -> gesture selections ->
> touch selections -> pointer selections.
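The six-stage precedence ordering suggested above can be modeled as a simple fall-through chain. This is a toy sketch only; the stage names mirror the proposed ordering, but the client-registration model is invented for illustration.

```python
# Each delivery stage is tried in order; the first stage with an
# interested client consumes the event.
PRECEDENCE = [
    "gesture_grab", "touch_grab", "pointer_grab",
    "gesture_selection", "touch_selection", "pointer_selection",
]

def deliver(event, clients):
    """Walk the precedence chain; return the winning (stage, client) or None.

    `clients` maps a stage name to the list of clients registered for it.
    """
    for stage in PRECEDENCE:
        interested = clients.get(stage, [])
        if interested:
            return stage, interested[0]  # first registered client wins
    return None  # fell off the end: nobody wanted the event

# With no gesture client subscribed, a touch falls through to touch grabs,
# much like steps 5-7 of the uTouch example above:
clients = {"touch_grab": ["app-a"], "pointer_selection": ["app-b"]}
print(deliver({"type": "touch"}, clients))  # ('touch_grab', 'app-a')
```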
> As input devices advance into true three-dimensional space, we may find
> a need for even more intricate input service mechanisms. A more future
> proof model may involve the ability to dynamically slot in input systems
> as plugins. In this way, we might also be able to deprecate older input
> protocols over time.
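The plugin idea above might look something like the following sketch, where all class and method names are invented: each plugin declares which raw event kinds it consumes, and the core routes events accordingly, so a protocol can be added or retired without touching the core.

```python
class InputPlugin:
    """Base interface a dynamically slotted input system would implement."""
    name = "base"
    handles = ()  # raw event kinds this plugin consumes

    def process(self, event):
        raise NotImplementedError

class TouchPlugin(InputPlugin):
    name = "touch"
    handles = ("touch_down", "touch_up", "touch_motion")

    def process(self, event):
        return f"touch:{event['kind']}"

class InputCore:
    def __init__(self):
        self._plugins = []

    def register(self, plugin):
        self._plugins.append(plugin)

    def deprecate(self, name):
        """Retire an older input protocol by unplugging its handler."""
        self._plugins = [p for p in self._plugins if p.name != name]

    def route(self, event):
        for plugin in self._plugins:
            if event["kind"] in plugin.handles:
                return plugin.process(event)
        return None  # no plugin claims this event kind

core = InputCore()
core.register(TouchPlugin())
print(core.route({"kind": "touch_down"}))  # touch:touch_down
```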
> Thanks for listening to me on my soapbox! I look forward to your thoughts.
> -- Chase
> wayland-devel mailing list
> wayland-devel at lists.freedesktop.org
You don't appear to be considering the keyboard side of things yet, which is equally important and, since keyboard and pointer interact for accessibility, should be designed together. I would point out that input methods look a lot like pointer gestures, in that several user actions map to one result, and could probably be handled very similarly.
I don't think Kristian has quite the same view of grabs that you do, but I don't want to misrepresent his thinking on it, so he can talk about that himself if he chooses.
I do hope we can do this in a way that doesn't result in yet another overhead process.
I would support any out-of-the-box thinking applied to a clean redesign of anything. I would point out a few input-side areas on my list of what is needed for Wayland to totally replace X:
- Something equivalent to xorg.conf and xorg.conf.d for devices that need some sort of initialization quirk or blacklisting (comes up constantly on the xorg mailing list; must be addressed)
- An input methods replacement: that's half of your user base
- Carrying forward the AccessX mechanisms for keyboard/mouse substitutions
- Something equivalent to passive grabs, so that a process can own a particular action like Volume Up and be assured that there aren't five different processes all trying to control the volume
- I continue to hope that there is a path for the "legacy KeySym" encodings to be replaced by their Unicode equivalents, along with some encoding of the function key space
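The passive-grab item above amounts to exclusive ownership of a named action. A minimal sketch, with an entirely invented API, might be:

```python
class ActionRegistry:
    """First client to claim an action owns it; later claims are refused,
    so only one process ends up controlling, say, the volume."""

    def __init__(self):
        self._owners = {}

    def claim(self, action, client):
        """Grant ownership if the action is unclaimed; True on success."""
        if action in self._owners:
            return False
        self._owners[action] = client
        return True

    def dispatch(self, action):
        """Deliver the action to its sole owner, if any."""
        return self._owners.get(action)

reg = ActionRegistry()
assert reg.claim("XF86AudioRaiseVolume", "volume-daemon")
assert not reg.claim("XF86AudioRaiseVolume", "rogue-applet")
print(reg.dispatch("XF86AudioRaiseVolume"))  # volume-daemon
```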
This might also be an opportune moment to point out that since New Year's Day I have been experimenting with a rewrite of the keyboard compiler, designed the way I think someone who had been chief designer/implementor of five heavily used, production-quality compilers for one of the major manufacturers would design it. Right now it is probably about half done. If brought to fruition, this would allow some streamlining and/or new approaches in the way keyboard layouts are specified and loaded. (Not at all to diminish the good work that Dan Nicholson and others have done on xkbcomp recently.)