[Accessibility] Accessibility features in Wayland - feedback request

Peter Hutterer peter.hutterer at who-t.net
Tue Nov 4 16:25:32 PST 2014


Hi Michael,

On Tue, Nov 04, 2014 at 05:42:04PM +0600, Michael Pozhidaev wrote:
> Sorry for the delay! I am very interested in the accessibility
> features that could be supported by Wayland in the future, but can I learn
> more about where the process currently stands?

That's a bit hard to answer. The general plan for the future is that
we'll get libinput ready. That is then used by the various wayland
compositors, while toolkits are independently updated to use wayland
natively. Over time, the desktop will switch over to wayland.

Much of the accessibility work lies in deciding _where_ to put features and
in remembering to put things in before we declare APIs finished.
fwiw, libinput development happens on the wayland-devel list, feel free to
chime in there with feature requests, or file a bug against libinput in
the freedesktop.org bugzilla (product wayland, component libinput):
https://bugs.freedesktop.org/enter_bug.cgi?product=Wayland

> To start with, I could say that it is usually very useful to be able to
> imitate all types of input events at various levels. In addition, it could
> be very important to be able to listen to all incoming input events.

right, so to give you a quick overview of the stack: libinput sits below the
compositor and doesn't talk wayland at all, that's up to the compositor
itself and the wayland clients (toolkits). There's a high-level outline
here:
http://who-t.blogspot.com.au/2014/09/libinput-common-input-stack-for-wayland.html

Why this matters for your question:
emulating input events from other events _may_ need to be added to libinput
(similar to touchpad tapping or touchpad software buttons). Beyond that the
request is a bit too generic for now; it's hard to design an API based on
"emulate everything" :)
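To make "emulating input events from other events" concrete, here is a minimal sketch of the kind of logic touchpad tapping involves: a quick touch-down/touch-up pair is turned into an emulated button click. This is purely illustrative Python, not libinput code; all names and thresholds are made up for the example.

```python
# Hypothetical tap-to-click emulation. Real libinput implements this in C
# with a proper state machine; the names and thresholds here are invented.

TAP_TIMEOUT = 0.18    # seconds: max touch duration that still counts as a tap
TAP_MAX_MOTION = 3.0  # max accumulated finger travel that still counts as a tap

class TapDetector:
    """Turn a quick touch-down/touch-up pair into an emulated button click."""

    def __init__(self):
        self.down_time = None
        self.motion = 0.0

    def touch_down(self, t):
        self.down_time = t
        self.motion = 0.0

    def touch_motion(self, dx, dy):
        # Track how far the finger moved while down.
        self.motion += abs(dx) + abs(dy)

    def touch_up(self, t):
        """Return an emulated click event name, or None if this wasn't a tap."""
        if self.down_time is None:
            return None
        is_tap = (t - self.down_time) <= TAP_TIMEOUT and self.motion <= TAP_MAX_MOTION
        self.down_time = None
        return "BTN_LEFT_CLICK" if is_tap else None
```

The point is that the emulation consumes low-level events and produces a new, synthesized event; any accessibility feature of this shape would have to live at the same layer.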

Listening to all input events would be a compositor requirement, it may need
some sort of extra protocol. Again, it's quite a generic requirement, some
specific use-cases would make it easier to figure out what exactly is
required.
 
> For example, for the accessibility work I am doing now, it is useful to
> translate touchpad movements not into mouse pointer position changes but
> into emulated arrow-key presses.

Can you expand on the requirements and motivations here?
e.g. from a low-level POV converting touchpad events to key events isn't
ideal - it's hard to do generically, we don't necessarily know the keyboard
layout, etc. OTOH converting movements into semantic gestures is doable,
and can be handled much more flexibly provided the rest of the stack is in
place.
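To illustrate the gesture-based alternative: instead of synthesizing key events directly, the input stack could recognize a semantic gesture from accumulated touchpad motion and leave the mapping (gesture to arrow key, or to anything else) to a higher layer. A minimal sketch, with invented names and thresholds, not any real libinput API:

```python
# Hypothetical swipe recognizer: accumulate touchpad deltas and emit a
# semantic gesture name once movement crosses a threshold. Mapping
# "swipe-left" onto an arrow-key press would then be a separate, layout-aware
# decision made higher up the stack.

SWIPE_THRESHOLD = 50.0  # accumulated motion units before we commit to a gesture

class SwipeRecognizer:
    def __init__(self):
        self.dx = 0.0
        self.dy = 0.0

    def feed(self, dx, dy):
        """Add one motion delta; return a gesture name once the threshold is crossed."""
        self.dx += dx
        self.dy += dy
        if max(abs(self.dx), abs(self.dy)) < SWIPE_THRESHOLD:
            return None
        if abs(self.dx) >= abs(self.dy):
            gesture = "swipe-right" if self.dx > 0 else "swipe-left"
        else:
            gesture = "swipe-down" if self.dy > 0 else "swipe-up"
        self.dx = self.dy = 0.0
        return gesture
```

This keeps the low-level stack free of keyboard-layout knowledge, which is the flexibility argument above.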
So I'd like to know more about the details of what you're trying to do and
why, so we can figure out the best technical solution here.

thanks

Cheers,
   Peter

> So, it is very interesting to learn more about what level of flexibility
> is already planned, and to think a bit about what else could be useful.
> 
> Thank you a lot for this message! Very interesting! :))
> 
> Peter Hutterer writes:
> 
> > Hi list,
> >
> > As you may be aware, the input stack in Wayland is quite different to the
> > one in X. Most notably, we're working on libinput as a shared input stack
> > to be used between compositors. Most of the baseline is done; one of the things
> > we're now looking into is accessibility, but this is where we lack feedback
> > to decide what needs to be done and how.
> >
> > If you or someone you know require specific accessibility features, please
> > let me know, I'd like to discuss with you why you need it, how that need is
> > currently being addressed and whether we can improve on that. I'm
> > specifically looking to get a grasp on the various requirements within the
> > input stack, so anything from braille input devices to debouncing keys to
> > button locking.
> >
> > Happy to have the discussion off-list if you prefer to stay more anonymous.
> >
> > Cheers,
> >    Peter
