[Accessibility] Accessibility features in Wayland - feedback request

Peter Hutterer peter.hutterer at who-t.net
Mon Feb 11 00:39:27 UTC 2019


On Sat, Feb 09, 2019 at 12:05:06PM +0100, Samuel Thibault wrote:
> Hello,
> 
> Peter Hutterer, Fri Oct 31 2014 10:24:37 +1000, wrote:
> > As you may be aware, the input stack in Wayland is quite different from
> > the one in X. Most notably, we're working on libinput as a shared input
> > stack to be used between compositors. Most of the baseline is done; one
> > of the things we're now looking into is accessibility, but this is where
> > we lack feedback to decide what needs to be done and how.
> > 
> > If you or someone you know requires specific accessibility features,
> > please let me know; I'd like to discuss with you why you need it, how
> > that need is currently being addressed, and whether we can improve on
> > it. I'm specifically looking to get a grasp on the various requirements
> > within the input stack, so anything from braille input devices to
> > debouncing keys to button locking.
> 
> Unfortunately this doesn't seem to have gotten answers at the time :/
> 
> All I know of I had written on
> 
> https://www.freedesktop.org/wiki/Accessibility/Input/?updated
> 
> AIUI, putting this generically, we need accessibility tools to be able
> to
> 
> - steal some input events, either globally or for a given device.
> 
>   For instance, Orca wants to globally catch presses on the capslock key
>   on any keyboard, to use it as an "orca key" for its own shortcuts. It
>   also needs to be able to synthesize capslock presses to get capslock
>   behavior when double-pressing capslock.

This largely seems in line with the general "we need hotkeys to work"
requirement in the Wayland stack.
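To make the capslock case concrete: a consumer of those stolen events has to
disambiguate a single press (the Orca modifier) from a double press (a real
CapsLock toggle to be re-synthesized). A minimal sketch of that timing logic
in Python — the class name and the 0.5 s window are illustrative, not
anything Orca actually uses:

```python
DOUBLE_PRESS_WINDOW = 0.5  # seconds; hypothetical threshold, tune to taste

class OrcaKeyFilter:
    """Classify CapsLock presses: a lone press is consumed as the Orca
    modifier, a double press means 'synthesize a real CapsLock toggle'."""

    def __init__(self, window=DOUBLE_PRESS_WINDOW):
        self.window = window
        self.last_press = None  # timestamp of the previous CapsLock press

    def on_capslock_press(self, now):
        """Return 'toggle' for a double press, 'orca' for a single press."""
        if self.last_press is not None and now - self.last_press <= self.window:
            self.last_press = None  # the pair has been consumed
            return 'toggle'
        self.last_press = now
        return 'orca'
```

Whatever component actually steals the key events (compositor or AT tool)
would feed press timestamps into something like this and either swallow the
event or synthesize the toggle.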
 
>   Another instance is using a given touchpad as a positioning device,
>   which requires consuming all input events from it, and getting
>   absolute coordinates.

Can you expand on this bit, please?

At some point I had a libinput branch that enabled the touchpad to be used
as a direct input device for drawing e.g. Chinese characters. There are
quite a few corner cases with that approach that receivers would have to
deal with, though; you almost end up with a mini-libinput in the application.
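For context, the core of "touchpad as positioning device" is mapping the
device's absolute axis ranges onto screen coordinates. A hedged sketch of
that mapping — the clamping behaviour and parameter names are assumptions
for illustration; real hardware also needs calibration, touch tracking, and
the corner-case handling mentioned above:

```python
def touchpad_to_screen(x, y, pad_min_x, pad_max_x, pad_min_y, pad_max_y,
                       screen_w, screen_h):
    """Linearly map an absolute touchpad coordinate into screen space."""
    nx = (x - pad_min_x) / (pad_max_x - pad_min_x)
    ny = (y - pad_min_y) / (pad_max_y - pad_min_y)
    # Clamp: real hardware can report values slightly outside its
    # advertised axis range.
    nx = min(max(nx, 0.0), 1.0)
    ny = min(max(ny, 0.0), 1.0)
    return (nx * (screen_w - 1), ny * (screen_h - 1))
```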


> - synthesize input events.
> 
>   As mentioned above for Orca, but also for various kinds of Assistive
>   Technologies which provide alternative ways of typing on the keyboard,
>   of moving the mouse etc.
> 
> Implementing these kinds of features within libinput itself could make
> sense, but in practice I don't think people will manage to do it (I
> guess libinput is C only? Most AT tools nowadays are pythonic), while an
> interface to steal/inject events would allow developing them on the
> side.

libinput is written in C, correct. However, the primary purpose of libinput
is to act as a hardware abstraction for compositors, and in Wayland the
compositor has full control over input events. So any injection interface is
IMO better implemented as a standardised D-Bus interface that the compositor
exposes. This gives the compositor control over the interfaces allowed (and
the context), etc. Much better than trying to figure out a back channel
through libinput.
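As a rough illustration of what "the compositor stays in control" could mean
for such a D-Bus interface, here is a hypothetical policy check a compositor
might run before honouring an injection request. The bus name, event-type
set, and lock-screen rule are all invented for the example, not any real
protocol:

```python
# Assumed allowlist of D-Bus well-known names permitted to inject events.
ALLOWED_SENDERS = {"org.a11y.Orca"}
ALLOWED_EVENT_TYPES = {"key", "pointer-motion", "pointer-button"}

def allow_injection(sender, event_type, screen_locked):
    """Refuse unknown callers, unknown event types, and any injection
    while the session is locked; the compositor decides, not libinput."""
    if screen_locked:
        return False
    if sender not in ALLOWED_SENDERS:
        return False
    return event_type in ALLOWED_EVENT_TYPES
```

The point is architectural: because the request arrives over D-Bus, the
compositor can attach policy like this per caller and per context, which a
back channel through libinput could not do.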

Cheers,
   Peter

