[Accessibility] Accessibility features in Wayland - feedback request

Peter Hutterer peter.hutterer at who-t.net
Mon Feb 11 01:29:09 UTC 2019


On Mon, Feb 11, 2019 at 01:57:16AM +0100, Samuel Thibault wrote:
> Peter Hutterer, on Mon, Feb 11 2019 at 10:39:27 +1000, wrote:
> > On Sat, Feb 09, 2019 at 12:05:06PM +0100, Samuel Thibault wrote:
> > >   Another instance is using a given touchpad as a positioning device,
> > >   which requires consuming all input events from it, and getting
> > >   absolute coordinates.
> > 
> > can you expand on this bit please? 
> 
> In the Hypra company (hypra.fr) we are working on a project of using
> just a couple of braille cells to be read by one hand, and to use the
> other hand to point with a touchpad which part of the screen should be
> shown on the cells. To be able to find one's way, that positioning needs
> to be absolute, just like with tablets (but much more widely available
> than a tablet). Put another way, moving the finger at the top left
> corner of the touchpad would make the cells show the first menu of the
> window, which is at the top left part of the screen.
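
[For concreteness, the absolute positioning described above is just a
proportional rescale of touchpad surface coordinates to screen
coordinates. A minimal sketch, with made-up pad/screen dimensions and a
hypothetical function name:]

```python
def touchpad_to_screen(x_mm, y_mm, pad_w_mm, pad_h_mm, screen_w_px, screen_h_px):
    """Map an absolute touchpad position (in mm from the top-left corner)
    to an absolute screen position (in pixels).

    A finger at the top-left of the pad lands on the top-left of the
    screen, bottom-right on bottom-right, and so on, just like a tablet
    in absolute mode.
    """
    sx = x_mm / pad_w_mm * screen_w_px
    sy = y_mm / pad_h_mm * screen_h_px
    # Clamp in case the reported position slightly overshoots the pad edges.
    return (min(max(sx, 0), screen_w_px), min(max(sy, 0), screen_h_px))

# A touch in the top-left corner of the pad maps to the top-left of the
# screen (where the first menu of a window typically is):
print(touchpad_to_screen(0.0, 0.0, 100.0, 60.0, 1920, 1080))    # (0.0, 0.0)
print(touchpad_to_screen(50.0, 30.0, 100.0, 60.0, 1920, 1080))  # (960.0, 540.0)
```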

Thanks, this sounds like quite an interesting project. But I do need to
point to this one here:
https://wayland.freedesktop.org/libinput/doc/latest/what-is-libinput.html
specifically the bit "libinput is boring. It does not intend to break new
grounds on how devices are handled." 

For accessibility issues the rules are harder to pin down than for a random
project, but here too we are going to run into some issues.

This is a nontrivial technical solution that requires updates to libinput,
the compositors, and the client stacks, so we'll need buy-in from all of
them before it's worth getting started. This never really happened with
the aforementioned Chinese character input bits (because everyone is ETIME),
so that effort stalled. Unfortunately, the same is likely to happen here
unless you have time allocated to work on all these. And lots of it...
 
> > > - synthesize input events.
> > > 
> > >   As mentioned above for Orca, but also for various kinds of Assistive
> > >   Technologies which provide alternative ways of typing on the keyboard,
> > >   of moving the mouse etc.
> > > 
> > > Implementing these kinds of features within libinput itself could make
> > > sense, but in practice I don't think people will manage to do it (I
> > > guess libinput is C only? Most AT tools nowadays are pythonic), while an
> > > interface to steal/inject events would allow developing them on the
> > > side.
> > 
> > libinput is written in C, correct. However, the primary reason for libinput
> > is as a hw abstraction for compositors and in Wayland the compositor has
> > full control over input events. So any injection interface is IMO better
> > implemented as some standardised DBus interface that the compositor exposes.
> > This gives the compositor control over the interfaces allowed (and the
> > context), etc. Much better than trying to figure out a back channel through
> > libinput.
> 
> But then it'd need to be implemented by all compositors, right?

Yes, but the effort to do that is likely less than putting it into libinput,
especially in the long term, when we'd start seeing the headaches caused by
having it in the wrong part of the stack.
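
As a rough illustration of the shape such a compositor-exposed interface
could take (all names below - the interface name, methods, and argument
types - are hypothetical, not an existing protocol):

```python
# Hypothetical D-Bus introspection XML for a compositor-exposed input
# injection interface. None of these names exist today; they only sketch
# what a standardised injection interface might look like.
INJECTION_INTERFACE_XML = """
<node>
  <interface name="org.example.Compositor.InputInjection">
    <!-- Synthesize a key press/release (evdev keycode). -->
    <method name="InjectKey">
      <arg type="u" name="keycode" direction="in"/>
      <arg type="b" name="pressed" direction="in"/>
    </method>
    <!-- Warp the pointer to an absolute screen position. -->
    <method name="InjectPointerMotionAbsolute">
      <arg type="d" name="x" direction="in"/>
      <arg type="d" name="y" direction="in"/>
    </method>
  </interface>
</node>
"""

# Sanity-check the interface shape without needing a running bus.
import xml.etree.ElementTree as ET

def method_names(xml):
    """Return the method names declared in a D-Bus introspection blob."""
    root = ET.fromstring(xml)
    return [m.get("name") for m in root.iter("method")]

print(method_names(INJECTION_INTERFACE_XML))
```

Since the compositor owns the interface, it can decide per-caller and
per-context whether injection is permitted - which is exactly the control
that a libinput back channel could not provide.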

Cheers,
   Peter

