Wayland compositors and accessibility
samuel.thibault at ens-lyon.org
Wed Apr 3 13:00:09 UTC 2019
Olivier Fourdan, on Wed. 03 April 2019 14:39:29 +0200, wrote:
> > What we currently have in Wayland is support for AccessX, directly
> > implemented in mutter. The rest of accessibility features are currently
> > mostly broken.
> Well, instead of "mostly broken", I'd say it's a WIP on the GNOME
> side. To recap the state of the accessibility features in GNOME:
Some of these indeed cover what I mentioned, but they do not cover the
needs of Orca, for instance, and are GNOME-only. I would really not push
for a GNOME-only solution that would leave disabled people with only one
choice of desktop, and unable to use other people's computers, which may
not be running GNOME.
> * For at-spi on Wayland, there are no global coordinates and no plan
> to add them. So, to be able to generate pointer events without
> global coordinates and keyboard events without XTest, for a replacement
> of dogtail which would work with GNOME on Wayland, there is ponytail:
I am aware of this, but I thought this was mostly considered a hack
intended only for testing, not for production use?
Apparently RecordWindow and Introspect are mutter/gnome-shell-specific,
are they supposed to be implemented by other compositors as well?
> > We could think
> > of moving the implementation to a shared library that compositors
> > would use, with the same principle as libinput. Such library could
> > implement all accessibility features that a compositor should implement
> > itself.
> Yet, I think this is quite different from what libinput does.
Sure, the kind of processing is different. I just mean that such a
library could be shared by compositors, just as libinput is.
> These features are very deeply rooted in the event loop of the
> compositor; in the case of mutter, this is clutter.
> The reason is that they need to be able to delay, replace or
> inject both pointer events (mousekeys) and keyboard events (slowkeys,
> stickykeys, bouncekeys).
Sure, that's what I mentioned above.
> It also interferes with the xkb state for modifiers (stickykeys). I
> reckon moving the logic out of the compositor is certainly doable, but
> it would not necessarily make the code much simpler for other compositors...
It seems to me that the current AccessX code is already quite involved
compared to the interface it would require. And there is more that we
would like to add for other accessibility features.
> > It could also provide interfaces for plugging in accessibility
> > features implemented in separate processes (e.g. a screen reader). To
> > avoid keyloggers and the like, such an interface needs to be available
> > only to trusted accessibility processes.
> Those do not need to be Wayland protocols, though; they could be DBUS.
> But I guess we need to define what a "trusted accessibility process"
> is and how the compositor can enforce that.
Yes. AIUI there were already some discussions about this.
> > About the screen reader shortcuts, one issue we have with the current
> > implementation is that for each keyboard event the toolkit has to wait
> > for the screen reader to tell whether it ate the event or not. That
> > delays all keyboard events. This should probably be replaced
> > by a registration mechanism: the screen reader tells which shortcuts it
> > is interested in, and only the events for them are captured and provided
> > to the screen reader, asynchronously.
> > Opinions on the whole thing?
> If the screen reader issue is just about keyboard shortcuts, then a
> DBUS interface would be doable.
There are also other use cases mentioned on the wiki page: e.g. a virtual
keyboard needing to inject keypresses, or a screen reader needing
notification of key events and mouse events.
But really, my main concern is that a mutter/gnome-shell-only solution
is not a proper solution; we need to define something that other
compositors can easily implement too.
More information about the wayland-devel mailing list