Wayland compositors and accessibility
samuel.thibault at ens-lyon.org
Wed Apr 3 11:57:28 UTC 2019
I would like to open a discussion on accessibility features on Wayland.
Several things that happened to work on Xorg (sometimes rather by luck)
do not work any more on Wayland, so we need to find proper ways to
implement them on Wayland.
The needs are mostly on the input side. To avoid an XY problem I have
gathered a list of the *end-use* cases on
as well as ideas of how they could be implemented.
Basically, the technical requirements we seem to end up with are:
- some support (such as AccessX) needs to capture some input events
and emit other input events.
- some support (e.g. key feedback) only needs to get notification of
input events (can be asynchronous), both actual keypresses and
composition / input method results.
- some support (e.g. virtual keyboard) only needs to be able to emit
input events.
- some support (screen reader shortcuts) needs to catch presses of a
set of keyboard shortcuts.
Most of these were previously implemented on Xorg by just snooping input
with XEvIE and synthesizing with XTest, and when XEvIE was removed the
snooping was integrated into toolkits themselves. These interfaces are
frowned upon, at least for security reasons, but also for performance
reasons. Some of the features can be implemented in the compositor
(e.g. AccessX is already implemented that way); some, however, probably
should not be (e.g. a screen reader).
What we currently have in Wayland is support for AccessX, directly
implemented in mutter. The rest of the accessibility features are
currently not working.
Thinking about the implementation of AccessX support: it currently
exists only in mutter, not in other compositors, which is no good
for having accessibility supported on all systems. We could think
of moving the implementation to a shared library that compositors
would use, on the same principle as libinput. Such a library could
implement all the accessibility features that a compositor should
implement itself. It could also provide interfaces for plugging in
accessibility features implemented in separate processes (e.g. a screen
reader). To avoid keyloggers and the like, such an interface must be
available only to trusted accessibility processes.
About the screen reader shortcuts, one issue we have with the current
implementation is that, for each keyboard event, the toolkit has to wait
for the screen reader to say whether it consumed the event. That delays
every keyboard event. This should probably be replaced
by a registration mechanism: the screen reader tells which shortcuts it
is interested in, and only the events for them are captured and provided
to the screen reader, asynchronously.
Opinions on the whole thing?