Wayland compositors and accessibility

Olivier Fourdan fourdan at gmail.com
Wed Apr 3 12:39:29 UTC 2019

Hi Samuel,

On Wed, 3 Apr 2019 at 13:57, Samuel Thibault
<samuel.thibault at ens-lyon.org> wrote:
> I would like to open a discussion on accessibility features on Wayland.
> Several things that happened to work on Xorg (sometimes rather by luck)
> do not work any more on Wayland, so we need to find proper ways to
> implement them on Wayland.
> The needs are mostly on the input side. To avoid an XY problem I have
> gathered a list of the *end-use* cases on
> https://www.freedesktop.org/wiki/Accessibility/Input/
> as well as ideas of how they could be implemented.
> Basically, the technical requirements we seem to end up with is:
> - some support (such as AccessX) needs to capture some input events
>   and emit other input events.
> - some support (e.g. key feedback) only needs to get notification of
>   input events (can be asynchronous), both actual keypresses and
>   composition / input method results.
> - some support (e.g. virtual keyboard) only needs to be able to emit
>   input events.
> - some support (screen reader shortcuts) needs to catch presses of a
>   set of keyboard shortcuts.
> Most of these were previously implemented on Xorg by just snooping input
> with XEvIE and synthesizing with XTest, and when XEvIE was removed the
> snooping was integrated into toolkits themselves. These interfaces are
> frowned upon at least for security reasons, but also some performance
> reasons.
> Some of the features can be implemented in the compositor (e.g. AccessX
> is already implemented that way), some of these however should probably
> not (e.g. a screen reader).
> What we currently have in Wayland is support for AccessX, directly
> implemented in mutter.  The rest of accessibility features are currently
> mostly broken.

Well, instead of "mostly broken", I'd say it's a work in progress on the
GNOME side. To recap the state of the accessibility features in GNOME:

 * Keyboard accessibility is implemented in mutter (you mentioned it).
That includes the AccessX features previously found in the X server, as
well as mousekeys.

 * The mouse accessibility features from GNOME mousetweaks (aka "Click
Assist") are being worked on; see these MRs:


 * locate-pointer is also being worked on (though I'm not sure
locate-pointer qualifies as a real accessibility feature):


 * For at-spi on Wayland, there are no global coordinates and no plan
to add them. So, to be able to generate pointer events without global
coordinates and keyboard events without XTest, as a replacement for
dogtail that works with GNOME on Wayland, there is ponytail:


  Maybe that could be useful for accessibility as well?

> Thinking of the implementation of AccessX support, it is currently
> only within mutter, and not in other compositors, which is no good
> for having accessibility supported on all systems.

That's part of the problem with Wayland: previously we had a single
implementation of the display server, Xorg, and all window managers
and desktops would inherit the same features provided by Xorg, whereas
now every Wayland compositor is a display server of its own.

That's why we have helper libs such as libweston and wlroots for example.

> We could think
> of moving the implementation to a shared library that compositors
> would use, with the same principle as libinput.  Such library could
> implement all accessibility features that a compositor should implement
> itself.

Yet, I think this is quite different from what libinput does. These
features are very deeply rooted in the event loop of the compositor;
in the case of mutter, that is Clutter.

The reason is that it needs to be able to delay, replace, or inject
both pointer events (mousekeys) and keyboard events (slowkeys,
stickykeys, bouncekeys).

It also interferes with the xkb modifier state (stickykeys). I reckon
moving the logic out of the compositor is certainly doable, but it
would not necessarily make the code much simpler for other
compositors...

> It could also provide interfaces for plugging in accessibility
> features implemented in separate processes (e.g. a screen reader).  To
> avoid keyloggers and such, such an interface needs to be available
> only to trusted accessibility processes.

Those don't need to be Wayland protocols, though; they could be D-Bus
interfaces.

But I guess we need to define what a "trusted accessibility process"
is, and how the compositor can enforce that.

> About the screen reader shortcuts, one issue we have with the current
> implementation is that for each keyboard event the toolkit has to wait
> for the screen reader to tell whether it ate the event or not. That
> makes all keyboard events get delayed. This should probably be replaced
> by a registration mechanism: the screen reader tells which shortcuts it
> is interested in, and only the events for them are captured and provided
> to the screen reader, asynchronously.
> Opinions on the whole thing?

If the screen reader issue is just about keyboard shortcuts, then a
D-Bus interface would be doable.


More information about the wayland-devel mailing list