[Accessibility] Need to be able to register for key events globally
matthias.clasen at gmail.com
Wed Dec 11 18:41:53 PST 2013
On Tue, Dec 10, 2013 at 3:29 PM, Piñeiro <apinheiro at igalia.com> wrote:
> GNOME Assistive Technologies need to be able to listen to key events
> globally and have the possibility of consuming them. Example use cases:
> * Orca's presentation of navigation (Events not consumed)
> - Right Arrow: Speak character moved to (right of cursor)
> - Shift Right Arrow: Speak character selected (left of cursor)
> - Down Arrow: Speak next line
> - Etc.
> * Orca's navigational commands (Events consumed)
> - H/Shift + H: Move amongst headings
> - NumPad 8: Speak the current line
> - NumPad 5: Speak the current word
> - NumPad 2: Speak the current character
> - Etc.
> Current solution: The Orca screen reader calls AT-SPI2's
> atspi_register_keystroke_listener(). AT-SPI2 then notifies Orca of key
> events it receives from the toolkit implementation of ATK method
> atk_add_key_event_listener(). Applications then have to wait for Orca
> to consume the event or not. This requires two D-Bus messages. Toolkit
> authors want to abolish this. That's fine, *if* we have an
> alternative. Do we?
As Bill says, input methods already have a private protocol for
intercepting and processing input events on the server side, and a similar
facility could be added to the private protocol for ATs. And again, having
at-spi use that private protocol and then offer key snooping to
everybody over D-Bus would negate an advantage of Wayland, so the user of
the private protocol should be the actual AT, not some multiplexing
intermediary like at-spi.
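For concreteness, a compositor-side facility of that shape might look
roughly like the following Wayland protocol XML. This is purely a
hypothetical sketch (no such protocol exists); every interface, event,
and request name here is invented to illustrate the idea of a single AT
snooping keys directly from the compositor:

```xml
<!-- Hypothetical sketch only: illustrates the shape a private,
     compositor-side key-snooping interface for a single AT could take. -->
<protocol name="at_key_snooping_v1">
  <interface name="at_key_snooper" version="1">
    <!-- The compositor offers each key event to the AT before
         delivering it to the focused client. -->
    <event name="key">
      <arg name="serial" type="uint"/>
      <arg name="time" type="uint"/>
      <arg name="key" type="uint"/>
      <arg name="state" type="uint"/>
    </event>
    <!-- The AT answers each event: consumed events are dropped,
         released events continue on to the focused client. -->
    <request name="consume">
      <arg name="serial" type="uint"/>
    </request>
    <request name="release">
      <arg name="serial" type="uint"/>
    </request>
  </interface>
</protocol>
```

Keeping this interface private to the one bound AT, rather than
re-exporting it over D-Bus, is exactly the point made above: otherwise
every D-Bus client could snoop keys, which Wayland set out to prevent.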