Keysym event in the text protocol

Pekka Paalanen ppaalanen at gmail.com
Fri Jul 25 06:06:50 PDT 2014


On Wed, 23 Jul 2014 09:46:16 +0700
Trung Ngo <ndtrung4419 at gmail.com> wrote:

> Hi guys,
> 
> In the text protocol, there is a `keysym` event (and a corresponding 
> `keysym` request in the input-method protocol). In the spec, it is used 
> to 'notify when a key event was sent.' If I understand correctly then 
> the whole point of this request/event pair is to fake a key press from 
> the input method. If so, shouldn't it make more sense to intercept the 
> keysym request at the compositor and send a key press event to the text 
> application instead of passing the keysym event to the text application 
> (no more keysym event in the text protocol)?
> 
> In the current design, the text application has to listen to the keysym 
> event (for fake keys) and implement the key handler (for 'normal' keys) 
> at the same time, potentially duplicating code and opening up the 
> possibility that some applications forget to implement the keysym event 
> handler.

I'm no expert on input, but that is all deliberate.

"Normal keys" are direct user actions, a person pressing a key on a
keyboard. These originate from a real physical action, specifically a
key press/release. These use key codes, which clients interpret through
libxkbcommon, taking into account the active keymap and related state.
If the keymap does not provide some keysym, you cannot input it that
way, AFAIU.

Input methods, however, use whatever (possibly complicated) UI lets the
user choose which symbol to enter. For instance, it could be a virtual
keyboard with all kinds of exotic symbols, poked with a finger. There
are no physical keys to press. Or it could be just the "compose key"
mechanism. Or a machine vision system interpreting sign language.

These are fundamentally very different kinds of input. We have a strict
separation in the protocol design between actual physical user actions
(e.g. poking a button) and "fake" events, and we intend to keep these
two separated all the way. For example, wl_pointer.motion is sent only
when the user moves the pointer, not when e.g. the window moves under
the pointer or the pointer gets warped programmatically.

The burden of implementation is in toolkits, not applications.


Thanks,
pq
