[RFC] Global binding

Quentin Glidic sardemff7+wayland at sardemff7.net
Mon Dec 30 18:34:05 PST 2013

On 31/12/2013 02:02, Bill Spitzak wrote:
> Quentin Glidic wrote:
>> I use a mouse button as my Push-to-Talk button. This button is
>> detected as “History back” by my web browser, and *I want to have
>> both working.*
> I think you are confusing what I was complaining about, which was
> the idea that there is some difference between your bind "actions"
> and the events sent from the compositor to clients. Basically, if
> the compositor is going to do a lot of decoding, then the decoded
> result should be sent to the clients in events.

There is no difference, but these events are *specific* and the client
knows them to be global actions.

> For your request, I think whether a binding "eats" the event can be
> a fixed setting for each gesture, i.e. mouse buttons and modifier
> keys are not eaten, all keys that produce text are eaten, etc.

Buttons and keys are *the same* here; I do not want any of them to be
eaten. There is a real-life example of that: a World of Warcraft AddOn
that monitors your Push-to-Talk key to send your teammates your talking
status.
It may be a *setting*, so that the user is in charge. It may default to
“eat event”. But nothing should assume that the event is eaten.
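
A per-binding setting could look like this; a minimal sketch in C,
assuming a purely hypothetical global-binding protocol
(binding_manager_bind(), the eat flag and every other name below are
invented for illustration, not an existing Wayland API):

    /* Hypothetical API, for illustration only: register a global
     * binding and state explicitly whether the compositor should eat
     * the event or also deliver it to the focused client. */
    struct binding *ptt;

    ptt = binding_manager_bind(manager, "push-to-talk",
                               BTN_SIDE, BINDING_DO_NOT_EAT);
    binding_add_listener(ptt, &push_to_talk_listener, NULL);

    /* With BINDING_DO_NOT_EAT, the same button still reaches the
     * focused browser, which maps it to "History back": both the
     * global action and the client shortcut keep working. */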

> I agree that translating "gestures" should be done by the
> compositor.
> But I would like to see the simplest gestures attacked first: the
> "gesture" of pushing a button with the text "FOOBAR" on it should
> send an event containing the text "FOOBAR" to the client!

I strongly disagree. As I said, I do not want actions to be assumed
eaten, and just sending the key name to clients is a security issue here.

> Or just maybe all the gestures should be done by the client. That
> would be consistent at least. I would prefer that the compositor
> does it all however.

I do think the compositor should handle them. What is more, the
compositor is likely to use some of them for its own features.

>>> Events should certainly be decoded to something more like the
>>> xkeysym level
>> AFAIU, the decoding is done by libxkbcommon and it was designed so
>> that clients would have to support that explicitly. This has
>> nothing to do with global bindings.
> I think that is a serious mistake. The biggest problem I have with
> remote X (both NX and normal $DISPLAY) is bungled scancodes, because
> the local X server has to translate key events backwards into phony
> scancodes. When I log in remotely, I always have to run xmodmap with
> a special table to fix this. This is obviously a buggy hack and
> Wayland should fix it.

And I am just saying what I understood on this point. I have no opinion
on it.
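
For reference, the client-side decoding mentioned above looks roughly
like this: the compositor delivers a keymap and raw evdev keycodes,
and the client turns them into keysyms through libxkbcommon itself. A
minimal sketch (error handling omitted; in a real client the keymap
string is mmap()ed from the fd carried by the wl_keyboard.keymap
event):

    #include <stdint.h>
    #include <stdio.h>
    #include <xkbcommon/xkbcommon.h>

    static xkb_keysym_t
    decode_key(struct xkb_context *ctx, const char *keymap_string,
               uint32_t evdev_keycode)
    {
        struct xkb_keymap *keymap = xkb_keymap_new_from_string(
            ctx, keymap_string, XKB_KEYMAP_FORMAT_TEXT_V1,
            XKB_KEYMAP_COMPILE_NO_FLAGS);
        struct xkb_state *state = xkb_state_new(keymap);

        /* Wayland key events carry evdev keycodes; xkb keycodes
         * are offset by 8. */
        xkb_keysym_t sym =
            xkb_state_key_get_one_sym(state, evdev_keycode + 8);

        char name[64];
        xkb_keysym_get_name(sym, name, sizeof name);
        printf("keysym: %s\n", name);

        xkb_state_unref(state);
        xkb_keymap_unref(keymap);
        return sym;
    }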

>> Please note that music and video players are not currently using
>> multimedia keys directly: mplayer and VLC are using the space bar
>> as play/pause (while focused, of course), and VLC uses the mouse
>> wheel to control volume. This is a good design, to keep the global
>> keys usable at the system level (e.g. controlling the PCM volume).
> I want the multimedia app to be able to decide whether the volume
> up/down adjusts its volume or the global one. What I proposed was
> that all events go to the clients and they can say they did not
> handle them (thus the volume buttons go to the focused media player,
> but adjust the global volume if the focused app ignores them).

And I do not want the system behaviour to depend heavily on some random
application behaving correctly.

> I think there is dislike for this idea, so I now propose that it be
> moved to the binding api.

I do not understand this sentence at all, sorry.

>> That would require a round trip, and that would be wrong in some
>> (many?) cases: I have two music players A and B, I focused B last,
>> why would A have to decline the action if it can handle that
>> perfectly, since it does not have the knowledge that B exists at
>> all?
> It would be sent to B first (as you proposed, they are sent to the
> last-focused client). The only change I was making is that B could
> say "I did not use it" and then (perhaps) A gets it.

So, player B does not want to handle that, and keeps playing its music.
The compositor then sends the action event to player A, which starts
playing music. You now have two applications playing music because B is
misbehaving. I think it is just plain wrong.
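
To make the failure mode concrete, here is a sketch under the same
hypothetical protocol as above (binding_decline() and the listener are
invented names): with a decline-and-forward model, one misbehaving
handler is enough to produce the double playback, which cannot happen
if the action is only ever delivered to the last-focused player.

    /* Player B, misbehaving: it reacts to the action (keeps playing)
     * but still tells the compositor it did not handle it. */
    static void
    on_action(void *data, struct binding *binding, uint32_t serial)
    {
        toggle_playback();       /* B starts or keeps playing...    */
        binding_decline(binding, /* ...yet declines the event, so   */
                        serial); /* the compositor forwards it on   */
    }

    /* The compositor, trusting the decline, forwards the action to
     * player A, which also starts playing: two players at once. */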


Quentin “Sardem FF7” Glidic
