[RFC wayland-protocols] inputfd - direct input access protocol

Daniel Stone daniel at fooishbar.org
Wed Apr 5 09:25:17 UTC 2017


Hi,

On 5 April 2017 at 04:40, Carsten Haitzler <raster at rasterman.com> wrote:
> On Tue, 04 Apr 2017 14:45:13 +0200 Bastien Nocera <hadess at hadess.net> said:
>> There's two possible solutions to this problem:
>> - evdev gives you the ability to mask certain events. The compositor
>> can keep one fd open masking everything but the power/menu/etc.
>> buttons, and pass an fd to the app with just the power/menu/etc.
>> buttons masked. The compositor can then choose to do something special
>> with those buttons
>> - you can specify a new fd_type, as mentioned in the spec, which your
>> application would need to know how to handle. That can be used to
>> implement the simpler protocol that got sent a couple of months ago.
>
> then explain to me the argument where for example keyboard input should NOT
> also then do the same thing?

Because, as stated upthread, keyboard and mouse devices are bound up
with desktop interaction. That is why it is essential for the
compositor to intercept _all_ keyboard and mouse events (any keyboard
event could be Alt-Tab, any mouse event could be moving out of focus,
etc.: you cannot filter individual events), which does not hold true
for gamepad/joystick events, where a filter suffices.
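
For reference, the per-fd masking Bastien mentions above is, as far as
I know, the EVIOCSMASK ioctl available since Linux 4.4. A rough,
untested sketch of what the compositor side of that first option might
look like; the helper name and the choice of BTN_MODE as the reserved
"system" button are purely illustrative, and note the mask travels
with the open file description and is advisory rather than a security
boundary (the receiving client could change it again):

  /* Sketch: open a gamepad evdev node and mask out the button the
   * compositor wants to keep for itself, before passing the fd on. */
  #include <fcntl.h>
  #include <stdint.h>
  #include <string.h>
  #include <unistd.h>
  #include <sys/ioctl.h>
  #include <linux/input.h>

  #define BITS_PER_LONG (8 * sizeof(unsigned long))
  #define NLONGS(x)     (((x) + BITS_PER_LONG - 1) / BITS_PER_LONG)

  static int open_masked_gamepad(const char *path)
  {
      int fd = open(path, O_RDWR | O_NONBLOCK | O_CLOEXEC);
      if (fd < 0)
          return -1;

      /* Allow every EV_KEY code... */
      unsigned long keys[NLONGS(KEY_CNT)];
      memset(keys, 0xff, sizeof(keys));

      /* ...except the guide/system button, which the compositor keeps. */
      keys[BTN_MODE / BITS_PER_LONG] &= ~(1UL << (BTN_MODE % BITS_PER_LONG));

      struct input_mask mask = {
          .type       = EV_KEY,
          .codes_size = sizeof(keys),
          .codes_ptr  = (__u64)(uintptr_t)keys,
      };
      if (ioctl(fd, EVIOCSMASK, &mask) < 0) {
          close(fd);
          return -1;
      }
      return fd; /* this fd is what gets handed to the client */
  }

The compositor's own fd would use the inverse mask, so it only ever
sees the reserved buttons.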

Secondly, keyboard and mouse events work completely differently. Mouse
events have acceleration applied, and the client's view of
acceleration _must_ match the compositor's in order to arrive at the
same view of position. I don't much feel like encoding an acceleration
algorithm in the protocol for all time, just because it seemed like a
good idea from 50,000ft.

Keyboard events carry state which is longer-lived than focus. For
example, press caps lock and change your focus. (Please no-one smartly
chime in about ctrl:nocaps, which FTR I also use.)

The compositor _must_ interpose every single keyboard/mouse event, and
they are simple enough that it is possible to encode them easily with
universally-accepted concepts. Neither is true of gamepads or
joysticks. Hence, a different protocol.

>> > if that's
>> > good enough then why don't we do this for kbd and mouse too? pass
>> > fd's and fake
>> > a kbd or mouse device from the compositor if we want to filter...
>> > (yes yes i
>> > know about mouse coordinate remapping...).
>>
>> A couple of reasons. The devices are probably "simpler", or at least
>> protocol developers (be it for X11 or Wayland) felt they were.
>
> They are not really much different.

Oh, but they are. It might be instructive to go back several years,
when the input protocols were much simpler and the complexity of
interaction forced us to the protocol we have today. I do not know of
any proposed usecase for gamepads/joysticks where this holds true.

>> The problem space is better understood, and while the kernel takes care
>> of a lot of the differences between devices (at least we don't need to
>> figure out which keycode corresponds to which US layout keysym), there
>> were still tons of problems that we ran into as those devices changed:
>> - we couldn't have more than 256 different keys (minus a bunch of
>> reserved ones) in X, breaking a lot of multimedia keys.
>
> umm actually it was 128. the MSB was press vs release. and having more keys
> this was solved via escapes (send multiple codes in a row with a special
> beginning escape value) from memory.

No, Bastien really does mean 256. Or rather 248: X11 has a fixed
keycode range, for which the minimum is 8 and the maximum is 255,
leaving 248 usable codes; keycodes are not inherently composable, at
the protocol level at least. The AT scancode escapes at the electrical
level are not relevant here.
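
If anyone wants to see the fixed range for themselves, it is right
there in the connection setup; a trivial libxcb snippet, purely for
illustration:

  /* Print the keycode range the X server advertises at setup time;
   * on typical servers this is 8..255, i.e. 248 usable codes, and it
   * cannot grow at runtime. */
  #include <stdio.h>
  #include <xcb/xcb.h>

  int main(void)
  {
      xcb_connection_t *c = xcb_connect(NULL, NULL);
      if (xcb_connection_has_error(c))
          return 1;
      const xcb_setup_t *setup = xcb_get_setup(c);
      printf("keycodes: %u..%u\n",
             (unsigned)setup->min_keycode, (unsigned)setup->max_keycode);
      xcb_disconnect(c);
      return 0;
  }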

>> It really isn't, sorry. In fact, I'm pretty certain that you can
>> implement a gaming console's interface far more easily with this
>> protocol than with the original one.
>
> how so? all a gamepad/joystick etc. protocol has to do is pass on the same data
> as the kernel device. unlike a kernel device it'd at least be portable as a
> different OS may or may not support the same devices. if we're doing the "well
> we'll emulate a linux evdev device then on all platforms" then why not just do
> so with keyboard input too? why not do this everywhere and just have an fd per
> input device rather than wayland protocol.
>
> it's all inconsistent this way. some input is protocol. some is a special
> fd with a device-driver level protocol not a wayland one.
>
> if the kernel devices are so well done to "just expose them" as a fd then why
> not simply transfer that protocol into a wayland protocol event and be done
> with it? at least we're then consistent with the way input events are done.
> that or dump wayland protocol input device events and do them the device fd
> way...

It's inconsistent because the requirements of compositor and client -
i.e. each end of the protocol - are different.

I don't see anything here which would preclude the use of this
protocol for supporting remote/gamepad input. If a compositor's
'internal window' / UI is focused, then it can use the full
gamepad/joystick for navigation as any other client would. If you need
to reserve certain keys for global use (à la the Xbox button), then
you can do that just fine. If you want to use the Konami code
globally, then it's no better or worse with inputfd than explicit
protocol, and it's also not exactly sensible.
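
For concreteness, the client side of "just hand over the fd" really is
just plain evdev. A rough sketch assuming libevdev, where the callback
name and the way the fd arrives are made up for illustration:

  /* Sketch: once some protocol event has delivered the evdev fd, the
   * client drives it exactly like any local device. Assumes the fd is
   * O_NONBLOCK, so the loop drains pending events and returns. */
  #include <stdio.h>
  #include <libevdev/libevdev.h>

  /* Hypothetical: called when the compositor sends us a gamepad fd. */
  static void on_gamepad_fd(int fd)
  {
      struct libevdev *dev = NULL;
      if (libevdev_new_from_fd(fd, &dev) < 0)
          return;

      printf("gamepad: %s\n", libevdev_get_name(dev));

      struct input_event ev;
      while (libevdev_next_event(dev, LIBEVDEV_READ_FLAG_NORMAL, &ev) ==
             LIBEVDEV_READ_STATUS_SUCCESS) {
          if (ev.type == EV_KEY)
              printf("button %u -> %d\n", (unsigned)ev.code, ev.value);
      }

      libevdev_free(dev);
  }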

tl;dr: explicit interposition hurts; we only did it for keyboard/mouse
out of strict necessity; it doesn't seem at all necessary here; we did
it unnecessarily in X11 and it added massive complexity & inertia for
no gain; I like the idea of the protocol as presented (NB: I have not
actually read the XML line by line).

Cheers,
Daniel

