[RFC wayland-protocols] inputfd - direct input access protocol

Bastien Nocera hadess at hadess.net
Tue Apr 4 09:17:54 UTC 2017


Hey Roderick,

On Mon, 2017-04-03 at 16:08 -0700, Roderick Colenbrander wrote:
> Hi Peter,
> 
> Thanks for sharing this proposal. I have some little comments for a
> later stage, but would rather discuss some big items first.
> 
> The feedback I will share comes from a couple different angles as we
> are interested in Wayland for various different use cases.
> 
> Originally there was the gamepad Wayland thread started by Jingkui.
> The proposal there was to provide knowledge of the axes, buttons and
> everything else making up a gamepad to the compositor. He proposed
> this direction as a way to inject events from Chrome into Android,
> as at that level there is no file descriptor anymore. We ourselves
> were quite interested in this approach as well. Among the things we
> do are, roughly, remotely streamed desktops. Currently on X you can
> inject keyboard/mouse events through XTest, and the same applies to
> a custom Wayland compositor into which you can easily inject
> keyboard/mouse data. Clients don't have to be aware of the custom
> virtual devices and protocols behind the scenes.

The main reason why you wouldn't want to use a custom protocol is that
we haven't been able to add support for joypad events in X11 in nearly
30 years, and, on Linux, the "evdev" joystick API offered by the kernel
matches the requirements of application/game developers pretty well.

We could add a layer on top of evdev, but we'd end up copying all of
evdev's features. The reasons why we want an API in Wayland to pass the
file descriptor around (sketched below from the client's side) are:
- we want only the focused app to have access to the joystick, which
makes it easier to have multiple "apps" share the joystick
- we want to be able to discover and receive events when new joysticks
become available, without access to udev or equivalent, as we'd be
sandboxed
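
To make the fd passing concrete, here is a minimal client-side sketch.
The callback names and the shape of the events (focus_in carrying an
already-open fd, focus_out revoking it) are assumptions made up for
illustration, not the actual inputfd interface; the point is only that
the compositor opens the evdev node and hands the fd to whichever
client has focus.

  #include <stdint.h>
  #include <unistd.h>

  struct my_gamepad {
          int evdev_fd;        /* -1 while we do not have focus */
  };

  /* Hypothetical "focus_in" handler: the compositor opened the evdev
   * node on our behalf and passes us the fd. */
  static void handle_focus_in(struct my_gamepad *pad, int32_t fd)
  {
          pad->evdev_fd = fd;  /* read struct input_event from it */
  }

  /* Hypothetical "focus_out" handler: the fd is revoked, so stop
   * using it and close our copy. */
  static void handle_focus_out(struct my_gamepad *pad)
  {
          if (pad->evdev_fd >= 0)
                  close(pad->evdev_fd);
          pad->evdev_fd = -1;
  }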

With that said...

> When using inputfd for such a use case, we would either need to fake
> an input device through uinput (not ideal) or add a new 'fd_type' and
> run a custom protocol across it. Clients or input libraries would
> need to be able to deal with this other protocol, which would be a
> big pain. For such remote desktop use cases, and even the Android
> one, inputfd is not ideal.

I don't quite understand which parts of this system are in your
control, and which ones aren't, what is generic, and what is custom.
I'm guessing that the OS and compositor (Chrome OS, or Android) have
access to the raw device which is available locally, but the (custom)
application would send events through the network to the remote
application. Is this correct?

For Linux desktops, I'd expect toolkits (SDL and GTK+) to be first in
the list of modified code to handle the new protocol. In SDL's case,
this replaces the udev backend code for joysticks.

> 
<snip>
> The current proposal could work for exposing basic gamepad
> functionality for the axes/buttons.

Not having a layer in between the kernel and the app means that adding
new features in the kernel and supporting them in the app requires just
two source trees to be modified. The upstream evdev API *is* the API.
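
As an illustration of "the evdev API is the API": with the fd in hand,
a client just reads struct input_event records from it, with no
translation layer in between. A rough sketch (the fd is assumed to
come from the proposed protocol, or from open("/dev/input/eventN")
when testing outside a sandbox):

  #include <linux/input.h>
  #include <stdio.h>
  #include <unistd.h>

  /* Read raw evdev events from an already-open, blocking fd. */
  static void drain_events(int fd)
  {
          struct input_event ev;

          while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
                  if (ev.type == EV_KEY)
                          printf("button %u -> %d\n",
                                 (unsigned)ev.code, ev.value);
                  else if (ev.type == EV_ABS)
                          printf("axis %u -> %d\n",
                                 (unsigned)ev.code, ev.value);
                  /* EV_SYN marks the end of one batched report */
          }
  }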

>  Taking hid-sony as an example, ds3 and ds4 devices are now treated
> as composite devices utilizing multiple device nodes. The ds4 is
> probably the best example, using 3 evdev nodes: 1 for the gamepad, 1
> for the touchpad (reported as a pointer device) and 1 for the motion
> sensors. I expect the Switch Pro controller and the Joypads for the
> Switch to use at minimum 2 nodes as well.

I recently added support for the PS3 uDraw tablet; it also has a
drawing tablet node on top of the 3 you mentioned above ;)

Peter mentioned device group identifiers as a way to tie those nodes
together.

> How would composite devices be handled? There would need to be a way
> to tie multiple devices together. So far composite devices all use
> the same ids (product/vendor), the same unique id (EVIOCGUNIQ) and
> even the same physical location (EVIOCGPHYS). At minimum I think the
> compositor needs to tie these together with some sort of shared id.

The compositor is also outside a sandbox, and can tie them together as
those devices have the same sysfs parent device.
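
As a sketch of what that grouping could look like on the compositor
side (outside the sandbox), using the identifiers Roderick listed:
query each node's unique id and physical location with the
EVIOCGUNIQ/EVIOCGPHYS ioctls and use them as the shared key. The
helper names are illustrative, not part of the proposal.

  #include <linux/input.h>
  #include <string.h>
  #include <sys/ioctl.h>

  struct node_ids {
          char uniq[64];
          char phys[64];
  };

  /* Query the identifiers of one /dev/input/eventN fd. EVIOCGUNIQ can
   * fail when the device has no unique id, so treat it as optional. */
  static void query_node_ids(int fd, struct node_ids *ids)
  {
          memset(ids, 0, sizeof(*ids));
          ioctl(fd, EVIOCGUNIQ(sizeof(ids->uniq) - 1), ids->uniq);
          ioctl(fd, EVIOCGPHYS(sizeof(ids->phys) - 1), ids->phys);
  }

  /* Two nodes belong to the same physical controller if they share a
   * unique id, falling back to the physical location. */
  static int same_controller(const struct node_ids *a,
                             const struct node_ids *b)
  {
          if (a->uniq[0] && b->uniq[0])
                  return strcmp(a->uniq, b->uniq) == 0;
          return a->phys[0] && strcmp(a->phys, b->phys) == 0;
  }

In practice the compositor could equally key on the shared sysfs
parent device, as mentioned above.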

> Another issue, again taking the ds4 as an example: it has a touchpad,
> which is already picked up as a pointer device elsewhere. How should
> this be handled? Unflag it as a pointer and share its inputfd as
> well, or really use it as a pointer and pass the gamepad state the
> other way?

How do you access it now, and stop the OS from handling it as a system
pointer? Do you set up a grab, and collect the events as passed through
the Wayland protocol, or would you prefer having raw access to the
device node?
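
For reference, the grab mentioned above would be the standard evdev
EVIOCGRAB ioctl; a small sketch, assuming whoever holds the touchpad
fd wants exclusive delivery of its events:

  #include <linux/input.h>
  #include <sys/ioctl.h>

  /* Grab an evdev node so its events are no longer delivered to other
   * readers (e.g. the system pointer handling); pass 0 to release. */
  static int set_evdev_grab(int fd, int grab)
  {
          return ioctl(fd, EVIOCGRAB, grab);
  }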

> In addition, next to the evdev nodes there are sysfs nodes for which
> fd passing won't work. The best example would be the LEDs as found on
> the ds3/ds4/xbox360, but I'm sure there will be GPIO as well. In the
> case of the ds4 it is not uncommon to update the LEDs many times a
> second (often used for light effects). How would such features be
> exposed? Is the idea to leverage properties or something like this?
> Passing these nodes to clients is probably not a good idea.

I'd expect the LEDs to be handled by the OS, or the protocol to be
enhanced to be able to set those.
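
If the OS ends up driving the LEDs itself, that side is just the
standard Linux LED class in sysfs. A sketch, with an illustrative LED
name (the real names depend on the driver, e.g. hid-sony's):

  #include <stdio.h>

  /* Write a brightness value to /sys/class/leds/<name>/brightness.
   * The name passed in is a placeholder, not a real hid-sony LED. */
  static int set_led_brightness(const char *led_name, int value)
  {
          char path[256];
          FILE *f;

          snprintf(path, sizeof(path), "/sys/class/leds/%s/brightness",
                   led_name);
          f = fopen(path, "w");
          if (!f)
                  return -1;
          fprintf(f, "%d\n", value);
          return fclose(f);
  }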

> This is some initial feedback. I'm not sure how much I like the
> inputfd proposal, in part because it doesn't fit some of our use
> cases well (remote desktop + custom protocols). In addition I'm also
> a bit worried about the complexity of handling composite devices.

I think it's actually much better for you. The original protocol was
another protocol on top of the standard one for Linux ("evdev"), and it
crippled the abilities offered by the kernel (no rumble, no touchpad,
no accelerometer, no gyro, no LEDs, no separate rudders, etc.). It
barely handled an "XBox 360" level controller, and was definitely
cutting functionality from even a PS3 controller, let alone a PS4 one.

It also had absolutely no support for handling composite devices. This
proposal was just about good enough to implement the gamepad HTML5 API,
it was in no way good enough to handle all the cases offered by native
games.

Let me know whether I forgot something, or if you have any other
questions.

Cheers

