Re: Re: get input device in wayland

Peter Hutterer peter.hutterer at who-t.net
Fri Nov 24 09:15:33 UTC 2023


On Fri, Nov 24, 2023 at 11:00:34AM +0800, weinan wang wrote:
> Thank you, I think I understand why the wayland-protocol is designed
> the way it is.
> Finally, I would like to ask whether upstream is considering adding an
> input mapping protocol like "xinput map-to-output".
> Tablet, touch screen, and other absolute-device mapping functions
> are currently implemented separately in different compositors;
> kwin, for example, has added related functions via D-Bus

There is no protocol planned, or at least I'm not aware of one; it will
be a compositor-specific configuration only.

It all requires a bit of a mindshift - think of Wayland like HTTP. You
don't configure your browser via HTTP requests from the websites, that
is done through other channels, for a whole bunch of reasons.
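[Editor's note: as an illustration of the "map-to-output" transform discussed above, here is a minimal Python sketch. The output-rectangle tuple is a placeholder of my own, not any compositor's API: a touchscreen reports absolute coordinates, and the compositor scales them into one output's rectangle in global coordinate space.]

```python
def map_to_output(x_norm, y_norm, output_rect):
    """Scale normalized [0, 1] absolute device coordinates into one
    output's rectangle in global compositor space.

    output_rect is a hypothetical (x, y, width, height) tuple; real
    compositors expose this geometry through their own configuration
    channels, not through a Wayland protocol.
    """
    ox, oy, ow, oh = output_rect
    return ox + x_norm * ow, oy + y_norm * oh

# A touch at the center of the touchscreen lands at the center of
# the output it is mapped to, e.g. a 1280x720 output at x=1920:
# map_to_output(0.5, 0.5, (1920, 0, 1280, 720)) -> (2560.0, 360.0)
```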

Cheers,
  Peter

> At 2023-11-24 09:11:34, "Peter Hutterer" <peter.hutterer at who-t.net> wrote:
> >On Wed, Nov 22, 2023 at 06:52:47PM +0800, WeiWei wrote:
> >> Thank you very much, I will discuss this idea with the designer.
> >> 
> >> I'd like to ask why the underlying device is not exposed in the
> >> current wayland-protocols, and what the benefits of this are.
> >
> >The overwhelming majority of applications don't need to know about
> >the specifics of the device, so exposing it is both unnecessary and
> >a potential source of interference if applications try to do things
> >based on assumptions that a compositor may not meet.
> >
> >X did expose the devices, and in addition to making it a nightmare
> >to maintain, the capability is also effectively unused. Plus there's
> >a whole can of niche bugs that can get exposed that way - e.g. if
> >you're listening for button events on a specific device, you can get
> >out of sync with what the compositor treats as the logical button
> >state.
> >
> >> With keyboard and pointer devices we don't usually care about the
> >> physical device, but each touch device works independently and
> >> needs input mapping and calibration. Is it possible to consider
> >> adding some device information to wl_touch in the future?
> >
> >The general consensus is that touch mapping and calibration should be
> >part of the compositor and/or a side-channel to the compositor if the
> >calibration needs to be done in an external application.
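[Editor's note: for context, calibration in libinput-based compositors is typically expressed as a 3x3 matrix applied to normalized device coordinates; the LIBINPUT_CALIBRATION_MATRIX udev property uses this form. A minimal sketch of applying such a matrix follows; the helper name is illustrative, not a libinput API.]

```python
def apply_calibration(m, x, y):
    """Apply a libinput-style 3x3 calibration matrix, given row-major
    as 9 floats (last row conventionally 0 0 1), to normalized [0, 1]
    device coordinates. Illustrative helper, not a libinput API."""
    return (m[0] * x + m[1] * y + m[2],
            m[3] * x + m[4] * y + m[5])

# The identity matrix leaves coordinates untouched:
# apply_calibration([1, 0, 0, 0, 1, 0, 0, 0, 1], 0.25, 0.75)
#   -> (0.25, 0.75)
```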
> >
> >Cheers,
> >  Peter
> >
> >> 
> >> 
> >> Sent from Mail for Windows
> >> 
> >> From: Peter Hutterer
> >> Sent: November 22, 2023, 14:45
> >> To: weinan wang
> >> Cc: wayland-devel at lists.freedesktop.org
> >> Subject: Re: Re: get input device in wayland
> >> 
> >> On Wed, Nov 22, 2023 at 09:22:56AM +0800, weinan wang wrote:
> >> > Thanks for the reply.
> >> > We want to build an application for touch input mapping:
> >> > it maps touch input to an output device when there are multiple screens.
> >> > 
> >> > The input mapping needs an input device and an output device.
> >> > In our design, we traverse all screens, asking the user "is this the
> >> > screen you want to map?", and the user chooses a touch device by
> >> > tapping on it, just like the "touch config" feature in Windows 11.
> >> > 
> >> > All in all, we want to build an application for touch input mapping;
> >> > if there is a better design, I will discuss it with the designer.
> >> 
> >> There is no absolutely surefire way to do this as a pure Wayland
> >> application but one possible UI flow would be:
> >> 
> >> - getting the list of touchscreens from udev [1]
> >> - printing something like "Mapping screen 'Foo Vendor Touchscreen'" 
> >> - display on output A with icons "this one", "nah, next one please"
> >> - rotate through outputs until "this one" is clicked
> >> 
> >> You don't need to care whether the confirmation even comes from the
> >> touchscreen.
> >> 
> >> Alternatively you can display on all screens simultaneously and
> >> assume the button + screen that receive the "Yes" click are the
> >> ones that want the touchscreen.
> >> 
> >> Cheers,
> >>   Peter
> >> 
> >> [1] slightly unreliable since there's no guarantee they're all handled
> >> but most likely good enough
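[Editor's note: the UI flow described above can be sketched as a small loop. The `confirm` callback is a stand-in for the real "this one" / "nah, next one please" dialog, and the touchscreen list would come from udev as described.]

```python
def assign_outputs(touchscreens, outputs, confirm):
    """For each touchscreen (e.g. enumerated from udev via the
    ID_INPUT_TOUCHSCREEN property), rotate through the outputs until
    the user confirms the match. `confirm(screen, output)` stands in
    for displaying the prompt on that output and waiting for a click.
    """
    mapping = {}
    for ts in touchscreens:
        for out in outputs:
            if confirm(ts, out):
                mapping[ts] = out
                break  # move on to the next touchscreen
    return mapping
```

As the mail notes, it does not matter which device delivers the confirmation click; only the output the dialog was shown on matters.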
> >> 
> >> > At 2023-11-22 06:31:07, "Peter Hutterer" <peter.hutterer at who-t.net> wrote:
> >> > >On Tue, Nov 21, 2023 at 04:20:10PM +0800, weinan wang wrote:
> >> > >> Hey Guys,
> >> > >> We have an application that needs to get the touch device
> >> > >> corresponding to a touch event. On X11 we can get the device ID
> >> > >> that sent the event from the XI_Touch events and then find the
> >> > >> device we want from that ID. But on Wayland it seems that neither
> >> > >> wl_seat nor wl_touch exposes device-related information, and this
> >> > >> application runs with user permissions, so we can't use libinput
> >> > >> to get lower-level events.
> >> > >> Is there any other way to get the device that sent the event?
> >> > >
> >> > >This is not possible in Wayland: physical devices are abstracted
> >> > >into wl_pointer/wl_keyboard/wl_touch, and the underlying device is
> >> > >not exposed anywhere through Wayland or even the current
> >> > >wayland-protocols.
> >> > >
> >> > >The sole exception is the wl_tablet protocol which is per physical
> >> > >device but that's due to how tablets are being used.
> >> > >
> >> > >You don't usually have access to the libinput context inside the
> >> > >compositor either (because Wayland itself doesn't require libinput, it's
> >> > >an implementation detail).
> >> > >
> >> > >The question here is: what are you trying to achieve? Maybe
> >> > >there's a different way to do it than having access to the
> >> > >specific physical device.
> >> > >
> >> > >Cheers,
> >> > >  Peter
> >> 


More information about the wayland-devel mailing list