Wayland Relative Pointer API Progress
jadahl at gmail.com
Thu Apr 16 20:30:16 PDT 2015
On Fri, Apr 17, 2015 at 11:51:41AM +0900, x414e54 wrote:
> I am wondering if there has been any recent progress on the stability of
> the Wayland relative pointer API?
AFAIK it is still awaiting some further review. I have a rough plan to
submit a version with the previous issues addressed, but haven't gotten
to it yet.
> I had a few ideas on the original December implementation:
> Whilst we definitely need the relative event support, I feel that the
> client should never be allowed to warp or confine the global pointer even
> if it is just a hint. There may be cases such as IR or laser pointer
> devices (e.g. wii mote) which can never guarantee a warp or pointer
> confinement but can still transmit accelerometer data as relative pointer
> motion. Looking at most toolkits and applications they cannot be trusted
> even with "hints".
Not sure an IR/laser/wii mote pointer should even be considered a
"relative" pointer since they operate in absolute coordinates. Given
this, there is no "set position" hint to consider. Transmitting
accelerometer data via a "relative pointer" doesn't sound reasonable.
> I think the API needs to be split into two use cases:
> 1. GUI sliders etc. - They should be allowed to freeze the pointer and
> receive relative events based on an implicit button down grab similar to
> the drag and drop protocol. They should not be allowed to warp or confine
> and upon button up the grab is lost. Gnome already looks like it uses
> heuristics to do this if the cursor is hidden on button down.
Sliders etc. will be possible with the pointer lock and relative pointer
protocols. Confinement has other use cases.
> 2. Games - They do not really need relative "pointer" events; they just want
> the current seat mapped to a 6DOF or joystick style input. They should be
> allowed to request some kind of wl_joystick or wl_6dof interface. Then the
> compositor can decide what it actually presents to the application for
> using that input. Maybe a user has a joystick they always select to use
> instead of their mouse or they have an accelerometer device etc. It is then
> up to the compositor what it does with the actual on screen cursor if it
> confines it or hides it etc, there could be a notification of entering
> "game mode" etc. If the compositor is not using the same input device for
> wl_pointer and a wl_joystick or wl_6dof then it does nothing. This would
> also allow a user to hot-swap the device between mouse and keyboard and a
> gamepad just by using the WM settings. It could also allow for using 6DOF
> or 3D mice in an application which is also mapped as the default x, y axes.
> The application will then still receive absolute pointer events which it
> can use for in game GUI clicks.
Well, they do. Sure, consoles don't tend to use a mouse for input, but
on PC, a very large number of games do. And a large number of those tend
to use it in "relative mode", i.e. without showing a cursor (Quake, for
instance).

Joysticks, gamepads and 6DOF devices are orthogonal to pointer locking and relative
pointers. Currently games usually rely on opening the evdev devices
themselves, and so far it doesn't seem reasonable to abstract such
devices in the compositor. What may make more sense is to rely on the
compositor to handle focus, passing fds around, while continuing to make
the client responsible for translating input events to character
movements or the like.
It doesn't make sense for the compositor to hot-swap what input device
type a game should use. Many games probably wouldn't even support
changing from a mouse+keyboard to a gamepad simply because the gameplay
would be different. A game needs to decide itself what input device type
it should use.
Emulating pointer devices from controller input is, I think, completely
out of scope for any protocol. It can be done by something server side.
> Any opinions on this would be appreciated.