Wayland Relative Pointer API Progress

Michal Suchanek hramrach at gmail.com
Fri Apr 17 02:47:51 PDT 2015

On 17 April 2015 at 09:11, Pekka Paalanen <ppaalanen at gmail.com> wrote:
> On Fri, 17 Apr 2015 13:43:11 +0900
> x414e54 <x414e54 at linux.com> wrote:
>> Thank you for the comments.
>> I do have a few counterpoints, but I will leave it at that.
>> >
>> > Not sure an IR/laser/wii mote pointer should even be considered a
>> > "relative" pointer since they operate in absolute coordinates. Given
>> > this, there is no "set position" hint to consider. Transmitting
>> > accelerometer data via a "relative pointer" doesn't sound reasonable.
>> >
>> I think this is the issue right here. Pointers are not relative, mice
>> are not pointers.
> What definition of a "pointer" are you using?
> The definition Wayland uses for a wl_pointer is a device that:
> - requires a cursor image on screen to be usable
> - the physical input is relative, not absolute
> This definition is inspired by mice, and mice have been called pointer
> devices, so we picked the well-known name "pointer" for mice-like
> devices.
> Specifically, a pointer is *not* a device where you directly point a
> location on screen, like a touchscreen for example. For touchscreens,
> there is a separate protocol wl_touch.
> For drawing tablets, there will be yet another protocol.
> Joysticks or gamepads fit into none of the above. For the rest of the
> conversation, you should probably look up the long gamepad protocol
> discussions from the wayland-devel mailing list archives.
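
The wl_pointer model quoted above boils down to: relative deltas get accumulated into an on-screen position that is always clamped to the screen, which is why a pointer can never go off-screen. A minimal sketch of that model (the screen size and the clamping function are illustrative assumptions, not the compositor's actual code):

```python
SCREEN_W, SCREEN_H = 1920, 1080  # assumed screen size, for illustration

def move_pointer(pos, dx, dy):
    """Relative motion: accumulate deltas, clamping so the cursor never
    leaves the screen -- the defining property of a wl_pointer above."""
    x = min(max(pos[0] + dx, 0), SCREEN_W - 1)
    y = min(max(pos[1] + dy, 0), SCREEN_H - 1)
    return (x, y)

pos = (1910, 500)
pos = move_pointer(pos, 50, 0)   # this delta would push past the right edge
print(pos)                       # clamped to (1919, 500)
```

An absolute device like a touchscreen skips this accumulation entirely: each event already carries a position, which is why it gets a separate protocol.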

And how is a joystick different from a trackpoint, exactly?

It uses a different hardware interface and, later, a different software
interface, but for no good reason. It's just a 2-axis relative input
device with buttons. Sure, a big joystick, a gamepad directional cap,
and a trackpoint sit at different points on the stick-size scale and
might use different hardware sensors, which should be reflected in
different acceleration settings, but ultimately it's the same kind of
device.

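
The claim that these devices differ only in tuning can be sketched as a single acceleration code path parameterised per device; the two-slope curve and all the numbers below are made up for illustration, not taken from any real driver:

```python
def accelerate(delta, gain, threshold, accel):
    """Two-slope pointer acceleration (illustrative): deltas whose magnitude
    exceeds `threshold` get an extra multiplier `accel`."""
    if abs(delta) <= threshold:
        return delta * gain
    return delta * gain * accel

# Same code path, different per-device tuning (values are invented):
trackpoint = dict(gain=1.0, threshold=4, accel=2.0)
joystick   = dict(gain=0.5, threshold=8, accel=1.5)

print(accelerate(2, **trackpoint))   # slow motion: 2.0, no acceleration
print(accelerate(10, **trackpoint))  # fast motion: 20.0, accelerated
```

The point is that nothing in the delivery path distinguishes the devices; only the tuning table does.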
> A fundamental difference between a wiimote and a pointer, as far as I
> understand, is that wiimote might be off-screen while a pointer never
> can. You also would not unfocus a wiimote from an app window just
> because it went off-screen or off-window, right? Button events should
> still be delivered to the app? A Pointer will unfocus, because without
> grabs, the focus is expected to shift to whatever is under the pointer.

And why should a wiimote not unfocus unless grabbed?

I am not sure how the wiimote actually works, but from your comments it
seems to be some absolute pointing device with buttons. I should be
able to use an absolute pointing device with buttons as pointer input
if I choose to. In fact, I am using my Wacom tablet that way right now
in X11, and it happens to be an absolute pointing device with buttons.
Due to the aspect mismatch my pointer can technically go off-screen,
and I will not change to a windowing system that does not allow that.
Similarly, I should be able to map the Wacom tablet for exclusive use
with a particular application window, or with the application window
currently in focus. I do not see any reason why the wiimote should be
special and different and only allow mapping to a particular
application window.
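
The aspect-mismatch case can be sketched numerically: mapping a 4:3 tablet onto a 16:9 screen with one uniform scale (to avoid distorting strokes) leaves one axis able to run past the screen edge. The mapping function and the sizes are illustrative assumptions, not any driver's actual behaviour:

```python
def map_absolute(tx, ty, tab_w, tab_h, scr_w, scr_h):
    """Map absolute tablet coordinates to screen coordinates with a single
    uniform scale, preserving the tablet's aspect ratio. With mismatched
    aspect ratios one axis can exceed the screen bounds."""
    scale = scr_w / tab_w          # fit the X axis exactly
    return (tx * scale, ty * scale)

# A 4:3 tablet (1.0 x 0.75 in normalised units) onto a 16:9 screen:
x, y = map_absolute(1.0, 0.75, 1.0, 0.75, 1920, 1080)
print((x, y))  # (1920.0, 1440.0): y lands 360 px below the screen edge
```

A compositor that clamps or refuses such mappings would rule out this perfectly usable X11 configuration, which is the objection being made above.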


