[RFC DRAFT] graphics tablet protocol extension

David Herrmann dh.herrmann at gmail.com
Wed Oct 2 08:44:29 PDT 2013


Hi Peter,

On Fri, Sep 20, 2013 at 12:35 PM, Peter Hutterer
<peter.hutterer at who-t.net> wrote:
> I've been working on a protocol extension to support graphics tablets such
> as the Wacom set of tablets, and I'm now at the stage where I'd like a few
> comments. I was hoping that I'd get a full implementation before XDC but
> unfortunately that didn't happen, so for now I'll just show the protocol.
> I've got a PoC implementation, but it's missing a few too many pieces to be
> actually usable just yet.
>
> Any feedback appreciated. This is a relatively early stage, so there are
> still many changes expected.
>
> The goal is to make it possible to access graphics tablets. One important
> thing to note is that this interface does _not_ cover the touch ability of
> some tablets. This should go through wl_touch (for touchscreen-like tablets)
> or wl_pointer (for external touchpad-like tablets).
>
> There are a few notable differences to the wl_pointer interface:
> * tablets have a tool type that matters, it lets applications such as the
>   GIMP select paint tools based on the physical tool. those tools also often
>   have HW serial numbers to uniquely identify them.
> * extra axes matter: pressure, tilt, distance - all influence the stylus
>   behaviour
> * more than one device may be present, so it's important to have access to
>   all devices on a one-by-one basis, unlike wl_pointer, where we just have
>   one virtual pointer.
> * proximity matters since we can leave proximity from directly above a
>   surface. the pointer can't do that, it moves from one surface to the next.
>   so in some ways it's closer to wl_touch in that regard.
>
> Some design notes:
> * generally most axes change at the same time, hence the choice to send a
>   wl_array instead of separate events.
> * x/y would have to be adjusted relative to the surface, but technically the
>   same would have to be done to e.g. distance on a true 3D desktop.
> * not sure about the relative events at all, or if there's a need for
>   them. iirc only some styli have REL_*WHEEL, do we need something else? Ping,
>   Jason?
> * I don't have a specific touch event, I figured BTN_TOUCH would do the job.
> * focus handling for the stylus is easy. focus handling for the buttons on
>   the pad isn't. they could technically be focused elsewhere, like a
>   keyboard focus. some buttons are definitely stylus based (BTN_STYLUS,
>   BTN_STYLUS2, etc.) so should go where the stylus is. Should look at what
>   Win/OSX do here.
> * bind/binding/unbind - this is like windowed mode in GIMP. do we still need
>   this? who's actually using this instead of a full-screen app?
> * tablet_manager is a bit meh, but the only alternative would be to have a
>   wl_seat::get_tablets request and a wl_seat::tablet_added event. Possible,
>   but doesn't look that much nicer. but it does away with the indirection.
>   (read the diff below to understand what I mean here)
> * if we stick with the tablet_manager, do we need a
>   wl_tablet_manager::get_tablets request in case the client releases a
>   tablet it needs again later? or do we expect to re-bind to
>   wl_tablet_manager?
> * fuzz/flat should be dropped, I just haven't yet.
> * I'd really like to enshrine in the protocol that ABS_PRESSURE means just
>   that and damn anyone else who wants to use 0x18 as axis code. weston
>   does this for wl_pointer::button, but it's not actually documented. what's
>   the deal here?
> * does this even make sense as wl_tablet or should I try first as
>   experimental weston-tablet interface that then (maybe) moves later to
>   wayland proper.
>
> That's it so far, again, any feedback appreciated. diff below.
>
> Cheers,
>    Peter
>
>
> diff --git a/protocol/wayland.xml b/protocol/wayland.xml
> index aeb0412..8d10746 100644
> --- a/protocol/wayland.xml
> +++ b/protocol/wayland.xml
> @@ -1235,7 +1235,7 @@
>      </request>
>     </interface>
>
> -  <interface name="wl_seat" version="3">
> +  <interface name="wl_seat" version="4">
>      <description summary="group of input devices">
>        A seat is a group of keyboards, pointer and touch devices. This
>        object is published as a global during start up, or when such a
> @@ -1251,6 +1251,7 @@
>        <entry name="pointer" value="1" summary="The seat has pointer devices"/>
>        <entry name="keyboard" value="2" summary="The seat has one or more keyboards"/>
>        <entry name="touch" value="4" summary="The seat has touch devices"/>
> +      <entry name="tablets" value="8" summary="The seat has one or more graphics tablet devices" since="4"/>

What's actually the reason to allow multiple graphics-tablets per
seat? I thought wl_seat objects represent a single user interacting
with your desktop. So if you have two tablets, why not force them to
be in two different wl_seat objects? Is there ever a reason to have
multiple tablets in a single seat? What would the use-case be?

We could even go further and put tablets into their own seats. Always.
A pointer and keyboard may be used by a single user at the same time.
But for a graphics-tablet that doesn't sound like a legitimate
use-case, does it? But maybe I just have a different view of wl_seat..
don't know. I somehow have the feeling we never really agreed on how
to define wl_seat objects. Currently they are just user-defined groups
of devices with some specific policies ("only one keyboard-object per
seat").

>      </enum>
>
>      <event name="capabilities">
> @@ -1306,6 +1307,19 @@
>        <arg name="name" type="string"/>
>      </event>
>
> +    <!-- Version 4 additions -->
> +    <request name="get_tablet_manager" since="4">
> +      <description summary="return tablet manager object">
> +        The ID provided will be initialized to the wl_tablet_manager
> +        interface for this seat. This can then be used to retrieve the
> +        objects representing the actual tablet devices.
> +
> +        This request only takes effect if the seat has the tablets
> +        capability.
> +      </description>
> +      <arg name="id" type="new_id" interface="wl_tablet_manager"/>
> +    </request>
> +
>    </interface>
>
>    <interface name="wl_pointer" version="3">
> @@ -1617,6 +1631,223 @@
>      </event>
>    </interface>
>
> +  <interface name="wl_tablet_manager" version="1">
> +    <description summary="controller object for graphics tablet devices">
> +      A tablet manager object provides requests to access the graphics
> +      tablets available on this system.
> +    </description>
> +
> +    <enum name="tablet_type">
> +      <description summary="tablet type">
> +        Describes the type of tablet.
> +      </description>
> +      <entry name="external" value="0" summary="The tablet is an external tablet, such as an Intuos" />
> +      <entry name="internal" value="1" summary="The tablet is a built-in tablet, usually in a laptop" />
> +      <entry name="display" value="2" summary="The tablet is a display tablet, such as a Cintiq" />
> +    </enum>
> +
> +    <event name="device_added">
> +      <description summary="new device notification"/>
> +      <arg name="id" type="new_id" interface="wl_tablet" summary="the newly added graphics tablet" />
> +      <arg name="name" type="string" summary="the device name"/>
> +      <arg name="vid" type="uint" summary="vendor id"/>
> +      <arg name="pid" type="uint" summary="product id"/>
> +      <arg name="type" type="uint" />

So tablets are server-side created objects? I think we tried to avoid
that. Only wl_display uses this now, afaik. Even for globals we added
the wl_registry interface which just announces unique names which the
client then binds to via wl_registry.bind. So why not turn
wl_tablet_manager into wl_tablet_registry and provide the same
functions as the global registry?
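
For illustration, a registry-style alternative could look roughly like
this (a sketch only; all names here are made up for this mail, mirroring
the wl_registry name/bind pattern):

```xml
<interface name="wl_tablet_registry" version="1">
  <description summary="registry for graphics tablet devices">
    Announces tablet devices by numeric name; the client creates the
    wl_tablet proxy itself via the bind request, just like
    wl_registry.bind does for globals.
  </description>

  <request name="bind">
    <description summary="bind a tablet by name"/>
    <arg name="name" type="uint" summary="numeric name of the tablet"/>
    <arg name="id" type="new_id" interface="wl_tablet"/>
  </request>

  <event name="tablet_added">
    <description summary="announce a tablet device"/>
    <arg name="name" type="uint" summary="numeric name of the tablet"/>
    <arg name="device_name" type="string" summary="the device name"/>
    <arg name="vid" type="uint" summary="vendor id"/>
    <arg name="pid" type="uint" summary="product id"/>
    <arg name="type" type="uint" summary="tablet_type of the device"/>
  </event>

  <event name="tablet_removed">
    <description summary="a tablet device went away"/>
    <arg name="name" type="uint" summary="numeric name of the tablet"/>
  </event>
</interface>
```

That would avoid the server-created object while still carrying the
vid/pid/type metadata in the announcement.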

> +    </event>
> +
> +  </interface>
> +
> +  <interface name="wl_tablet" version="1">
> +    <description summary="a graphics tablet device">
> +      This interface describes a graphics tablet, such as Wacom's Intuos or
> +      Cintiq ranges.
> +
> +      Note, this interface does not handle touch input on touch-capable
> +      tablets. Depending on the tablet, touch input is sent through the
> +      wl_pointer (if used as a touchpad) or wl_touch interface (if used as a
> +      touch screen).
> +    </description>
> +    <enum name="axis_type">
> +      <description summary="axis type"/>
> +      <entry name="relative" value="2" summary="relative axis"/>
> +      <entry name="absolute" value="3" summary="absolute axis"/>
> +    </enum>
> +
> +    <request name="describe">
> +      <description summary="request axis information">
> +        Request axis information from a device. When a client sends a
> +        wl_tablet.describe request, the compositor will emit a
> +        wl_tablet.axis_capability event for each axis on the device.
> +        To mark the end of the burst of events, the client can use the
> +        wl_display.sync request immediately after calling wl_tablet.describe.
> +      </description>

Why not always send the axis information? I mean, once the client
requests a tablet_manager, doesn't it always want to retrieve a
description? A wl_display.sync directly after receiving device_added
would act as the barrier. The only reason not to send this information
would be to save traffic if multiple tablets are connected and the
client wants only one of them. Does this really justify the additional
method? I have no idea how big these data-sets really are...
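
If the data turns out to be small, the describe request could simply be
dropped and the burst documented on the event itself, along these lines
(wording invented here, not from your draft):

```xml
<event name="axis_capability">
  <description summary="axis description">
    Sent once for each axis on the device, immediately after the
    wl_tablet object is created via wl_tablet_manager.device_added.
    A client may issue wl_display.sync after device_added; the sync
    callback then acts as the end-of-burst barrier.
  </description>
  <!-- args as in the draft above -->
</event>
```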

> +    </request>
> +
> +    <request name="release" type="destructor">
> +      <description summary="release the wl_tablet object"/>
> +    </request>
> +
> +

Nitpick: two blank lines

> +    <event name="removed">
> +      <description summary="the device has been removed"/>
> +    </event>
> +
> +    <event name="axis_capability">
> +      <description summary="axis description">
> +        For relative devices, min and max are 0. A resolution, fuzz, or flat of 0
> +        denotes an unknown resolution, fuzz, or flat, respectively.
> +
> +        Note that while the min/max values are announced by the device, the value is
> +        not clamped to the actual range by the compositor. A device may send values
> +        outside this range.
> +      </description>
> +      <arg name="type" type="uint" summary="axis type (relative, absolute)" />
> +      <arg name="axis" type="uint" summary="axis identifier" />
> +      <arg name="min" type="int" summary="minimum axis value, inclusive"/>
> +      <arg name="max" type="int" summary="maximum axis value, inclusive"/>
> +      <arg name="resolution" type="uint" summary="resolution in units/mm"/>
> +      <arg name="fuzz" type="uint"/>
> +      <arg name="flat" type="uint"/>
> +    </event>
> +
> +    <event name="button_capability">
> +      <description summary="button description">
> +      </description>
> +      <arg name="buttons" type="array" />
> +    </event>
> +
> +    <enum name="tool_type">
> +      <description summary="tool types"/>
> +      <entry name="pen" value="0x140" summary="pen tool" />
> +      <entry name="eraser" value="0x141" summary="eraser tool" />
> +      <entry name="brush" value="0x142" summary="brush tool" />
> +      <entry name="pencil" value="0x143" summary="pencil tool" />
> +      <entry name="airbrush" value="0x144" summary="airbrush tool" />
> +      <entry name="finger" value="0x145" summary="finger tool" />
> +      <entry name="mouse" value="0x146" summary="mouse tool" />
> +      <entry name="lens" value="0x147" summary="lens tool" />
> +    </enum>
> +
> +    <event name="proximity_in">
> +      <description summary="tablet stylus proximity event">
> +        This event is sent when a tool comes into proximity of the tablet
> +        above the surface.
> +
> +        If the tablet supports tools with hardware serial numbers, the
> +        tool_serial specifies the serial number. Otherwise, the tool_serial
> +        number is 0.
> +      </description>
> +      <arg name="serial" type="uint"/>
> +      <arg name="surface" type="object" interface="wl_surface"/>
> +      <arg name="surface_x" type="fixed" summary="x coordinate in surface-relative coordinates" />
> +      <arg name="surface_y" type="fixed" summary="y coordinate in surface-relative coordinates" />
> +      <arg name="axes" type="array"/>
> +      <arg name="tool" type="uint"/>
> +      <arg name="tool_serial" type="uint" summary="serial number of the tool, if any"/>
> +    </event>
> +
> +    <event name="proximity_out">
> +      <description summary="tablet stylus proximity event">
> +        This event is sent when a tool goes out of proximity, or moves
> +        off the surface.
> +      </description>
> +      <arg name="serial" type="uint"/>
> +      <arg name="surface" type="object" interface="wl_surface"/>
> +    </event>
> +
> +    <enum name="button_state">
> +      <description summary="physical button state">
> +        Describes the physical state of a button which provoked the button
> +        event.
> +      </description>
> +      <entry name="released" value="0" summary="The button is not pressed"/>
> +      <entry name="pressed" value="1" summary="The button is pressed"/>
> +    </enum>
> +
> +    <event name="button">
> +      <description summary="stylus button event">
> +        Stylus button click and release notifications.
> +
> +        The location of the click is given by the last motion or
> +        proximity event.
> +        The time argument is a timestamp with millisecond
> +        granularity, with an undefined base.
> +      </description>
> +
> +      <arg name="serial" type="uint"/>
> +      <arg name="time" type="uint" summary="timestamp with millisecond granularity"/>
> +      <arg name="button" type="uint"/>
> +      <arg name="state" type="uint"/>
> +    </event>
> +
> +    <event name="motion">
> +      <description summary="tablet stylus motion event"/>
> +
> +      <arg name="time" type="uint" summary="timestamp with millisecond granularity"/>
> +      <arg name="surface_x" type="fixed" summary="x coordinate in surface-relative coordinates" />
> +      <arg name="surface_y" type="fixed" summary="y coordinate in surface-relative coordinates" />
> +      <arg name="axes" type="array">
> +        <description summary="axis state for other axes">
> +          The state of all absolute axes other than x and y, if any. The
> +          order of coordinates matches the order of axes as sent by the
> +          wl_tablet.axis_capability events. x and y coordinates are not
> +          included, the state starts at the third axis.
> +        </description>
> +      </arg>
> +    </event>
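
As an aside, decoding that axes array on the client side is
straightforward. A minimal sketch; note the draft doesn't pin down the
element width, so this assumes packed int32 values in axis_capability
order, and axes_array here is a stand-in for the real struct wl_array
from wayland-util.h (which has an extra alloc member):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Stand-in for the wl_array delivered by wl_tablet.motion;
 * size is the payload length in bytes, data the packed values. */
struct axes_array {
    size_t size;
    const void *data;
};

/* Copy out the axis values, assuming int32 elements ordered as the
 * axis_capability events announced them (x/y excluded per the draft).
 * Returns the number of values written to out. */
static size_t decode_axes(const struct axes_array *arr,
                          int32_t *out, size_t max)
{
    size_t n = arr->size / sizeof(int32_t);
    if (n > max)
        n = max;
    memcpy(out, arr->data, n * sizeof(int32_t));
    return n;
}
```

So if a device announced, say, pressure then distance then tilt_x,
out[0] holds the pressure value, which the client scales with the
min/max from the matching axis_capability event.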
> +
> +    <event name="relative_motion">
> +      <description summary="tablet relative motion event">
> +        A relative_motion event is sent if a relative axis on a device
> +        generates an event.
> +
> +        This event does not describe motion if the device is in relative
> +        mode (i.e. like a touchpad), it only notifies of the relative axes
> +        on the device.
> +      </description>
> +
> +      <arg name="time" type="uint" summary="timestamp with millisecond granularity"/>
> +      <arg name="surface_x" type="fixed" summary="x coordinate in surface-relative coordinates" />
> +      <arg name="surface_y" type="fixed" summary="y coordinate in surface-relative coordinates" />
> +      <arg name="axes" type="array">
> +        <description summary="axis state for other axes">
> +          The state of all relative axes other than x and y, if any. The
> +          order of coordinates matches the order of axes as sent by the
> +          wl_tablet.axis_capability events. x and y coordinates are not
> +          included, the state starts at the third axis.
> +        </description>
> +      </arg>
> +    </event>
> +
> +    <request name="bind">
> +      <description summary="bind the tablet to a surface">
> +        Bind the tablet to the surface. This enables a client to map the
> +        area of the tablet 1:1 to the surface area. Only one client can bind
> +        to a tablet at a time.

Do clients only receive events if they have bound the object? If not,
once a client has bound it, do the others still receive events? That is,
does
it work like the evdev-grab thingy?

You might also want to add a comment that "binding" is sent as
response and wl_display.sync may be used as barrier.
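
To make that concrete, the request description could read something like
this (suggested wording only):

```xml
<request name="bind">
  <description summary="bind the tablet to a surface">
    Bind the tablet to the surface. This enables a client to map the
    area of the tablet 1:1 to the surface area. Only one client can
    bind to a tablet at a time.

    The compositor responds with a wl_tablet.binding event carrying
    the bind_status. A client may issue wl_display.sync immediately
    after this request and use the sync callback as a barrier for the
    binding event.
  </description>
  <arg name="surface" type="object" interface="wl_surface" />
</request>
```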

> +      </description>
> +      <arg name="surface" type="object" interface="wl_surface" />
> +    </request>
> +
> +    <enum name="bind_status">
> +      <description summary="notify of bind success"/>
> +      <entry name="failed" value="0" />
> +      <entry name="success" value="1" />
> +    </enum>
> +
> +    <event name="binding">
> +      <description summary="bind notification"/>
> +      <arg name="status" type="uint"/>
> +    </event>
> +
> +    <request name="unbind">
> +      <description summary="release the wl_tablet object">
> +        Unbind the tablet. If the client does not have the tablet bound,
> +        this request does nothing.

Nitpick: might want to add: "On object destruction the tablet is
implicitly unbound."

> +      </description>
> +    </request>
> +
> +  </interface>
> +
>    <interface name="wl_output" version="2">
>      <description summary="compositor output region">
>        An output describes part of the compositor geometry.  The

Apart from "bind"/"unbind" this really looks to me like a generic
evdev forwarding. I wonder whether we should try to write a generic
evdev fallback first and then see whether this is really needed. A
"bind" might even make sense for the generic interface. Hmmm

Thanks
David

