[PATCH v2] protocol: Extend wl_touch with touchpoint shape event

Peter Hutterer peter.hutterer at who-t.net
Wed Apr 6 21:52:07 UTC 2016


On Wed, Apr 06, 2016 at 10:17:35AM -0700, Dennis Kempin wrote:
> On Tue, Apr 5, 2016 at 5:26 PM, Peter Hutterer <peter.hutterer at who-t.net> wrote:
> > On Tue, Apr 05, 2016 at 01:09:31PM -0700, Dennis Kempin wrote:
> >> This CL updates the wl_touch interface with a shape event.
> >> The shape of a touch point is not relevant for most UI
> >> applications, but allows a better experience in some cases
> >> such as drawing apps.
> >>
> >> The shape event is used by the compositor to inform the client
> >> about changes in the shape of a touchpoint, which is
> >> approximated by an ellipse.
> >>
> >> The event is optional and only sent when the compositor and the
> >> touch device support this type of information. The client is
> >> responsible for making a reasonable assumption about the
> >> touch shape if no shape is reported.
> >>
> >> Signed-off-by: Dennis Kempin <denniskempin at google.com>
> >> ---
> >>  protocol/wayland.xml | 46 +++++++++++++++++++++++++++++++++++++++++-----
> >>  1 file changed, 41 insertions(+), 5 deletions(-)
> >>
> >> diff --git a/protocol/wayland.xml b/protocol/wayland.xml
> >> index 8739cd3..90a2453 100644
> >> --- a/protocol/wayland.xml
> >> +++ b/protocol/wayland.xml
> >> @@ -1656,7 +1656,7 @@
> >>      </request>
> >>     </interface>
> >>
> >> -  <interface name="wl_seat" version="5">
> >> +  <interface name="wl_seat" version="6">
> >>      <description summary="group of input devices">
> >>        A seat is a group of keyboards, pointer and touch devices. This
> >>        object is published as a global during start up, or when such a
> >> @@ -1765,7 +1765,7 @@
> >>
> >>    </interface>
> >>
> >> -  <interface name="wl_pointer" version="5">
> >> +  <interface name="wl_pointer" version="6">
> >>      <description summary="pointer input device">
> >>        The wl_pointer interface represents one or more input devices,
> >>        such as mice, which control the pointer location and pointer_focus
> >> @@ -2078,7 +2078,7 @@
> >>      </event>
> >>    </interface>
> >>
> >> -  <interface name="wl_keyboard" version="5">
> >> +  <interface name="wl_keyboard" version="6">
> >>      <description summary="keyboard input device">
> >>        The wl_keyboard interface represents one or more keyboards
> >>        associated with a seat.
> >> @@ -2192,7 +2192,7 @@
> >>      </event>
> >>    </interface>
> >>
> >> -  <interface name="wl_touch" version="5">
> >> +  <interface name="wl_touch" version="6">
> >>      <description summary="touchscreen input device">
> >>        The wl_touch interface represents a touchscreen
> >>        associated with a seat.
> >> @@ -2242,7 +2242,12 @@
> >>
> >>      <event name="frame">
> >>        <description summary="end of touch frame event">
> >> - Indicates the end of a contact point list.
> >> +  Indicates the end of a contact point list. The Wayland protocol requires
> >> +  touch point updates to be sent sequentially; however, all events within a
> >> +  frame should be considered one hardware event. A wl_touch.frame terminates
> >> +  at least one event but otherwise no guarantee is provided about the set of
> >> +  events within a frame. A client must assume that any state not updated in a
> >> +  frame is unchanged from the previously known state.
> >>        </description>
> >>      </event>
> >>
> >> @@ -2262,6 +2267,37 @@
> >>      <request name="release" type="destructor" since="3">
> >>        <description summary="release the touch object"/>
> >>      </request>
> >> +
> >> +    <!-- Version 6 additions -->
> >> +
> >> +    <event name="shape" since="6">
> >> +      <description summary="update shape of touch point">
> >> +  Sent when a touchpoint has changed its shape. If the touch position changed
> >> +  at the same time, the wl_touch.motion and wl_touch.shape events are sent
> >> +  within the same wl_touch.frame. Otherwise, only a wl_touch.shape event is
> >> +  sent within the wl_touch.frame. The protocol does not guarantee a specific
> >> +  ordering of wl_touch.shape and wl_touch.motion events.
> >> +
> >> +  A touchpoint shape is approximated by an ellipse defined by an orientation
> >> +  and the major and minor axis lengths. The major axis length describes the
> >> +  longest diameter of the ellipse, while the minor axis length describes the
> >> +  shortest diameter. Both are specified in surface coordinates. The
> >> +  orientation describes the angle between the major axis and the surface
> >> +  x-axis and is normalized to [0, 180) degrees.
> >
> > a couple of comments here:
> > should we not make this relative to the Y axis? the natural finger position
> > is closer to vertical (and thus a 0 value) than horizontal which makes
> > things easier for clients. And it's also in line with other rotation values
> > (e.g. in the tablet interface) where 0 is the logical north of the device.
> >
> > if we align it with x and have a 0-180 range our angle is counterclockwise, which
> > is different to the tablet interface. otherwise we'd have all fingers
> > pointing down :)
> >
> > I think the best solution here would be to have this normalized to a
> > -180/+180 range, clockwise, off the Y axis. This way clients can assume that
> > e.g. anything positive is a left-handed finger and anything negative is a
> > right-handed finger.
> >
> > We also need a blurb here about the granularity, since many devices only
> > have a binary rotation state (0 or 90 deg)
> I mixed up the axes in my head, sorry about that. Of course relative to the
> Y axis makes more sense.
> My thinking behind the 0-180 range is that for the ellipse, -90 and 90
> degrees are the same thing. And since most touchscreens only see a blob on
> the surface, that's all they can know.

just to add something here: this is an implementation problem that should
not define how the protocol looks. there are some cases where we can tell
the difference, e.g. continuous movement from 0 -> -90 should result in -90
even when it is ambiguous at that point.
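
to make that concrete, here's a minimal sketch (not part of the patch or of
libinput) of how a compositor could disambiguate: among the angles that
describe the same ellipse, report the one closest to the previously reported
orientation, assuming the proposed [-180, 180] clockwise-from-north range.

#include <math.h>

/*
 * sketch only: an ellipse orientation is ambiguous modulo 180 degrees, so
 * pick whichever equivalent angle is closest to the last reported value,
 * which keeps a continuous rotation continuous in the reported stream.
 * angles are assumed to be in the proposed [-180, 180] range.
 */
static double
resolve_orientation(double prev, double raw)
{
        double candidates[3] = { raw, raw + 180.0, raw - 180.0 };
        double best = candidates[0];
        int i;

        for (i = 1; i < 3; i++)
                if (fabs(candidates[i] - prev) < fabs(best - prev))
                        best = candidates[i];

        /* wrap back into [-180, 180] */
        if (best > 180.0)
                best -= 360.0;
        else if (best < -180.0)
                best += 360.0;

        return best;
}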

> However, I think you are right about supporting the actual orientation,
> since a stylus could use the same protocol and would know the actual
> orientation, not just the orientation of the ellipse.
> Also, I think some newer touchscreens are able to process hovering fingers
> as well, which would allow them to estimate the actual finger orientation
> too; I'm not sure if they actually do that, but it's possible.
> So I agree on measuring relative to the Y axis and using a range of -180 to
> 180 (both inclusive). I will update the doc.
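
as a quick illustration of that convention (a sketch only; the assumption
that positive y points to the device's logical north is mine), converting an
angle measured counterclockwise from the surface x-axis, as in the current
wording, into a clockwise-from-north value in [-180, 180] would be:

/*
 * sketch only: convert an ellipse angle measured counterclockwise from the
 * surface x-axis (the v2 wording) into the proposed convention, clockwise
 * from the y-axis ("logical north"), normalized to [-180, 180].
 */
static double
x_ccw_to_north_cw(double angle)
{
        double a = 90.0 - angle;

        while (a > 180.0)
                a -= 360.0;
        while (a < -180.0)
                a += 360.0;

        return a;
}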
> 
> >
> >
> >> +  The center of the ellipse is always at the touchpoint location as reported
> >> +  by wl_touch.down or wl_touch.motion.
> >> +
> >> +  This event is only sent by the compositor if the touch device supports shape
> >> +  reports. The client has to make reasonable assumptions about the shape if
> >> +  it did not receive this message.
> >
> > s/message/event/
> >
> >> +      </description>
> >> +      <arg name="id" type="int" summary="the unique ID of this touch point"/>
> >> +      <arg name="major" type="fixed" summary="length of the major
> >> axis in surface coordinates"/>
> >> +      <arg name="minor" type="fixed" summary="length of the minor
> >> axis in surface coordinates"/>
> >> +      <arg name="orientation" type="fixed"
> >> +           summary="angle between major axis and surface x-axis in degrees"/>
> >> +    </event>
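
for reference, a minimal client-side sketch of consuming this, assuming
bindings generated by wayland-scanner from the proposed XML (so the shape
handler signature below follows this v2 patch, not any released protocol);
it also shows the frame semantics from above, with state accumulated in the
handlers and only applied once wl_touch.frame arrives:

#include <stdio.h>
#include <wayland-client.h>

/* pending state for one touch point; a real client keeps one entry per id */
static double pending_major, pending_minor, pending_orientation;
static int have_shape;

/* handler as it would be generated from the proposed XML; axis lengths and
 * the orientation arrive as wl_fixed_t (surface coordinates / degrees) */
static void
touch_shape(void *data, struct wl_touch *touch, int32_t id,
            wl_fixed_t major, wl_fixed_t minor, wl_fixed_t orientation)
{
        pending_major = wl_fixed_to_double(major);
        pending_minor = wl_fixed_to_double(minor);
        pending_orientation = wl_fixed_to_double(orientation);
        have_shape = 1;
}

/* everything between two frame events is one hardware event, so the
 * accumulated state is applied here rather than in the handlers above */
static void
touch_frame(void *data, struct wl_touch *touch)
{
        if (have_shape) {
                printf("contact ellipse %.2f x %.2f at %.1f degrees\n",
                       pending_major, pending_minor, pending_orientation);
                have_shape = 0;
        }
}

/* down/up/motion/cancel handlers are omitted; they and the two above would
 * fill a struct wl_touch_listener passed to wl_touch_add_listener() */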
> >
> > next question and this time it's about implementation: how reliable are
> > your major/minor axes? I found that on the few devices I've played with,
> > the major/minor is somewhere between arbitrary and wrong. I have yet to see a
> > device that is precise, or even precise enough. Andreas Pokorny sent a bunch
> > of libinput patches a while ago and IIRC that's where it stalled, at the
> > decision to make this in mm or normalized since so many devices just provide
> > garbage.
> Sadly this is true. Most devices I have seen have only one metric that roughly
> relates to the size of the touch blob. They use that value for
> ABS_MT_TOUCH_MAJOR/MINOR and ABS_MT_PRESSURE.
> For ChromeOS we apply a calibration that roughly converts that metric to
> square mm of contact surface, but we do have to run that calibration for
> every new device. Though some manufacturers are transitioning towards
> outputting calibrated sizes.

that's a per-hardware-model thing then, not per individual device, right? i.e.
once you have it, it works for all devices of that generation and the user
doesn't need to run it on their specific unit anymore.

> Unfortunately I do not have a good solution for this. I like the idea of having
> the option to use the detailed ellipse representation as defined in this patch.
> When using a device that properly supports the major/minor axis, this could
> give the client a lot of benefits. For ChromiumOS we could at least use this
> protocol to send major = minor = diameter of contact to make use of the
> calibrated size.
> 
> Maybe we could include a fallback event for devices that do not have good
> support? One that would only use a single value to describe the size of a
> touch, normalized to 0..1. In practice, I do not think we could put any more
> requirements on this value. There is no way for us to tell whether a device
> is using the full range (0 to reported max axis value), and in a bad case we
> might end up with that size being in the range of 0 to 0.1.
> 
> Let me know what you think about this.

I'm mostly thinking about how to enable this in libinput from a technical
perspective, and the obvious solution would be whitelisting or blacklisting
devices, combined with the calibration you use as well.

the fallback wouldn't be a normalized value, we'd simply skip the data where
we can't be sure. to do that, we need to do one thing though: split up the
event into area and orientation as separate events. we can send orientation
even when the area is unknown.
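
a rough sketch of what the split could look like from the client's side
(handler signatures are hypothetical, since the split is only a suggestion
at this point):

#include <wayland-client.h>

/* hypothetical handlers if the event were split as suggested: contact area
 * and orientation arrive independently, so a compositor that only trusts a
 * device's orientation data simply never sends the shape event */
static void
touch_shape(void *data, struct wl_touch *touch, int32_t id,
            wl_fixed_t major, wl_fixed_t minor)
{
        /* only sent when the device's size data is known to be sane */
}

static void
touch_orientation(void *data, struct wl_touch *touch, int32_t id,
                  wl_fixed_t orientation)
{
        /* can be sent even when the contact area is unknown */
}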

how does that sound?

Cheers,
   Peter

