[PATCH weston 21/25] protocol: add weston_touch_calibration

Peter Hutterer peter.hutterer at who-t.net
Wed Apr 11 00:16:46 UTC 2018


On Tue, Apr 10, 2018 at 02:01:10PM +0300, Pekka Paalanen wrote:
> On Mon, 9 Apr 2018 12:54:43 +1000
> Peter Hutterer <peter.hutterer at who-t.net> wrote:
> 
> > On Fri, Mar 23, 2018 at 02:01:01PM +0200, Pekka Paalanen wrote:
> > > From: Pekka Paalanen <pekka.paalanen at collabora.co.uk>
> > > 
> > > This is a Wayland protocol extension to allow the calibration of
> > > touchscreens in Weston.
> > > 
> > > See: https://phabricator.freedesktop.org/T7868
> > > 
> > > Signed-off-by: Pekka Paalanen <pekka.paalanen at collabora.co.uk>
> > > ---
> > >  Makefile.am                           |   5 +-
> > >  protocol/weston-touch-calibration.xml | 320 ++++++++++++++++++++++++++++++++++
> > >  2 files changed, 324 insertions(+), 1 deletion(-)
> > >  create mode 100644 protocol/weston-touch-calibration.xml
> 
> > > +  <interface name="weston_touch_calibration" version="1">
> > > +    <description summary="weston touchscreen calibration interface">
> > > +      This is the global interface for calibrating a touchscreen input
> > > +      coordinate transformation. It is recommended to make this interface
> > > +      privileged.
> > > +
> > > +      This interface can be used by a client to show a calibration pattern and
> > > +      receive uncalibrated touch coordinates, facilitating the computation of
> > > +      a calibration transformation that will align actual touch positions
> > > +      on screen with their expected coordinates.
> > > +
> > > +      Immediately after being bound by a client, the server sends the
> > > +      touch_device events.  
> > 
> > s/server/compositor/, in a few more places
> 
> I'm a bit mixed there. I tried to be consistent with "server", but one
> "compositor" still remained. There are a couple dozen "server".
> 
> Is your point that all Wayland spec language should be using
> "compositor" to talk about the server-side?

Yeah, sorry, should've made this clear. Thirty years of confusion about X's
client and server model suggests that using 'compositor' and 'client' is
superior, simply because at least you know one side for sure now :)
 

> > > +    <request name="create_calibrator">
> > > +      <description summary="give the calibrator role to a surface">
> > > +	This gives the calibrator role to the surface and ties it with the
> > > +	given touch input device.
> > > +
> > > +	The surface must not already have a role, otherwise invalid_surface
> > > +	error is raised.
> > > +
> > > +	The device string must be one advertised with touch_device event's
> > > +	device argument, otherwise invalid_device error is raised.
> > > +
> > > +	There must not exist a weston_touch_calibrator protocol object in the
> > > +	server already, otherwise already_exists error is raised. This
> > > +	limitation is server-wide and not specific to any particular client.  
> > 
> > Personal preference: I find it easier to understand when a sentence is "if
> > blah then error E". As opposed to "if not blah, otherwise E". It also
> > narrows down the error cases a bit better too because you're only describing
> > the error case rather than the not-error case.
> 
> Yup.
> 
> > > +      </description>
> > > +      <arg name="surface" type="object" interface="wl_surface"
> > > +           summary="the surface to give the role to"/>
> > > +      <arg name="device" type="string" summary="the touch device to calibrate"/>
> > > +      <arg name="cal" type="new_id" interface="weston_touch_calibrator"
> > > +           summary="a new calibrator object"/>
> > > +    </request>
> > > +
> > > +    <request name="save">
> > > +      <description summary="save calibration for a touch device">
> > > +	This request asks the server to save the calibration data for the
> > > +	given touch input device. The server may ignore this request.
> > > +
> > > +	The device string must be one advertised with touch_device event's
> > > +	device argument, otherwise invalid_device error is raised.
> > > +
> > > +	The array must contain exactly six 'float' (the 32-bit floating
> > > +	point format used by the C language on the system) numbers. The numbers
> > > +	form a libinput style 2-by-3 calibration matrix in row-major order.  
> > 
> > 'libinput-style', i.e. hyphenated. but tbh, no need to mention libinput
> > here, just spell out the matrix:
> > """
> >  For a 3D matrix in the form
> >    ( a b c )
> >    ( d e f )
> >    ( 0 0 1 )
> >  the array must contain the values [a, b, c, d, e, f] with a at index 0.
> > """
> 
> Not only that, but it also needs to define the coordinate spaces where
> it applies, and probably also the ordering with other transformations
> as well. Hence I just punted to "how libinput uses it", because
> everything here is specifically designed for libinput.
> 
> I'm also not sure I can actually lay out a matrix so that it will still
> look ok in generated output and doxygen docs generated from that.

@code and @endcode in doxygen map to <pre> (you can use the HTML tag
directly too), or there's @verbatim and @endverbatim. You can do fancy
things with LaTeX like libinput does, but it makes the source pretty awful
to look at (and introduces extra build dependencies).
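
For what it's worth, on the wire it's just six floats in a wl_array, so
client-side it boils down to something like this (untested sketch; the
header and request stub names assume the usual wayland-scanner output for
this XML):

/* Packs a 2x3 calibration matrix
 *   ( a b c )
 *   ( d e f )
 * in row-major order, i.e. [a, b, c, d, e, f], into a wl_array of six
 * floats and sends it with the save request. */
#include <string.h>
#include <wayland-client.h>
#include "weston-touch-calibration-client-protocol.h"

static void
send_calibration(struct weston_touch_calibration *calibration,
                 const char *device, const float m[6])
{
        struct wl_array array;
        float *data;

        wl_array_init(&array);
        data = wl_array_add(&array, 6 * sizeof(float));
        if (!data)
                return;
        memcpy(data, m, 6 * sizeof(float));

        weston_touch_calibration_save(calibration, device, &array);
        wl_array_release(&array);
}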

> > Much easier than figuring out what "row-major order" means.
> 
> It's a standard term.
> 
> > > +      </description>
> > > +      <arg name="device" type="string" summary="the target touch device"/>
> > > +      <arg name="matrix" type="array" summary="the new calibration matrix"/>
> > > +    </request>
> > > +
> > > +    <event name="touch_device">
> > > +      <description summary="advertise a touchscreen input device">
> > > +	When a client binds to weston_touch_calibration, one touch_device
> > > +	event is sent for each touchscreen that is available to be calibrated.
> > > +	These events are not sent later even if new touch devices appear.  
> > 
> > Unclear what "later" means, use "Touch devices added after the initial burst
> > of events will not generate a touch_device event".
> 
> More like:
> 
> Touch devices added in the server will not generate touch_device events
> for existing weston_touch_calibration objects.
> 
> Or simply:
> 
> This is the only time when touch_device events are sent.

Works for me, but I'd even use both ("This is the only time... Touch devices
added..."). Language is ambiguous; reducing ambiguity is always good.

> > Though arguably - why not? Because it makes things easier to implement?
> 
> Yes. There is no event to remove a touch device, and there is no need
> for a client to maintain an accurate list of touch devices.

wfm

> Since there is no event to remove a touch device, there also is no
> protocol race between server removing and the client using one.
> 
> There is still the race of server internally removing a device and a
> client starting to use it after touch_device event, but that could
> be fixed by having the server remember what devices it advertised and
> if the client uses a deleted device, just ignore the saving or
> immediately cancel the calibration. This is not implemented in this
> patch series, though.
> 
> > > +
> > > +	An event carries the touch device identification and the associated
> > > +	output or head (display connector) name.
> > > +
> > > +	On platforms using udev, the device identification is the udev DEVPATH.  
> > 
> > do we want to really set this in stone in the protocol? Also, the DEVPATH
> > is without /sys, right? Why not make this an open format and let the client
> > figure it out. It's not hard to strcmp for /sys or /dev/input/event in the
> > client and iirc we have precedence for that in the tablet protocol. or in
> > libinput for tablets, can't remember now :)
> 
> Yes, I wanted to have a clear, unambiguous definition. What udev calls
> the DEVPATH is the path without /sys. We can revise this decision
> if there ever arises a need for anything else, but it's still a Weston
> private protocol for now.
> 
> When would one use /dev paths?

*shrug*, not sure in this case; I just used that as an example. The point
was more that prefixing with /sys makes the string a lot more identifiable
than a bare DEVPATH, not least because you can immediately stat that path
(or ls it, which is important for bug reporters).

Also, udev_device_new_from_syspath() seems to error out if the path doesn't
start with "/sys", so any programmatic user would have to prepend /sys
anyway.

> Btw. are device node and DEVPATH equally racy wrt. device getting
> replaced with a different one?

Not 100% sure. The theory is that if you have the DEVPATH you can get the
devnode through udev, and by then all races should have been resolved. If
you get the device node independently, then you're bound to run into races.

> We could have rules like if it starts with /sys, remove /sys and you
> have the DEVPATH; if it starts with /dev, it's the device node, and so
> on, but I don't see the point. So rather than pretending that I cater
> for other environments, I just rule them out for now.

Sure, it's fine to just support udev; I just question the utility of DEVPATH
in particular. I mean, DEVPATH is /devices/rmi4-00/input/input25/event17 for
my touchpad, but /sys/class/input/event17 gives me the same udev device. Or
/sys/dev/char/.../
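
Anyway, a client that wants to look at the device would end up doing
something like this untested sketch (prepending /sys as discussed; the
helper name is just illustrative):

#include <limits.h>
#include <stdio.h>
#include <string.h>
#include <libudev.h>

/* Resolve a touch_device DEVPATH (advertised without the /sys prefix)
 * to its device node via libudev. */
static char *
devnode_from_devpath(const char *devpath)
{
        struct udev *udev;
        struct udev_device *dev;
        char syspath[PATH_MAX];
        char *node = NULL;

        snprintf(syspath, sizeof(syspath), "/sys%s", devpath);

        udev = udev_new();
        if (!udev)
                return NULL;

        dev = udev_device_new_from_syspath(udev, syspath);
        if (dev) {
                const char *n = udev_device_get_devnode(dev);

                if (n)
                        node = strdup(n);
                udev_device_unref(dev);
        }

        udev_unref(udev);
        return node;
}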

> > > +      </description>
> > > +      <arg name="device" type="string"
> > > +           summary="the touch device identification"/>
> > > +      <arg name="head" type="string"
> > > +           summary="name of the head or display connector"/>
> > > +    </event>
> > > +  </interface>
> > > +
> > > +  <interface name="weston_touch_calibrator" version="1">
> > > +    <description summary="calibrator surface for a specific touch device">
> > > +      On creation, this object is tied to a specific touch device. The
> > > +      server sends a configure event which the client must obey with the
> > > +      associated wl_surface.
> > > +
> > > +      Once the client has committed content to the surface, the server can
> > > +      grab the touch input device, prevent it from emitting normal touch events,
> > > +      show the surface on the correct output, and relay input events from the
> > > +      touch device via this protocol object.
> > > +
> > > +      Touch events from other touch devices than the one tied to this object
> > > +      must generate wrong_touch events on at least touch-down and must not
> > > +      generate normal or calibration touch events.
> > > +
> > > +      At any time, the server can choose to cancel the calibration procedure by
> > > +      sending the cancel_calibration event. This should also be used if the
> > > +      touch device disappears or anything else prevents the calibration from
> > > +      continuing on the server side.
> > > +
> > > +      If the wl_surface is destroyed, the server must cancel the calibration.
> > > +
> > > +      The touch event coordinates and conversion results are delivered in
> > > +      calibration units. Calibration units are in the closed interval
> > > +      [0.0, 1.0] mapped into 32-bit unsigned integers. An integer can be  
> > 
> > should probably add what 0.0 and 1.0 represent on each axis, i.e. nominal
> > width and height of the device's sensor. Or is this something we need to
> > hide?
> 
> I'm not sure. The underlying assumption is that there is a finite and
> closed range of values a sensor can output, and that is mapped to [0,
> 1]. I don't know what nominal width and height are or can the values
> extend to negative. It might be easier to define the calibration units
> without defining those first.

Sorry, I used "nominal" before I rewrote it to use "sensor", which probably
made it more confusing. On some devices the sensor is larger than the
screen, so you can get coordinates outside the screen area. This is not a
technical problem, just a user-perception mismatch that doesn't need to be
exposed; after all, that's what calibration is about.

Once you apply calibration, though, you can get real negative coordinates,
especially if the device advertises an [n:m] range but then sends events
with values less than n. For example, all synaptics touchpads (pre-2014,
and some after that) do that.

These devices are a problem because we rely on the coordinate range for the
size and then on the calibration on top of that to normalize the input. So
they may need two fixes: one for the range adjustment and one for the
calibration.
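
To illustrate where the negatives come from, assuming the libinput
convention of applying the matrix to coordinates normalized to [0, 1]:

/* m is [a, b, c, d, e, f] in row-major order; x and y are normalized
 * to [0, 1] before calibration. */
static void
apply_calibration(const float m[6], float *x, float *y)
{
        float cx = m[0] * *x + m[1] * *y + m[2];
        float cy = m[3] * *x + m[4] * *y + m[5];

        *x = cx;
        *y = cy;
}

So with e.g. c = -0.1, an input x of 0.05 already ends up at -0.05.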

> It doesn't really matter, the important point is that when the
> resulting matrix is loaded into libinput and then libinput is used
> according to its API, the result is as expected. I'd be tempted to just
> punt all this to libinput.

libinput will sort-of punt it back :)
https://wayland.freedesktop.org/libinput/doc/latest/tablet-support.html#tablet-bounds

 
> Libinput doc for libinput_device_config_calibration_set_matrix() says:
> 
> 	"The translation component (c, f) is expected to be normalized
> 	to the device coordinate range."
> 
> I suppose rephrasing with "the device coordinate range" would be ok?
> But is "device coordinates" well defined here?
> 
> How about adding:
> 
> The calibration units cover the device coordinate range exactly.

yep, works for me.
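
Assuming the [0.0, 1.0] interval maps linearly onto the full uint32_t range
(which is how I read the draft), the conversion on either side is trivial:

#include <stdint.h>

/* Assumes [0.0, 1.0] maps linearly onto [0, UINT32_MAX]. */
static double
calibration_unit_to_double(uint32_t v)
{
        return (double)v / (double)UINT32_MAX;
}

static uint32_t
double_to_calibration_unit(double d)
{
        if (d < 0.0)
                d = 0.0;
        if (d > 1.0)
                d = 1.0;
        return (uint32_t)(d * (double)UINT32_MAX + 0.5);
}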

> > > +
> > > +    <event name="configure">
> > > +      <description summary="surface size">
> > > +	This event tells the client what size to make the surface. The client
> > > +	must obey the size exactly on the next commit with a wl_buffer.
> > > +
> > > +	This event shall be sent once as a response to creating a
> > > +	weston_touch_calibrator object.
> > > +      </description>
> > > +      <arg name="width" type="int" summary="surface width"/>
> > > +      <arg name="height" type="int" summary="surface height"/>
> > > +    </event>
> > > +
> > > +    <event name="cancel_calibration">
> > > +      <description summary="cancel the calibration procedure">
> > > +	This is sent when the server wants to cancel the calibration and
> > > +	drop the touch device grab. The server unmaps the surface, if
> > > +	it was mapped.
> > > +
> > > +	The weston_touch_calibrator object will not send any more events. The
> > > +	client should destroy it.
> > > +      </description>
> > > +    </event>
> > > +
> > > +    <event name="wrong_touch">  
> > 
> > "invalid_device_touch" maybe? 'wrong' feels.... wrong :)
> > also, while explained in the description, wrong_touch could also refer to a
> > touch that's outside the boundary, at the wrong time, etc. All these things
> > one could assume if one doesn't read the docs.
> 
> Yeah, this is for any touch event (most likely touch-down event) that
> cannot be used for the on-going calibration. "unusable_touch"?
> "bad_touch"? :-D
> "incorrect_touch"?

"invalid_touch" then, I'd say. Simply because it only specifies that it
can't be used for whatever reason. Separating invalid from incorrect device
might be good for user interface purposes, but whether it's worth the
effort is a different matter.

Cheers,
   Peter

> > > +      <description summary="user touched a wrong touchscreen">
> > > +	Calibration can only be done on one touch input device at a time. If
> > > +	the user touches another touch device during calibration, the server
> > > +	sends this event to notify the calibration client. The client should
> > > +	show feedback to the user that he touched a wrong screen. This is
> > > +	useful particularly when it is impossible for a user to tell which
> > > +	screen he should touch (when the screens are cloned).
> > > +      </description>
> > > +    </event>
> 
> Thanks,
> pq



