[PATCH weston 21/25] protocol: add weston_touch_calibration

Pekka Paalanen ppaalanen at gmail.com
Tue Apr 10 11:01:10 UTC 2018


On Mon, 9 Apr 2018 12:54:43 +1000
Peter Hutterer <peter.hutterer at who-t.net> wrote:

> On Fri, Mar 23, 2018 at 02:01:01PM +0200, Pekka Paalanen wrote:
> > From: Pekka Paalanen <pekka.paalanen at collabora.co.uk>
> > 
> > This is a Wayland protocol extension to allow the calibration of
> > touchscreens in Weston.
> > 
> > See: https://phabricator.freedesktop.org/T7868
> > 
> > Signed-off-by: Pekka Paalanen <pekka.paalanen at collabora.co.uk>
> > ---
> >  Makefile.am                           |   5 +-
> >  protocol/weston-touch-calibration.xml | 320 ++++++++++++++++++++++++++++++++++
> >  2 files changed, 324 insertions(+), 1 deletion(-)
> >  create mode 100644 protocol/weston-touch-calibration.xml

> > +  <interface name="weston_touch_calibration" version="1">
> > +    <description summary="weston touchscreen calibration interface">
> > +      This is the global interface for calibrating a touchscreen input
> > +      coordinate transformation. It is recommended to make this interface
> > +      privileged.
> > +
> > +      This interface can be used by a client to show a calibration pattern and
> > +      receive uncalibrated touch coordinates, facilitating the computation of
> > +      a calibration transformation that will align actual touch positions
> > +      on screen with their expected coordinates.
> > +
> > +      Immediately after being bound by a client, the server sends the
> > +      touch_device events.  
> 
> s/server/compositor/, in a few more places

I'm a bit torn there. I tried to be consistent with "server", but one
"compositor" still remained. There are a couple dozen instances of "server".

Is your point that all Wayland spec language should be using
"compositor" to talk about the server-side?


> > +    <request name="create_calibrator">
> > +      <description summary="give the calibrator role to a surface">
> > +	This gives the calibrator role to the surface and ties it with the
> > +	given touch input device.
> > +
> > +	The surface must not already have a role, otherwise invalid_surface
> > +	error is raised.
> > +
> > +	The device string must be one advertised with touch_device event's
> > +	device argument, otherwise invalid_device error is raised.
> > +
> > +	There must not exist a weston_touch_calibrator protocol object in the
> > +	server already, otherwise already_exists error is raised. This
> > +	limitation is server-wide and not specific to any particular client.  
> 
> Personal preference: I find it easier to understand when a sentence is "if
> blah then error E". As opposed to "if not blah, otherwise E". It also
> narrows down the error cases a bit better too because you're only describing
> the error case rather than the not-error case.

Yup.

> > +      </description>
> > +      <arg name="surface" type="object" interface="wl_surface"
> > +           summary="the surface to give the role to"/>
> > +      <arg name="device" type="string" summary="the touch device to calibrate"/>
> > +      <arg name="cal" type="new_id" interface="weston_touch_calibrator"
> > +           summary="a new calibrator object"/>
> > +    </request>
> > +
> > +    <request name="save">
> > +      <description summary="save calibration for a touch device">
> > +	This request asks the server to save the calibration data for the
> > +	given touch input device. The server may ignore this request.
> > +
> > +	The device string must be one advertised with touch_device event's
> > +	device argument, otherwise invalid_device error is raised.
> > +
> > +	The array must contain exactly six 'float' (the 32-bit floating
> > +	point format used by the C language on the system) numbers. The numbers
> > +	form a libinput style 2-by-3 calibration matrix in row-major order.  
> 
> 'libinput-style', i.e. hyphenated. but tbh, no need to mention libinput
> here, just spell out the matrix:
> """
>  For a 3D matrix in the form
>    ( a b c )
>    ( d e f )
>    ( 0 0 1 )
>  the array must contain the values [a, b, c, d, e, f] with a at index 0.
> """

Not only that, but it also needs to define the coordinate spaces where
it applies, and probably the ordering relative to other transformations
as well. Hence I just punted to "how libinput uses it", because
everything here is specifically designed for libinput.

I'm also not sure I can actually lay out a matrix so that it will still
look ok in generated output and doxygen docs generated from that.

> Much easier than figuring out what "row-major order" means.

It's a standard term.

> > +      </description>
> > +      <arg name="device" type="string" summary="the target touch device"/>
> > +      <arg name="matrix" type="array" summary="the new calibration matrix"/>
> > +    </request>
> > +
> > +    <event name="touch_device">
> > +      <description summary="advertise a touchscreen input device">
> > +	When a client binds to weston_touch_calibration, one touch_device
> > +	event is sent for each touchscreen that is available to be calibrated.
> > +	These events are not sent later even if new touch devices appear.  
> 
> Unclear what "later" means, use "Touch devices added after the initial burst
> of events will not generate a touch_device event".

More like:

Touch devices added in the server will not generate touch_device events
for existing weston_touch_calibration objects.

Or simply:

This is the only time when touch_device events are sent.

> Though arguably - why not? Because it makes things easier to implement?

Yes. There is no event to remove a touch device, and there is no need
for a client to maintain an accurate list of touch devices.

Since there is no event to remove a touch device, there also is no
protocol race between the server removing a device and the client using
it.

There is still the race of the server internally removing a device and a
client starting to use it after the touch_device event, but that could
be fixed by having the server remember which devices it advertised: if
the client uses a deleted device, just ignore the save or immediately
cancel the calibration. This is not implemented in this patch series,
though.

> > +
> > +	An event carries the touch device identification and the associated
> > +	output or head (display connector) name.
> > +
> > +	On platforms using udev, the device identification is the udev DEVPATH.  
> 
> do we want to really set this in stone in the protocol? Also, the DEVPATH
> is without /sys, right? Why not make this an open format and let the client
> figure it out. It's not hard to strcmp for /sys or /dev/input/event in the
> client and iirc we have precedence for that in the tablet protocol. or in
> libinput for tablets, can't remember now :)

Yes, I wanted to have a clear, unambiguous definition. What udev calls
the DEVPATH is the path without /sys. We can revise this decision
if there ever arises a need for anything else, but it's still a Weston
private protocol for now.

When would one use /dev paths?

Btw, are the device node and the DEVPATH equally racy wrt. the device
getting replaced with a different one?

We could have rules like if it starts with /sys, remove /sys and you
have the DEVPATH; if it starts with /dev, it's the device node, and so
on, but I don't see the point. So rather than pretending that I cater
for other environments, I just rule them out for now.

> > +      </description>
> > +      <arg name="device" type="string"
> > +           summary="the touch device identification"/>
> > +      <arg name="head" type="string"
> > +           summary="name of the head or display connector"/>
> > +    </event>
> > +  </interface>
> > +
> > +  <interface name="weston_touch_calibrator" version="1">
> > +    <description summary="calibrator surface for a specific touch device">
> > +      On creation, this object is tied to a specific touch device. The
> > +      server sends a configure event which the client must obey with the
> > +      associated wl_surface.
> > +
> > +      Once the client has committed content to the surface, the server can
> > +      grab the touch input device, prevent it from emitting normal touch events,
> > +      show the surface on the correct output, and relay input events from the
> > +      touch device via this protocol object.
> > +
> > +      Touch events from other touch devices than the one tied to this object
> > +      must generate wrong_touch events on at least touch-down and must not
> > +      generate normal or calibration touch events.
> > +
> > +      At any time, the server can choose to cancel the calibration procedure by
> > +      sending the cancel_calibration event. This should also be used if the
> > +      touch device disappears or anything else prevents the calibration from
> > +      continuing on the server side.
> > +
> > +      If the wl_surface is destroyed, the server must cancel the calibration.
> > +
> > +      The touch event coordinates and conversion results are delivered in
> > +      calibration units. Calibration units are in the closed interval
> > +      [0.0, 1.0] mapped into 32-bit unsigned integers. An integer can be  
> 
> should probably add what 0.0 and 1.0 represent on each axis, i.e. nominal
> width and height of the device's sensor. Or is this something we need to
> hide?

I'm not sure. The underlying assumption is that there is a finite and
closed range of values a sensor can output, and that is mapped to [0,
1]. I don't know what the nominal width and height are, or whether the
values can extend to negative. It might be easier to define the
calibration units without defining those first.

It doesn't really matter; the important point is that when the
resulting matrix is loaded into libinput and libinput is then used
according to its API, the result is as expected. I'd be tempted to just
punt all this to libinput.

Libinput doc for libinput_device_config_calibration_set_matrix() says:

	"The translation component (c, f) is expected to be normalized
	to the device coordinate range."

I suppose rephrasing with "the device coordinate range" would be ok?
But is "device coordinates" well defined here?

How about adding:

The calibration units cover the device coordinate range exactly.



> > +      converted into a real value by dividing by 2^32-1. A calibration matrix
> > +      must be computed from the [0.0, 1.0] real values, but the matrix elements
> > +      do not need to fall into that range.
> > +    </description>
> > +
> > +    <enum name="error">
> > +      <description summary="calibrator object errors"/>
> > +      <entry name="bad_size" value="0"
> > +             summary="surface size does not match"/>
> > +      <entry name="not_mapped" value="0"
> > +             summary="requested operation is not possible without mapping the surface"/>
> > +    </enum>
> > +
> > +    <request name="destroy" type="destructor">
> > +      <description summary="destroy the calibrator">
> > +	This unmaps the surface if it was mapped. The input device grab
> > +	is dropped, if it was present. The surface loses its role as a
> > +	calibrator.
> > +      </description>
> > +    </request>
> > +
> > +    <request name="convert">
> > +      <description summary="convert from surface to raw coordinates">
> > +	This request asks the server to convert the surface-local coordinates
> > +	into the expected touch input coordinates appropriate for the
> > +	associated touch device.
> > +
> > +	The coordinates given as arguments to this request are relative to
> > +	the associated wl_surface.
> > +
> > +	If a client asks for conversion before it has committed valid
> > +	content to the wl_surface, the not_mapped error is raised.
> > +
> > +	If the server has cancelled the calibration, the conversion result
> > +	shall be zeroes.
> > +
> > +	The intention is that a client uses this request to convert marker
> > +	positions that the user is supposed to touch during calibration.
> > +      </description>
> > +      <arg name="x" type="int" summary="surface-local X coordinate"/>
> > +      <arg name="y" type="int" summary="surface-local Y coordinate"/>
> > +      <arg name="reply" type="new_id" interface="weston_touch_coordinate"
> > +           summary="object delivering the result"/>
> > +    </request>  
> 
> in theory, you could use this to figure out where on the screen you are
> which is something we've avoided elsewhere. Since it's very confined I'm not
> sure you could do anything with that information but is this something we
> need to worry about?

I wouldn't worry about it. The global interface is intended to be
privileged anyway, as it practically allows a client to grab a physical
touch device and overwrite calibration matrices. One can retrieve [0,
1] domain coordinates of the output the calibration window happens to
be on, so you can find out where the calibration window is, but then
again, the calibration window is quite likely fullscreen on the output
you chose in the touch device list. You cannot transfer the coordinate
mapping to any other window either.

So if you get far enough to actually use this request, you already have
a window covering a specific output and likely being top-most. Even
then, if the compositor uses something like a 1.5 output scaling
factor, you probably cannot transform your coordinates into physical
output coordinates. The global compositor coordinate system is not
exposed at all; these are per-output coordinates.

> > +
> > +    <event name="configure">
> > +      <description summary="surface size">
> > +	This event tells the client what size to make the surface. The client
> > +	must obey the size exactly on the next commit with a wl_buffer.
> > +
> > +	This event shall be sent once as a response to creating a
> > +	weston_touch_calibrator object.
> > +      </description>
> > +      <arg name="width" type="int" summary="surface width"/>
> > +      <arg name="height" type="int" summary="surface height"/>
> > +    </event>
> > +
> > +    <event name="cancel_calibration">
> > +      <description summary="cancel the calibration procedure">
> > +	This is sent when the server wants to cancel the calibration and
> > +	drop the touch device grab. The server unmaps the surface, if
> > +	it was mapped.
> > +
> > +	The weston_touch_calibrator object will not send any more events. The
> > +	client should destroy it.
> > +      </description>
> > +    </event>
> > +
> > +    <event name="wrong_touch">  
> 
> "invalid_device_touch" maybe? 'wrong' feels.... wrong :)
> also, while explained in the description, wrong_touch could also refer to a
> touch that's outside the boundary, at the wrong time, etc. All these things
> one could assume if one doesn't read the docs.

Yeah, this is for any touch event (most likely a touch-down event) that
cannot be used for the ongoing calibration. "unusable_touch"?
"bad_touch"? :-D
"incorrect_touch"?

> > +      <description summary="user touched a wrong touchscreen">
> > +	Calibration can only be done on one touch input device at a time. If
> > +	the user touches another touch device during calibration, the server
> > +	sends this event to notify the calibration client. The client should
> > +	show feedback to the user that he touched a wrong screen. This is
> > +	useful particularly when it is impossible for a user to tell which
> > +	screen he should touch (when the screens are cloned).
> > +      </description>
> > +    </event>

Thanks,
pq