[Spice-devel] [Qemu-devel] paravirtual mouse/tablet

Gerd Hoffmann kraxel at redhat.com
Thu Jan 20 03:54:07 PST 2011


On 01/20/11 07:25, Peter Hutterer wrote:
> Hi guys,
>
> I apologize for replying this way, I wasn't on the spice-list, jrb pointed
> out this thread to me. For those who don't know me, I'm the input maintainer
> for X.Org. I also know very little about spice, so please take the comments
> below accordingly. Comments regarding a few things that showed up in this
> thread, slightly out of order because I read through the web-interface:

Thanks for your input.  I think I'll start with some background 
information ...

Mouse handling in virtual machines is a bit tricky.  The guest's display 
is usually some window on the host machine.  The mouse cursor is at some 
position within that window, and it works best when we can pass through 
that absolute position to the guest.  The device needed for that is 
something which doesn't exist as real hardware.  It is like a mouse, but 
with absolute instead of relative coordinates.  Or like a tablet without 
a pen.

Usually a virtual USB tablet device is used to handle this.  The problem 
is that USB emulation is quite CPU-expensive for hardware design 
reasons, so we are looking for a better way.  The idea is to use a 
virtio-serial channel, which is basically a bidirectional stream (like 
a unix socket) between host and guest, and run some to-be-designed 
protocol there.

The spice-specific issue here is that spice supports multihead, i.e. you 
have two displays in the guest and two windows on the host, and mouse 
positions are reported as (x,y,window).  The question is how to handle 
this best ...
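One way to handle it would be to simply let each event carry the head 
index along with the position, so the guest knows which display the 
coordinates refer to.  Again just an idea, the field name is made up:

    uint16_t display;   /* index of the guest head the (x,y) position is on */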

> = Multiple Devices =
> X has supported multiple independent pointer/keyboard devices. Not just
> physical devices, multiple cursors since late 2009 and it's now standard on
> any up-to-date linux distribution (including RHEL6). Not having device
> identifiers in the core protocol was one of our greatest regrets with input
> handling.
>
> X abstracts into so-called master devices and slave devices. The default
> setup is to have all slave devices (==physical devices) send events through
> the first master device pair (first cursor and first keyboard focus). This
> can be reassigned at runtime, so one can create a second cursor and assign
> any device to this. The cursors can be used simultaneously. Same for
> keyboards, including gimmicks such as different layouts.

How can I configure this btw?

> = Mouse Wheel =
> Mouse wheel is buttons 4,5,6,7 in X by convention, but that is not true in
> other systems.

Hmm, isn't that the case even for (at least some) hardware?  With both 
of my wheel mice the wheel moves forward in steps ...

> Also, there's devices that don't use PS/2 but provide scrollwheels (Wacom
> ArtPen for example). These devices have higher resolution scrollwheels that
> cannot map easily into buttons without getting awkward UI effects.

Yeah, I see, using buttons to signal that certainly doesn't work very well.

> Likewise, touchpads these days have pressure, tablets have multiple axes
> such as pressure, distance, tilt, and rotation. Two axes is not enough for
> anything but a standard mouse.

Ok.

> labelling. Having said that, I don't think 32 buttons are enough.
> Why not just send the button state as it changes?

X events carry both the button pressed/released and the mask of 
currently pressed buttons, which I tried to mimic.  The mask is 
convenient although redundant.  Removing it would kill the 32-button 
limit ;)
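So a button event could just carry the button number plus its new state 
instead, something like this (sketch only, names made up):

    /* per-button change events, no global state mask, so there is
     * no fixed upper limit on the number of buttons any more */
    #include <stdint.h>

    struct pv_button_event {
        uint32_t button;   /* button number (or label, see below) */
        uint8_t  state;    /* 1 = pressed, 0 = released */
    };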

How do you label the buttons?  Is there an enum?  Or simply strings?

> Note that there's devices that are both pointer and multitouch devices
> (Apple's MagicMouse and MagicTrackpad).

Ok, I understand that for the mouse.  But for the pad?  Isn't there just 
a surface to touch and nothing else?  Does the device behave differently 
depending on how many fingers it detects on the surface?

> = Pressure =
> Pressure is not a binary state. All devices that I've had to deal with so
> far have a pressure range (usually client-configured) and then events happen
> based on passing that threshold (+ a tolerance range, if applicable).

Ok.  I think for the virtual hardware it is just fine to report the 
pressure to the guest and let the guest handle the interpretation of the 
data (i.e. should that be a mouse click or not, depending on the 
threshold and maybe other data).
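I.e. the virtual device just forwards the raw pressure axis and the 
guest driver decides what to make of it, roughly along these lines 
(threshold handling completely made up):

    /* guest side, sketch only: derive a button state from a raw
     * pressure reading using a guest-configured threshold */
    #include <stdbool.h>
    #include <stdint.h>

    static bool pressure_to_click(uint32_t pressure, uint32_t threshold)
    {
        return pressure >= threshold;
    }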

> This goes as far as auto-adjusting the threshold to accommodate worn styli
> tips as we do in the wacom driver.
>
> It's not quite as simple as Alex wrote:
> "Touching means:
>      Touchpad: movement of cursor
>      Tablet: pressing down a pen"

I don't see any reason to distinguish between a tablet and a touchpad 
(from the virtual hardware point of view).  Just pass on the 
information we have: position, pressure, ...

The guest may process the data in different ways of course.

> We use the tool type identifier for this as there are devices that are
> _both_ touch and tablet devices (e.g. ISDV4 serial tablets).

i.e. the device can figure out whether you used the finger or the pen?

> I think that's enough for now. If someone is happy to fill me in on the
> details of what exactly you're trying to do, I'm happy to help with the
> architectural decisions.

cheers,
   Gerd


