XI2 pointer emulation in more detail

Daniel Drake dsd at laptop.org
Sat Nov 17 08:39:29 PST 2012


Hi,

Thanks for all the informative blog posts such as this one:
http://who-t.blogspot.com/2011/12/multitouch-in-x-pointer-emulation.html

I'm trying to get my head around pointer emulation and touch-pointer
interaction in a bit more detail. Would appreciate any guidance or
follow-up posts :)

This is all with xserver-1.13.0 patched with the following commits:
Sync TouchListener memory allocation with population in TouchSetupListeners()
Xi: Don't check for TOUCH_END, it's never set
Xi: don't deliver TouchEnd to a client waiting for TouchBegin (#55738)
Xi: Set modifier mask on touch events
Xi: Update the device after delivering the emulated pointer event (#56558)
remove init_event
Update the MD's position when a touch event is received
Don't use GetTouchEvents when replaying events
Don't use GetTouchEvents in EmitTouchEnd
Simplify GetTouchEvents

Following the blog post, I wrote a simple XI2 app that listens for
pointer events but not touch events. Quickly touching and releasing
the screen generates: motion, button press, button release. All have
the XIPointerEmulated flag set.
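
For illustration, the selection looks roughly like this (a minimal
sketch, not the actual test app linked below; error checking omitted):

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int xi_opcode, ev_base, err_base;
    int major = 2, minor = 2;
    unsigned char mask_bits[XIMaskLen(XI_LASTEVENT)] = { 0 };
    XIEventMask mask = { XIAllMasterDevices, sizeof(mask_bits), mask_bits };
    Window win;

    XQueryExtension(dpy, "XInputExtension", &xi_opcode, &ev_base, &err_base);
    XIQueryVersion(dpy, &major, &minor); /* announce XI 2.2 support */

    win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                              0, 0, 300, 300, 0, 0, 0);

    /* Pointer events only - no touch selection, so touches should
     * arrive as emulated motion/button events. */
    XISetMask(mask_bits, XI_Motion);
    XISetMask(mask_bits, XI_ButtonPress);
    XISetMask(mask_bits, XI_ButtonRelease);
    XISelectEvents(dpy, win, &mask, 1);
    XMapWindow(dpy, win);

    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.xcookie.type == GenericEvent &&
            ev.xcookie.extension == xi_opcode &&
            XGetEventData(dpy, &ev.xcookie)) {
            XIDeviceEvent *de = ev.xcookie.data;
            printf("evtype %d, XIPointerEmulated: %s\n", de->evtype,
                   (de->flags & XIPointerEmulated) ? "yes" : "no");
            XFreeEventData(dpy, &ev.xcookie);
        }
    }
}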

Now, when I make the app listen for touch events as well, the
sequence changes to: Motion (with XIPointerEmulated), TouchBegin,
TouchEnd. The fact that I receive a Motion event seems to conflict
with what is written there: "if your client selects for both touch and
pointer events on a window, you will never see the emulated pointer
events."
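
The only change on the client side is selecting the touch event types
in the same mask, roughly like this (again just a sketch):

#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>

/* Select both touch and pointer events on a window; with this mask,
 * per the blog post, the emulated pointer events should no longer
 * be delivered. */
static void select_touch_and_pointer(Display *dpy, Window win)
{
    unsigned char mask_bits[XIMaskLen(XI_LASTEVENT)] = { 0 };
    XIEventMask mask = { XIAllMasterDevices, sizeof(mask_bits), mask_bits };

    XISetMask(mask_bits, XI_Motion);
    XISetMask(mask_bits, XI_ButtonPress);
    XISetMask(mask_bits, XI_ButtonRelease);
    XISetMask(mask_bits, XI_TouchBegin);
    XISetMask(mask_bits, XI_TouchUpdate);
    XISetMask(mask_bits, XI_TouchEnd);
    XISelectEvents(dpy, win, &mask, 1);
}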

Is this a bug? Here is the test app:
http://dev.laptop.org/~dsd/20121117/xitouch.c

I couldn't find any mention in XI2proto.txt of the general idea that
the event stream changes depending on whether touch events are
selected - am I missing something? (If not, I'll send a documentation
patch once my understanding is complete.)



My next question is how touchscreen input should interact with the
mouse cursor.

In GNOME (fallback), when I place the mouse cursor inside a window,
and then touch the screen in another part of the same window, the
touch is registered as expected but the mouse cursor stays put.

This behaviour changes when I add a second window into the mix. If I
place the mouse cursor in one window, then touch into another window,
the mouse cursor jumps to the touch point.

Is there a reason for this difference in behaviour?
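
(To observe the sprite position independently of any event delivery,
I've been polling XIQueryPointer on the client's master pointer - a
rough sketch:

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int deviceid;

    /* Ask the server which master pointer belongs to this client. */
    XIGetClientPointer(dpy, None, &deviceid);

    for (;;) {
        Window root, child;
        double root_x, root_y, win_x, win_y;
        XIButtonState buttons;
        XIModifierState mods;
        XIGroupState group;

        XIQueryPointer(dpy, deviceid, DefaultRootWindow(dpy),
                       &root, &child, &root_x, &root_y, &win_x, &win_y,
                       &buttons, &mods, &group);
        printf("master pointer at %.0f,%.0f\n", root_x, root_y);
        free(buttons.mask); /* XIQueryPointer allocates the button mask */
        sleep(1);
    }
}

)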


In Sugar, the mouse pointer always jumps, even when touching within
the same window that contains the pointer. This doesn't feel like the
right behaviour for the user. The difference from GNOME may be related
to the fact that our touch gesture implementation sets up some grabs
and listens for touches.
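
For reference, the kind of passive grab I mean looks roughly like
this (a simplified sketch, not Sugar's actual code):

#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>

/* Establish a passive touch grab on a window (typically the root),
 * of the kind a gesture recognizer might use. */
static int grab_touches(Display *dpy, Window win)
{
    unsigned char mask_bits[XIMaskLen(XI_LASTEVENT)] = { 0 };
    XIEventMask mask = { XIAllMasterDevices, sizeof(mask_bits), mask_bits };
    XIGrabModifiers mods = { XIAnyModifier, 0 };

    XISetMask(mask_bits, XI_TouchBegin);
    XISetMask(mask_bits, XI_TouchUpdate);
    XISetMask(mask_bits, XI_TouchEnd);

    /* Returns the number of modifier combinations that failed to grab. */
    return XIGrabTouchBegin(dpy, XIAllMasterDevices, win, False,
                            &mask, 1, &mods);
}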

Furthermore, this pointer-jumping behaviour is causing problems in
both Sugar and GNOME. For example, position a window's "close" button
directly on top of another window that contains a hover-sensitive
widget. Touch-press the close button on the on-top window. The on-top
window gets closed, and the implicit move of the mouse cursor then
triggers the hover event on the widget in the window below. This
results in a strange touchscreen user experience and a variety of
problems:
http://bugs.sugarlabs.org/ticket/4068

Any thoughts/comments much appreciated.

Thanks
Daniel

