How does mouse event synthesis work on touchscreens?

Grósz Dániel groszdanielpub at
Thu Nov 19 04:11:36 UTC 2020


I'm not sure if this is the right place to ask, or where to look this up.
Please give me a rundown (or links) on how mouse event emulation works for
touchscreens under X.

Motivation: I'd like to implement mouse event emulation for touchscreens, such as:
- long tap for right click
- two-finger tap for middle click
- two-finger drag for scrolling
- tap-and-drag for dragging
- perhaps some gesture for relative rather than absolute pointer movement
(like a touchpad) for when more precise cursor movements are desired.

Preferably, this would entail disabling any default mouse event emulation (so
if, for instance, a two-finger touch generates a left click by default, I
need to generate only a middle click, rather than both). Ideally, if an
application specifically handles a given touch event (rather than relying on
the default mouse click emulation), that should take precedence over my
emulation.

Libinput's documentation says that (unlike on touchpads) it doesn't handle 
tapping or gestures on touchscreens, as they are expected to be implemented in 
the toolkit. This is reasonable: for instance, dragging on a touchscreen is
sometimes equivalent to a click-and-drag operation, but in other cases it is
conventionally used to drag content where a mouse drag would select text.
However, this doesn't work well when an application doesn't have adequate 
support for touch input, which is why I'd like to have a gesture to emulate 
scrolling. Conversely, when dragging is used for scrolling, I'd like to have
an alternative gesture that always emulates a mouse drag.

Despite what libinput's docs say, there does seem to be some sort of mouse 
event emulation (only for left clicks and drags), as evidenced by the fact 
that even ancient applications such as xterm or Qt 3 apps that (I presume) 
don't have any specific code to handle touch events do react to touch.

I could imagine several ways it could work:
- X always sends only touch events to applications, and it's up to the
widget toolkit (such as Qt) to synthesize mouse events if the application
doesn't explicitly handle touch. (This doesn't seem to be the case, as even
ancient apps react to touch input. However, on Qt 5, QMouseEvent::source()
returns Qt::MouseEventSynthesizedByQt on mouse events corresponding to
touchscreen touches; see the Qt sketch after this list.)
- X always sends both touch events and emulated mouse left-button events.
It's up to the application or toolkit to figure out whether these belong to
the same user action, and to handle only one of them.
- X sends a touch event first, and the application indicates whether it
handles it. If it doesn't, X then sends an emulated mouse event.
- An application tells X in advance whether it is touchscreen-aware or
not. If it is, X sends only touch events; otherwise it sends only
mouse events.
- X synthesizes mouse events for some touchscreen inputs, and in those cases
it sends only the mouse events; when there is no corresponding mouse
event, it sends touch events. (This doesn't seem to be the case: even
multi-finger touches work as left clicks in applications that don't have
specific touchscreen support, while single taps and drags are not always
handled the same as mouse left-button drags.)
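
To check the Qt part of the first hypothesis, here is a minimal sketch of my
own (assuming Qt >= 5.3, where QMouseEvent::source() exists) that logs
whether a mouse press came from a real mouse or was synthesized from a touch:

#include <QApplication>
#include <QDebug>
#include <QMouseEvent>
#include <QWidget>

// Probe widget: report the source of each mouse press it receives.
class SourceProbe : public QWidget {
protected:
    void mousePressEvent(QMouseEvent *ev) override {
        switch (ev->source()) {
        case Qt::MouseEventSynthesizedByQt:
            qDebug() << "synthesized by Qt from a touch event";
            break;
        case Qt::MouseEventSynthesizedBySystem:
            qDebug() << "synthesized by the windowing system or driver";
            break;
        case Qt::MouseEventSynthesizedByApplication:
            qDebug() << "synthesized by the application itself";
            break;
        default:
            qDebug() << "genuine mouse event";
            break;
        }
        QWidget::mousePressEvent(ev);
    }
};

int main(int argc, char **argv)
{
    QApplication app(argc, argv);
    SourceProbe w;
    w.show();
    return app.exec();
}

Taps on such a widget report Qt::MouseEventSynthesizedByQt here, which is
what makes me suspect the toolkit does at least part of the synthesis.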

My questions:
- Which of these (if any) is correct?
- If mouse event emulation happens somewhere in X, where exactly does it
happen? In the touchscreen driver?
- Is what I want feasible at all? I'm afraid that preventing conflicts with
synthesized left-button events is only theoretically feasible (without
modifying the widget toolkits) if the synthesis always happens in X.
- What's the best way to go about it? Modify the touchscreen driver (such as
libinput or evdev)? Interpret touch events coming from the relevant device in
/dev/input, and create fake mouse events with uinput (a sketch of this
follows below)? Capture touch events using an X client, and fake mouse events
using XTEST?
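
For the uinput route, this is roughly what I have in mind (a rough sketch
only, assuming /dev/uinput is accessible and a kernel recent enough for
UI_DEV_SETUP; error handling mostly omitted): create a virtual pointer
device and inject, say, a middle-button click once a two-finger tap has been
recognized.

/* Hypothetical sketch: inject a middle-button click through uinput,
 * the way a two-finger-tap recognizer might.  Needs access to /dev/uinput. */
#include <cstring>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/uinput.h>

static void emit(int fd, unsigned short type, unsigned short code, int value)
{
    struct input_event ev;
    std::memset(&ev, 0, sizeof(ev));   /* the kernel fills in the timestamp */
    ev.type = type;
    ev.code = code;
    ev.value = value;
    write(fd, &ev, sizeof(ev));
}

int main()
{
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (fd < 0)
        return 1;

    /* Advertise a device that can emit middle-button clicks. */
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, BTN_MIDDLE);

    struct uinput_setup usetup;
    std::memset(&usetup, 0, sizeof(usetup));
    usetup.id.bustype = BUS_USB;
    std::strcpy(usetup.name, "touch-gesture-pointer");   /* made-up name */
    ioctl(fd, UI_DEV_SETUP, &usetup);
    ioctl(fd, UI_DEV_CREATE);

    sleep(1);   /* give the server time to pick up the new device */

    /* Press and release the middle button, e.g. on a two-finger tap. */
    emit(fd, EV_KEY, BTN_MIDDLE, 1);
    emit(fd, EV_SYN, SYN_REPORT, 0);
    emit(fd, EV_KEY, BTN_MIDDLE, 0);
    emit(fd, EV_SYN, SYN_REPORT, 0);

    ioctl(fd, UI_DEV_DESTROY);
    close(fd);
    return 0;
}

The XTEST variant would do the same thing at the X protocol level instead
(again only a sketch; build with -lX11 -lXtst, and it assumes the XTEST
extension is available):

#include <X11/Xlib.h>
#include <X11/extensions/XTest.h>

int main()
{
    Display *dpy = XOpenDisplay(nullptr);
    if (!dpy)
        return 1;
    XTestFakeButtonEvent(dpy, 2, True, CurrentTime);    /* press button 2 */
    XTestFakeButtonEvent(dpy, 2, False, CurrentTime);   /* release it */
    XFlush(dpy);
    XCloseDisplay(dpy);
    return 0;
}

Either way, the open question remains how to keep such injected events from
colliding with whatever left-button emulation already happens for the same
touch.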

Thank you for any pointers.
