RFC: multitouch support v2
Kristian Høgsberg
krh at bitplanet.net
Thu Dec 22 07:53:30 PST 2011
2011/12/22 Chase Douglas <chase.douglas at canonical.com>:
> On 12/22/2011 07:15 AM, Kristian Høgsberg wrote:
>> On Thu, Dec 22, 2011 at 1:45 AM, Chase Douglas
>> <chase.douglas at canonical.com> wrote:
>>> On 12/21/2011 09:34 AM, Tiago Vignatti wrote:
>>>> From: Tiago Vignatti <tiago.vignatti at intel.com>
>>>>
>>>> Hi,
>>>>
>>>> Following Kristian's suggestions, I updated the patchset with the following:
>>>> - the driver now accumulates input coordinates to send along with touch_down
>>>> - updated the protocol touch_down event with a surface field, meaning the
>>>> focus surface of the touch device (see the sketch below)
>>>> - the compositor now uses a touch_focus pointer (self-explanatory), which is
>>>> picked when the first finger goes down; all further events go there until it
>>>> is released
>>>> - not doing pointer emulation for now; that will come next.
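As a rough illustration of that updated touch_down event, a client-side
handler could look like the sketch below. The listener signature, the
16-touch cap, and the struct names are assumptions based on the
description above, not the actual patch:

#include <stdint.h>

struct wl_input_device;               /* opaque, from wayland-client.h */
struct wl_surface;                    /* opaque, from wayland-client.h */

struct client_state {
        struct wl_surface *touch_focus;
        int32_t touch_x[16], touch_y[16];
};

/* Hypothetical callback for the updated touch_down event; the surface
 * argument carries the focus surface picked by the compositor. */
static void
touch_handle_down(void *data, struct wl_input_device *device,
                  uint32_t time, int32_t id,
                  struct wl_surface *surface, int32_t x, int32_t y)
{
        struct client_state *state = data;

        /* The compositor picked this surface when the first finger
         * went down; all further events go there until release. */
        state->touch_focus = surface;
        if (id >= 0 && id < 16) {
                state->touch_x[id] = x;
                state->touch_y[id] = y;
        }
}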
>>>
>>> Do we really want pointer emulation in the window server? I can tell you
>>> from first-hand experience it's a nightmare. Toolkits should be updated
>>> to handle touch events properly, with an option to receive touch events
>>> and emulate pointer events for applications that aren't ready for touch
>>> event handling.
>>
>> I don't think we do. I'm not 100% sure yet, which is why I want to
>> focus on just the basic touch events for now. I agree that since you
>> have to port a toolkit to Wayland anyway, you can just do pointer
>> emulation (if you must; real touch support is better, of course) in
>> the toolkit when you port it.
>>
>> The one thing that makes me not quite sure is that client-side pointer
>> emulation won't be able to move the pointer sprite in response to
>> touch point 0 moving. And maybe we don't need that. On the other
>> hand, if the toolkit synthesizes enter/leave events in response to
>> touch events, it's going to be confusing when the actual pointer
>> enters a different surface. It's also possible to make server-side
>> pointer emulation a per-client thing, similar to what Peter did for X.
>> If a client subscribes to touch events, we don't do pointer
>> emulation.
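A rough sketch of that per-client policy on the compositor side; every
type and function here is made up for illustration, not actual Wayland
code:

#include <stdint.h>
#include <stdio.h>

#define BTN_LEFT 0x110                /* from linux/input.h */

struct client {
        int has_touch_listener;       /* did the client bind touch events? */
};

/* Stubs standing in for the real protocol marshalling. */
static void send_touch_down(struct client *c, uint32_t time, int32_t id,
                            int32_t x, int32_t y)
{
        printf("touch_down id=%d at %d,%d\n", id, x, y);
}

static void send_pointer_motion(struct client *c, uint32_t time,
                                int32_t x, int32_t y)
{
        printf("pointer motion to %d,%d\n", x, y);
}

static void send_pointer_button(struct client *c, uint32_t time,
                                uint32_t button, uint32_t state)
{
        printf("pointer button %x %s\n", button, state ? "down" : "up");
}

/* Per-client policy: a client that subscribed to touch gets the real
 * event; everyone else gets a pointer stream driven by touch point 0. */
static void
deliver_touch_down(struct client *client, uint32_t time,
                   int32_t id, int32_t x, int32_t y)
{
        if (client->has_touch_listener) {
                send_touch_down(client, time, id, x, y);
        } else if (id == 0) {
                send_pointer_motion(client, time, x, y);
                send_pointer_button(client, time, BTN_LEFT, 1);
        }
}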
>
> There's a niggle there. If a client selects for touch and pointer
> events, it will only receive touch events. However, if a client grabs
> touch and pointer events through a passive grab, the touch grab is
> handled first, and then the pointer grab second if the touch grab is
> rejected.
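If I follow, that ordering is roughly the logic below; the types and
names are hypothetical, not taken from X or Wayland:

/* A passive touch grab is offered the sequence first; the pointer
 * grab runs only if the touch grab rejects it. */
struct touch_event;

struct grab {
        /* Returns nonzero if the grab accepts (owns) the sequence. */
        int (*handle)(struct grab *grab, struct touch_event *ev);
};

static void
dispatch_passive_grabs(struct grab *touch_grab,
                       struct grab *pointer_grab,
                       struct touch_event *ev)
{
        if (touch_grab && touch_grab->handle(touch_grab, ev))
                return;     /* accepted: the pointer grab never runs */

        if (pointer_grab)
                pointer_grab->handle(pointer_grab, ev);
}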
>
> I don't know Wayland's protocol yet, but shouldn't enter/leave events
> have some kind of device identifier in them? I would think that should
> alleviate any client-side confusion.
I don't think so. To be clear, the problem I'm thinking of is where
the toolkit does select for touch events, but only to do client-side
pointer emulation in the toolkit. What should a client do when the
pointer is hovering over a button in one window and it then receives
a touch down in another window? The toolkit only maintains one
pointer focus (which is currently in the first window), so what
happens when you receive touch events in a different window? What
kind of pointer events do you synthesize? We can't move the system
pointer to match the touch position.
I guess you could synthesize a leave event for the window the pointer
is in, but remember that window and position. Then synthesize an enter
event for the window receiving the touch events and send button down
and motion events, etc. Then, when the touch session is over (all
touch points up), the toolkit synthesizes an enter event for the
window and position the pointer is actually in.
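Something like this toolkit-side sketch, where every name is made up
for illustration:

#include <stdint.h>
#include <stdio.h>

struct window {
        const char *name;
};

struct emu_state {
        struct window *pointer_window;   /* where the real pointer hovers,
                                          * tracked from real enter/motion
                                          * events (not shown) */
        int32_t pointer_x, pointer_y;    /* its remembered position */
        struct window *touch_window;     /* window owning the touch session */
        int active_touches;
};

/* Stubs for the pointer events the toolkit would synthesize for its
 * own widgets. */
static void synth_enter(struct window *w, int32_t x, int32_t y)
{
        printf("enter %s at %d,%d\n", w->name, x, y);
}

static void synth_leave(struct window *w)
{
        printf("leave %s\n", w->name);
}

static void synth_button(struct window *w, int pressed)
{
        printf("button %s in %s\n", pressed ? "down" : "up", w->name);
}

static void
emu_touch_down(struct emu_state *s, struct window *w, int32_t id,
               int32_t x, int32_t y)
{
        if (s->active_touches++ == 0) {
                s->touch_window = w;
                if (w != s->pointer_window) {
                        /* Pretend the pointer left the window it really
                         * hovers over, remembering where it was. */
                        synth_leave(s->pointer_window);
                        synth_enter(w, x, y);
                }
        }
        if (id == 0)
                synth_button(w, 1);
}

static void
emu_touch_up(struct emu_state *s, int32_t id)
{
        if (id == 0)
                synth_button(s->touch_window, 0);
        if (--s->active_touches == 0 &&
            s->touch_window != s->pointer_window) {
                /* Session over: move the synthetic focus back to where
                 * the real pointer actually is. */
                synth_leave(s->touch_window);
                synth_enter(s->pointer_window, s->pointer_x, s->pointer_y);
        }
}

The remembered window and position are what let the toolkit restore a
consistent pointer focus once the last finger goes up.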
Kristian