RFC: multitouch support v2
Chase Douglas
chase.douglas at canonical.com
Thu Dec 22 11:40:51 PST 2011
On 12/22/2011 08:59 AM, Kristian Høgsberg wrote:
> 2011/12/22 Chase Douglas <chase.douglas at canonical.com>:
>> On 12/22/2011 07:53 AM, Kristian Høgsberg wrote:
>>> 2011/12/22 Chase Douglas <chase.douglas at canonical.com>:
>>>> I don't know wayland's protocol yet, but shouldn't enter/leave events
>>>> have some kind of device identifier in them? I would think that should
>>>> alleviate any client-side confusion.
>>>
>>> I don't think so. To be clear, the problem I'm thinking of is where
>>> the toolkit does select for touch events, but only to do client side
>>> pointer emulation in the toolkit. What should a client do in case the
>>> pointer is hovering over a button in one window, when it then receives
>>> a touch down in another window? The toolkit only maintains one
>>> pointer focus (which is current in that other window), and what
>>> happens when you receive touch events in a different window? What
>>> kind of pointer events do you synthesize? We can't move the system
>>> pointer to match the touch position.
>>
>> In X we move the cursor sprite to the first touch location, always. This
>> is because you have moved the master pointer, so the sprite needs to be
>> in sync with the master pointer location.
>
> How do you move the sprite without doing pointer emulation? If the
> sprite enters a window, you have to send enter/leave events, and
> motion events as it moves around.
I believe we do send enter/leave events for pointer emulated touch
events in all cases.
> When I say that I don't know if we
> need pointer emulation, I mean that there is no sprite associated with
> the touch events, there are no enter/leave events or buttons events.
> When you touch a surface, you only get a touch_down event, then
> touch_motion and then touch_up.
Sounds right for Wayland.
>> Off the top of my head, I would think Wayland should automatically
>> create the equivalent of X master pointer devices for each touchscreen
>> device. There shouldn't be a sprite for touchscreens, though the WM
>> could do fancy effects like MS Surface if you wanted it to.
>
> Right... in the MPX sense, right? So you could have a keyboard and
> mouse combo controlling one pointer/kb focus and the touch screen
> being its own master device. Then maybe you could have one person
> using the touch screen UI, and another person using the kb/mouse
> combo. That's kind of far fetched, of course, but I think the main
> point is that there's no inherent association between a kb/mouse combo
> and a touch screen. On the other hand, what about a setup with two
> mouse/kb combos (master devices) and a touch screen... you'd expect
> tapping a window on the touch screen to set kb focus, but if you have
> multiple master kbs, which kb focus do you set? Maybe we're just
> doomed for trying to make both pointer and direct touch interaction
> work in the same UI.
In the past I've tried to think of a good solution for this. I haven't
had enough time to come up with one yet :). I wouldn't advocate holding
up all of Wayland to get this right, but I do think it needs to be
rethought from the ground up long term.
One possibility:
Have one "logical" pointing device for all relative input devices, and
one "logical" pointing device for each absolute input device. Then have
a configurable mapping of logical pointing devices to logical
keyboard devices. The main difference between X and this approach is
essentially the default policy. In X, by default there is only one
master pointer and everything is attached to it.
BTW, when I think about solutions to this issue, the first question I
ask myself is whether it would work for a device like the MS Surface
table where there's a different person on each side. Assume the surface
is subdivided into four virtual input devices, so touching on your side
of the table is a different device than touching on someone else's side.
Then I imagine each person has their own keyboard. With one display
server, everyone should be able to control their own "plot" of screen
area on their side of the table.
The reason I think of this particular use case is because I think there
is a strong possibility of it happening in the next 10 years.
-- Chase
More information about the wayland-devel mailing list