RFC: multitouch support v2
chase.douglas at canonical.com
Fri Dec 23 00:29:55 PST 2011
On 12/22/2011 06:54 PM, Alex Elsayed wrote:
> Kristian Høgsberg <krh at ...> writes:
>> Right... in the MPX sense, right? So you could have a keyboard and
>> mouse combo controlling one pointer/kb focus and the touch screen
>> being its own master device. Then maybe you could have one person
>> using the touch screen UI, and another person using the kb/mouse
> combo. That's kind of far-fetched, of course, but I think the main
>> point is that there's no inherent association between a kb/mouse combo
>> and a touch screen. On the other hand, what about a setup with two
>> mouse/kb combos (master devices) and a touch screen... you'd expect
>> tapping a window on the touch screen to set kb focus, but if you have
>> multiple master kbs, which kb focus do you set? Maybe we're just
>> doomed for trying to make both pointer and direct touch interaction
>> work in the same UI.
> One use case you seem to be forgetting is that there are mouse-type
> devices like recent Synaptics touchpads that *also* do multitouch.
> Multitouch != touchscreen. One way to solve this might be to make
> touchscreens a pointer device *with no associated keyboard device*,
> or at least none attached to actual hardware. In XInput, you can create
> a new master pair with a real pointer, but only an XTest keyboard. A
> dummy, if you will.
I don't think anyone is forgetting about indirect devices, at least I'm
not :). However, their use scenarios are a bit easier to deal with
because there's no pointer emulation.
We will also want the ability to have a touchscreen "attached" to a real
keyboard. Imagine you have a tablet with a Bluetooth keyboard. Where you
touch should change the focus for the keyboard input.
More information about the wayland-devel mailing list