New approach to multitouch using DIDs and bitmasked events

Chase Douglas chase.douglas at canonical.com
Mon Jul 5 19:36:20 PDT 2010


On Tue, 2010-07-06 at 11:33 +1000, Peter Hutterer wrote:
> On Fri, Jul 02, 2010 at 09:09:48PM -0400, Chase Douglas wrote:
> > I hacked some more today, and I'm not sure we need a separate XI2 event
> > type. I'm going to send out another RFC "pull request" with the work
> > soon, but as an overview I wanted to work on the "magicmouse" scenario.
> > The result is the following:
> > 
> > MT Touchscreen:
> >   - Device is marked as a DID
> >   - Starts out floating, can't be attached
> >   - MT touch coordinates are the first two valuators, XIDeviceEvents are
> >     in screen coordinates. However, an app can grab XIRawEvents for
> >     device coordinates. *
> >   - Since it's a DID, button events are also blocked for core, XI1
> > 
> > Magic Mouse:
> >   - Device is *not* marked as a DID
> >   - Starts out attached to virtual core master pointer
> >   - First two axes are REL_X, REL_Y
> >   - Device is marked as relative (for core and XI1 events)
> >   - Abs MT axes are marked as absolute
> >     * Can only be sent through XI2 since XI can't handle mixed mode
> >   - Since Abs MT touches are not the first two axes, they are not
> >     translated to screen coordinates
> >   - Abs MT axes sent through XI2 are also sent through attached master
> >   - Button events are not blocked for core, XI1
> > 
> > I am not aware of any use case that is not covered by the above scheme
> > and requires a new XI2 event type. If you know of any, please let me
> > know.
> 
> How will a client know if a device is a DID? XIQueryDevice must return this
> information.
> 
> One of the things I do also expect from DIDs is that the _first_ touchpoint
> is converted into core events. Thus, even on a multitouch device you can
> still interact in a traditional way with apps that won't support touch
> events. Hoping for Tk and others to be updated to support a new device type
> is a bit too optimistic. Worse, it introduces a hard transition between what
> we have now and what we could have, i.e. you force everyone to update or
> otherwise they lose what they have. That's not good IMO; we need a
> transitional solution.
> 
> Anyway. My idea of DIDs was the following:
> the driver creates one master pointer device for a physical touchscreen.
> This master pointer device is controlled by the first touchpoint on the
> screen. At the same time, the driver creates DIDs on-demand as it is being
> used. Each DID stands for one touchpoint and sends DirectEvents or whatever
> they're called. Obviously, that gives you some duplication, but the device
> hierarchy should indicate which device is a DID, so MT-aware apps can just
> ignore pointer events from that master (or not; the magic mouse case may
> need some flag here).
> 
> DIDs are short-lived and don't show up in the hierarchy, only the parent
> device does, indicating that DIDs may be present. Button or key events are
> all routed through the master pointer.
> 
> This is the approach that I think could work, though we won't know until we
> write down the spec. Something I recommend for your approach as well: sit
> down with XI2proto.txt and write the additions and how clients can use them.
> More often than not, you'll likely notice some issues and have to go back to
> the code.

Ok, I'll take a look at this. I have thoughts on how the management of
the DIDs could be done better in the server rather than in the input
module. I'll try to flesh these out into code, and fall back to input
module management if it won't fly.
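
As a strawman for your XIQueryDevice question, I imagine the client-side
check would look roughly like this. Note that XIDirectDevice is a
hypothetical new "use" value; nothing like it exists in the protocol
today:

    #include <X11/Xlib.h>
    #include <X11/extensions/XInput2.h>

    /* Hypothetical new value for XIDeviceInfo.use; the existing
     * values run from XIMasterPointer (1) to XIFloatingSlave (5). */
    #define XIDirectDevice 6

    static int is_did(Display *dpy, int deviceid)
    {
        int ndevices, result = 0;
        XIDeviceInfo *info = XIQueryDevice(dpy, deviceid, &ndevices);

        if (info) {
            result = (info->use == XIDirectDevice);
            XIFreeDeviceInfo(info);
        }
        return result;
    }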

> At the moment I'm wary of the above approach. It may work now but I'm not
> sure it's scalable.
> - what happens to the events if a client has a core grab on the magic mouse?
>   - if the pointer on the magic mouse is confined, do the touch events get
>     sent to clients outside that window?

I see the magic mouse touch surface as more of a modifier to the
pointer. Touches on it do not correspond to screen coordinates, nor to
anything in particular on screen. It should be more like, "If I move my
finger up 5 pixels on the magic mouse, I expect to see the window under
my pointer scroll by 5 pixels." It really needs to be treated separately
from the rest of the MT devices.

> - are enter/leave events sent for the touches? (one reason a new device type
>   is probably better for them)

By new device type, do you just mean an indication that they are not
traditional pointers? Or do you foresee some radical new device type so
we would start to have keyboards, pointers, and DIDs? I'm assuming the
former, but I just want to be sure.

> - XIRawEvents should not be used for anything that's not "this data came out
>   of the driver". You may have scaling, or other transformations etc.
>   applied that doesn't necessarily show up in the raw events.

I'm perfectly fine with that; I only brought it up because you wondered
about other information a client might want to know, like the touchpad
area and where a touch occurred on it. Maybe I misunderstood you?
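
For reference, what I had in mind was just the stock XI2 raw event
selection on the root window, along the lines of:

    #include <X11/Xlib.h>
    #include <X11/extensions/XInput2.h>

    static void select_raw_motion(Display *dpy)
    {
        unsigned char mask[XIMaskLen(XI_RawMotion)] = { 0 };
        XIEventMask evmask;

        XISetMask(mask, XI_RawMotion);
        evmask.deviceid = XIAllDevices;
        evmask.mask_len = sizeof(mask);
        evmask.mask = mask;

        /* Raw events are only delivered to root windows. */
        XISelectEvents(dpy, DefaultRootWindow(dpy), &evmask, 1);
    }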

> - on the MM device, can the core pointer events be sent to different clients
>   than the DID events? If so, how can a client ensure it gets both pointer
>   and DID events (only one client may register for ButtonPress on a window).

Going with my above view on the magic mouse, I don't see using DIDs for
it. Passing the touches as valuators stacked on top of the relative
pointer valuators seems like the best approach to me.
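
To sketch what I mean on the client side, assuming axes 0/1 are
REL_X/REL_Y and the ABS_MT axes are stacked at 2 and up (my assumption,
not anything specified):

    #include <stdio.h>
    #include <X11/extensions/XInput2.h>

    static void handle_motion(XIDeviceEvent *ev)
    {
        double *val = ev->valuators.values;
        int axis;

        /* The values array only holds entries for set mask bits. */
        for (axis = 0; axis < ev->valuators.mask_len * 8; axis++) {
            if (!XIMaskIsSet(ev->valuators.mask, axis))
                continue;
            if (axis < 2)
                printf("pointer axis %d: %f\n", axis, *val);
            else  /* assumed layout: ABS_MT axes after REL_X/REL_Y */
                printf("MT axis %d: %f\n", axis, *val);
            val++;
        }
    }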

> - Enter/Leave events are used to determine where the pointer is currently. Can
>   DID events happen outside of the window defined by preceding enter events?

If they are their own independent devices, why not?

-- Chase