Multitouch followup: gesture recognition?

Florian Echtler floe at butterbrot.org
Tue Mar 23 10:42:22 PDT 2010


Hello Simon,

> > Now, in an Xorg context, I'd very much like to hear your opinions on
> > these concepts. Would it make sense to build this into an X helper
> > library?
> I'm a bit worried that the discussion is inclined towards a library, as
> opposed to a composite-inspired special client. The concepts themselves
> sound fine.
Just for my understanding: when talking about a special client, are you
thinking of something like a (compositing) window manager?

> A library can be done 'right now', since apps are free to do so. It has
> the advantage of a close connection to the consuming app, but also the
> associated disadvantages.
> In particular, how to cope with global gestures, e.g. switching an app
> or backgrounding it? Apparently, such things should be consistent. I
> imagine a desktop environment might want to put up such a special
> client, like they have preference for their WM.
Quite correct; this is a problem my standalone library also has right
now: it currently only supports fullscreen clients properly.

> § A new 'gesture' event gets created, like:
[...]
> A prosaic example would be an app learning that there's a
> "DIRECTED_DRAGGING gesture going on, starting at (200, 100)@70 degrees,
> now being at (300, 100)@95 deg" and use this information to navigate
> within a 3D-view. Also note the omission of (x,y) from the general
> gesture event, since I'd deem it specific. Other gestures may not have a
> primary x,y.
I agree, this is quite similar to the way I have implemented it right now.
This applies, e.g., to a relative-motion gesture, which delivers only a
motion vector.

> § A special gesture client (composite-like)
> This client might receive events as discussed - but all of them - by
> virtue of registering with the server. It analyzes the stream, and
> whenever it thinks something important happened, it tells the server.
> The server then dispatches a corresponding gesture event, according to
> its state and some constraints given by the special client (e.g.
> Florian's event regions, gesture-specific delivery constraints, ...)
> which may not be part of the event as delivered.
What kind of events are you considering here? Could a client generate new
XI events?

> The important point here is that gesture events are asynchronous, so
> there's no need to wait inside the event loop. Gestures correlate to,
> but don't strictly depend on other input events. Their timestamps may
> not be in order for this reason.
Could you elaborate on that a bit more? I fear I'm missing some background
information here.

> I never fully worked this out, so I can't offer a fancy paper, but it
> seems sensible to me. And since anyone's free to do a library, when it
> comes down to something X.org-endorsed, a special client infrastructure
> might be a preferable choice.
I'm very interested in putting a quick hack together to try this out.
However, my knowledge about X internals is somewhat limited. Are things
like custom events possible, and how would a (special) client go about
sending them?

Many thanks for your thoughts,
Florian
-- 
0666 - Filemode of the Beast


