protocol questions

Daniel Stone daniel at fooishbar.org
Sat Mar 30 04:56:13 PDT 2013


Hi,

On 30 March 2013 05:31, Matthias Clasen <matthias.clasen at gmail.com> wrote:

> Here are a few questions/observations I had while studying the protocol
> docs:
>
> - The use of serials in events seems a bit inconsistent. Most
> wl_pointer events have serials, but axis doesn't. wl_keyboard
> enter/leave events do. wl_data_offer.enter does, but the corresponding
> leave/motion events don't. Is there a rationale for this ?
>

Yes: serials are attached to events which can be used to trigger further
requests, e.g. setting the pointer cursor, launching a popup, starting a
drag, etc.  This is not something you tend to do from scroll (axis) or data
motion events.


> - Various input events have a time field. The spec doesn't really say
> anything about this. What is it good for, and what units are these -
> monotonic time ?
>

Millisecond timestamps, (ideally) monotonic, in an undefined domain, i.e.
they're only meaningful in relation to each other.


> - It looks like I can't trigger a popup from a key or touch event,
> because set_popup requires a serial that corresponds to an implicit
> pointer grab. That is sad, I like the menu key...
>

Yeah, that'd be great to fix!


> - Still on popups, I don't see a way for the client to dismiss the
> popup, or is that handled by just destroying the surface ?
>

Indeed: just destroy the surface, or attach a NULL buffer and commit.


> - Buffer transformations - fun. How do these relate to each of the
> following ?
>    - resize edges
>    - transient offset
>    - buffer attach x/y
>    - input/opaque/damage regions
>    - surface x/y in motion events
>

All the latter occur on surfaces rather than buffers, so they are
unaffected.  Buffer transforms are meant to support situations where your
screen is rotated 90° and your client can render rotated as well, in order
to avoid that extra blit.  So it doesn't affect the event pipeline at all,
only the display pipeline.


> - What is a wl_touch.frame event ? Weston doesn't seem to generate those...
>

It's meant to indicate a natural boundary between touch events, analogous
to a full EV_SYN in evdev.  So you'd send touch events for every finger
down, followed by frame, at which point you could perform gesture
processing on the complete set.


> - The wl_pointer interface seems to be a bit weak wrt to device
> properties. I would at least expect to learn about the number of
> buttons and right-handed vs left-handed, etc.
>

wl_pointer is an aggregation of all pointer devices, not a single mouse, so
we can't necessarily expose a sensible number of buttons.  For right- vs.
left-handed, I'd expect the compositor to do the swap so clients never have
to worry about it.  If you want to expose that configuration, it should
happen through a private protocol.

Cheers,
Daniel

