[PATCH weston v2 1/5] protocol: Add _wl_pointer_gestures (swipe/pinch) protocol

Daniel Stone daniel at fooishbar.org
Thu Feb 18 10:11:35 UTC 2016


Hi,

On 18 February 2016 at 10:08, Jonas Ådahl <jadahl at gmail.com> wrote:
> On Thu, Feb 18, 2016 at 10:03:36AM +0000, Daniel Stone wrote:
>> On 31 July 2015 at 14:53, Carlos Garnacho <carlosg at gnome.org> wrote:
>> > On Wed, Jul 29, 2015 at 4:52 AM, Jonas Ådahl <jadahl at gmail.com> wrote:
>> >> On Thu, Jul 23, 2015 at 07:00:27PM +0200, Carlos Garnacho wrote:
>> >>> +    <request name="get_pinch_gesture">
>> >>> +      <description summary="get pinch gesture">
>> >>> +     Create a pinch gesture object. See the
>> >>> +     wl_pointer_gesture_pinch interface for details.
>> >>> +      </description>
>> >>> +      <arg name="id" type="new_id" interface="_wl_pointer_gesture_pinch"/>
>> >>> +      <arg name="pointer" type="object" interface="wl_pointer"/>
>> >>> +    </request>
>> >>> +  </interface>
>>
>> One suggestion I'd have is to register surfaces explicitly, i.e.:
>> <request name="add_surface">
>>   <description summary="register interest for gestures on a particular surface">
>>     Adds the surface to the list of surfaces which will receive
>>     gesture events. Each surface may only be registered on one
>>     wl_gesture_manager object.
>>   </description>
>>   <arg name="surface" type="object" interface="wl_surface"/>
>>   <enum name="gestures" description="gestures the client is interested in"/>
>> </request>
>
> Doesn't feel right and I don't really see the point. I interpret this
> extension as an added feature to *pointer* devices, and it should follow
> pointer focus and not diverge too much from how a pointer works / emits
> events.

Yes, reading back through it, you're right.
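
For my own reference, a rough client-side sketch of how that looks (the
names assume wayland-scanner-generated bindings for the interfaces in this
patch, and I'm guessing there is a get_swipe_gesture request mirroring
get_pinch_gesture; gestures_mgr and pointer are placeholders for objects
bound elsewhere):

/* Sketch only: assumes scanner-generated bindings for the proposed
 * _wl_pointer_gestures interfaces.  gestures_mgr would be bound from
 * the wl_registry and pointer obtained via wl_seat_get_pointer(). */
struct _wl_pointer_gesture_swipe *swipe;
struct _wl_pointer_gesture_pinch *pinch;

swipe = _wl_pointer_gestures_get_swipe_gesture(gestures_mgr, pointer);
pinch = _wl_pointer_gestures_get_pinch_gesture(gestures_mgr, pointer);

/* Gesture events then arrive for whichever surface currently has
 * pointer focus, just like other wl_pointer events, so no per-surface
 * registration is needed. */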

>> >>> +  <interface name="_wl_pointer_gesture_swipe" version="1">
>> >>> +    <description summary="a swipe gesture object">
>> >>> +      A swipe gesture object notifies a client about a multi-finger swipe
>> >>> +      gesture detected on an indirect input device such as a touchpad.
>> >>> +      The gesture is usually initiated by multiple fingers moving in the
>> >>> +      same direction but once initiated the direction may change.
>> >>> +      The precise conditions of when such a gesture is detected are
>> >>> +      implementation-dependent.
>> >>> +
>> >>> +      A gesture consists of three stages: begin, update (optional) and end.
>> >>> +      There cannot be multiple simultaneous pinch or swipe gestures; how
>> >>> +      compositors prevent these situations is implementation-dependent.
>> >>
>> >> There cannot be multiple simultaneous gestures on one seat. There can
>> >> be multiple gestures, but that means they are on different seats.
>> >
>> > Right, that's the idea. On multi-seat, there would be several
>> > wl_pointers, and each could be able to perform gestures individually.
>> > I reworded the docs blurb to be more specific there.
>>
>> That works for touchpads (what I assume this was designed against),
>> but breaks for multitouch touchscreens: there you could be
>> pinching/zooming/scrolling/etc in multiple areas at the same time. I'd
>> prefer to see the individual gestures (swipe/etc) come in as new_id
>> events, which would allow multiple simultaneous gestures.
>
> Gestures for touch screens etc. would, I assume, be done client-side,
> i.e. they wouldn't use this protocol at all.
>
> Now, one could have multiple touchpads and gesture on more than one at
> a time, which I assume would be broken now, but I'm not sure that's a
> reasonable enough use case.

OK, if this isn't intended to apply to touchscreens then I'll drop
that objection.
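
As an aside, the begin/update/end lifecycle quoted above maps onto a
scanner-style listener roughly like this; the event argument lists are my
assumptions for illustration, not what the patch defines:

#include <wayland-client.h>

/* Illustrative only: argument lists are assumed, not taken from the
 * patch.  Only the begin/update/end structure matches the quoted
 * description. */
static void
swipe_begin(void *data, struct _wl_pointer_gesture_swipe *gesture,
            uint32_t serial, uint32_t time,
            struct wl_surface *surface, uint32_t fingers)
{
        /* Gesture recognized on the pointer-focused surface. */
}

static void
swipe_update(void *data, struct _wl_pointer_gesture_swipe *gesture,
             uint32_t time, wl_fixed_t dx, wl_fixed_t dy)
{
        /* Fingers moved; the direction may change mid-gesture, as the
         * description above allows. */
}

static void
swipe_end(void *data, struct _wl_pointer_gesture_swipe *gesture,
          uint32_t serial, uint32_t time)
{
        /* Gesture finished; at most one such gesture is active per
         * seat at any time. */
}

static const struct _wl_pointer_gesture_swipe_listener swipe_listener = {
        swipe_begin,
        swipe_update,
        swipe_end,
};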

Cheers,
Daniel

