wl_tablet specification draft

Peter Hutterer peter.hutterer at who-t.net
Mon Jun 30 03:46:46 PDT 2014


On 30/06/2014 20:23, Pekka Paalanen wrote:
> On Mon, 30 Jun 2014 09:33:15 +0300
> Pekka Paalanen <ppaalanen at gmail.com> wrote:
>
>> On Mon, 30 Jun 2014 11:08:35 +1000
>> Peter Hutterer <peter.hutterer at who-t.net> wrote:
>>
>>> On Sat, Jun 28, 2014 at 12:41:33PM +0300, Pekka Paalanen wrote:
>>>> On Fri, 27 Jun 2014 13:04:59 -0700
>>>> Bill Spitzak <spitzak at gmail.com> wrote:
>>>>
>>>>> On 06/26/2014 09:38 PM, Ping Cheng wrote:
>>>>>
>>>>>> In my experience, mapping the whole tablet to a window or a
>>>>>> specific display area is preferred. That is how mapping works on
>>>>>> Windows and Mac too.
>>>>>
>>>>> First, this is *only* when the drawing program wants it to
>>>>> happen. There is some kind of mode switch so the user can use the
>>>>> pen to do things outside the drawing area. When the drawing
>>>>> program is not controlling it, the user wants to be able to use
>>>>> the pen instead of the mouse for all mouse actions.
>>>>>
>>>>> I would also love to see addressed the ability to get "square"
>>>>> movement out of the pad, and to automatically switch to "mouse
>>>>> mode" if the outputs are a significantly different shape from the
>>>>> tablet. Though Linux was by far the worst, all three systems
>>>>> (Linux, OS X, and Windows) fell down badly here, mostly by making
>>>>> it impossible to mode-switch between mouse and tablet mode; on
>>>>> Windows it is also impossible to change the scale of mouse mode.
>>>>> None of them would change how the mapping is done when outputs
>>>>> are added/removed. I believe "limit to one monitor" is not
>>>>> necessary and is only being provided as a work-around for the
>>>>> non-square mappings that should be avoided in a different way.
>>>>>
>>>>> Even though it may sound like it contradicts what I said above,
>>>>> there is no need for "mouse emulation". Wayland clients should
>>>>> all be written to know that they could get pen events just like
>>>>> mouse events, and to handle them the same if they don't want to
>>>>> do anything smarter with the pen.
>>>>
>>>> First you said that...
>>>>
>>>>> Vaguely thinking of this from a Wayland client's pov it seems like what
>>>>> should happen is this:
>>>>>
>>>>> - The pen moves the seat's mouse cursor, always. If more than one
>>>>> cursor is wanted, the pen should be in a different seat. The user
>>>>> is not manipulating more than one device at a time and does not
>>>>> want to see two cursors.
>>>>
>>>> ...and then you said the exact opposite, plus you require the
>>>> broken case where the same hardware events map to multiple
>>>> completely different protocols (wl_pointer *and* tablet).
>>>
>>> that's not necessarily the worst thing, provided it doesn't happen at the
>>> same time. with a "mode toggle" button the compositor could switch between
>>> tablet events and absolute motion on-the-fly.
>>
>> That I completely agree with, but Bill did write "The pen moves the
>> seat's mouse cursor, always." "Always" implies that you sometimes get
>> pointer and tablet events for the same action.
>>
>>> This is a change from how tablets currently work on Linux, but
>>> aside from that it's actually the easiest and cleanest to implement.
>>
>> It sounds like it would work: just use the wl_pointer protocol when
>> the tablet controls the pointer, and the tablet protocol when it is
>> used as a... tablet, and let the compositor switch between the two
>> modes. Still, never both at the same time for the same physical user
>> action, right?
>>
>> That way the tablet protocol could be exclusively for the "whole tablet
>> maps to the client/window/app-custom region", and you can ignore the
>> "tablet maps to whole/all outputs" case as that would be handled by the
>> wl_pointer mode alone. How would that sound?
>
> Some more thoughts...
>
> Switching is just one option, but it has the benefit that if the
> compositor is using wl_pointer appropriately, you can still use
> toolkits that do not explicitly support the tablet protocol.
>
> Another option is to make the tablet protocol completely separate,
> like wl_touch (touchscreen) is. Explicit support would be required in
> toolkits, but at least there would not be any difficult interaction
> problems with wl_pointer et al., so it seems safer.

tbh this is currently my favourite idea, and that's what Lyude is
working on atm. given that wayland is fairly young and has a bigger
emphasis on toolkits (compared to X, anyway), pushing this down to the
client does make sense. toolkits are much more flexible and can use
better abstractions to make tablet events look like pointer events
where needed.
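
as a rough sketch of what such a toolkit-side fallback could look like
(the wl_tablet_tool interface and the event signature below are made up
by me, the draft is still in flux):

#include <stdint.h>
#include <wayland-util.h> /* wl_fixed_t, wl_fixed_to_double() */

/* Hypothetical toolkit-internal fallback: feed draft tablet events
 * into the same code path the toolkit already uses for wl_pointer
 * motion, so applications that know nothing about tablets keep
 * working. */

struct wl_tablet_tool; /* placeholder for the draft interface */

struct toolkit_seat {
    /* the toolkit's existing pointer-motion entry point */
    void (*deliver_motion)(struct toolkit_seat *seat, uint32_t time,
                           double x, double y);
};

static void
fallback_tool_motion(void *data, struct wl_tablet_tool *tool,
                     uint32_t time, wl_fixed_t x, wl_fixed_t y)
{
    struct toolkit_seat *seat = data;

    /* pressure, tilt etc. are simply dropped here; a tablet-aware
     * widget would install its own handler instead of this one */
    seat->deliver_motion(seat, time,
                         wl_fixed_to_double(x), wl_fixed_to_double(y));
}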

> Some reasons why wl_touch is completely separate from wl_pointer are
> that wl_touch is absolute while wl_pointer uses relative motion, and
> that wl_pointer needs a cursor while wl_touch does not.
>
> I suppose tablets are usually absolute position devices, but the
> question there is what area they cover on the desktop. Maybe they
> need the two modes that have been discussed: whole-screen and
> client-drawing-area.
>
> Another issue is whether the tablet is actually a tabletscreen (like
> a touchscreen with a pen or tools) or just an input surface
> physically separate from the monitors. Maybe the need for a cursor
> partially depends on that. With a tabletscreen, client-drawing-area
> mode might not make sense.

yep, Lyude's latest protocol draft (in the works) will have that in it:
the built-in/display tablets don't use a cursor, while the external
ones do. we haven't really worried about the client-drawing-area mode
yet because it'll likely be a request that doesn't change much in the
scope of the protocol other than grabbing the device for a single
client. But you're right, having a built-in/display tablet use the
area mapping doesn't make sense.
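
for illustration, the compositor-side decision could boil down to a
check like the one below; the tablet_type enum and how it would be
advertised are assumptions on my part:

#include <stdbool.h>

/* illustrative sketch only: whether the compositor should draw a
 * cursor for a tool, based on whether its tablet is integrated into
 * a display */

enum tablet_type {
    TABLET_TYPE_EXTERNAL, /* opaque desk tablet, needs a cursor */
    TABLET_TYPE_DISPLAY,  /* built into a screen, the pen tip itself
                             acts as the cursor */
};

struct tablet {
    enum tablet_type type;
};

static bool
tool_wants_cursor(const struct tablet *tablet)
{
    return tablet->type == TABLET_TYPE_EXTERNAL;
}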

> If you have a tabletscreen, and your tool is a set of fingers, is it
> any different to a touchscreen? Could we even cover touchscreens with
> the tablet protocol, and deprecate wl_touch?

hmm, interesting idea. we're also kicking around the idea of moving
the protocol from using tablets as the event sources to using the
tools as the sources instead. So the prox-in event doesn't come from
the wl_tablet but from the wl_tool instead (with the tablet being a
property on the event). That principle would be usable for touch
interaction if you generate a couple of generic finger1, finger2, etc.
tools. I'll think about this some more.
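
on the client side that could end up shaped roughly like the listener
below; every interface and event name here is hypothetical, it's only
meant to show the tablet becoming an argument on the event:

#include <stdint.h>

struct wl_surface;      /* core protocol */
struct wl_tablet;       /* hypothetical draft interfaces */
struct wl_tablet_tool;

/* sketch of the tool-as-source idea: proximity events originate from
 * the tool, and the tablet they happened on is just a property of the
 * event; generic finger1, finger2, etc. tools could reuse the same
 * shape for touch */
struct wl_tablet_tool_listener {
    /* tool entered proximity of <tablet>, over <surface> */
    void (*proximity_in)(void *data, struct wl_tablet_tool *tool,
                         uint32_t serial, struct wl_tablet *tablet,
                         struct wl_surface *surface);
    void (*proximity_out)(void *data, struct wl_tablet_tool *tool);
};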

> One more question is, how do input event parameters get mapped for
> clients? Say, your window is at a 30-degree angle and squashed; what
> do you report as tool angles etc.?

I think the safest bet is to define any tablet input along the x/y
axes of the client surface, with angles and distance defined relative
to the plane of the client surface. so any rotation or distortion
would have to be applied to the input before sending it to the client.
Correct me if I'm wrong here, but iirc the client doesn't know about
the surface distortion applied by the compositor anyway, right?
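
anyway, a minimal sketch of what I mean with the 30-degree case,
assuming the surface transform is just a rotation plus a uniform scale
(a real compositor would apply its full transform matrix, and would
rotate tilt vectors the same way):

#include <math.h>

struct point { double x, y; };

/* map a global tool position into surface-local coordinates for a
 * surface rotated by <angle> radians around <origin> and scaled by
 * <scale>, i.e. apply the inverse of the compositor's transform
 * before sending the event */
static struct point
global_to_surface(struct point p, struct point origin,
                  double angle, double scale)
{
    double dx = p.x - origin.x;
    double dy = p.y - origin.y;
    struct point out;

    /* inverse rotation, then inverse scale */
    out.x = ( cos(angle) * dx + sin(angle) * dy) / scale;
    out.y = (-sin(angle) * dx + cos(angle) * dy) / scale;
    return out;
}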

Cheers,
   Peter


