wl_tablet specification draft

Pekka Paalanen ppaalanen at gmail.com
Mon Jun 30 03:23:56 PDT 2014


On Mon, 30 Jun 2014 09:33:15 +0300
Pekka Paalanen <ppaalanen at gmail.com> wrote:

> On Mon, 30 Jun 2014 11:08:35 +1000
> Peter Hutterer <peter.hutterer at who-t.net> wrote:
> 
> > On Sat, Jun 28, 2014 at 12:41:33PM +0300, Pekka Paalanen wrote:
> > > On Fri, 27 Jun 2014 13:04:59 -0700
> > > Bill Spitzak <spitzak at gmail.com> wrote:
> > > 
> > > > On 06/26/2014 09:38 PM, Ping Cheng wrote:
> > > > 
> > > > > With my experience, mapping whole tablet to a window or a
> > > > > specific display area is preferred. That is how mapping works on Windows
> > > > > and Mac too.
> > > > 
> > > > First this is *only* when the drawing program wants it to happen. There 
> > > > is some kind of mode switch so the user can use the pen to do things 
> > > > outside the drawing area. When the drawing program is not controlling it 
> > > > the user wants to be able to use the pen instead of the mouse for all 
> > > > mouse actions.
> > > > 
> > > > I would also love to see addressed the ability to get "square" movement 
> > > > out of the pad, and to automatically switch to "mouse mode" if the 
> > > > outputs are a significantly different shape than the tablet. Though 
> > > > Linux was by far the worst, all three systems (Linux, OS X, and Windows) fell 
> > > > down badly here, mostly by making it impossible to mode-switch between 
> > > > mouse and tablet mode, and on Windows it is impossible to change the 
> > > > scale of mouse mode. None of them would change how the mapping is done 
> > > > when outputs are added/removed. I believe "limit to one monitor" is not 
> > > > necessary and is only being provided as a work-around for the non-square 
> > > > mappings that should be avoided in a different way.
> > > > 
> > > > Even though it sounds like it is disagreeing with me, there is no need 
> > > > for "mouse emulations". Wayland clients should all be written to know 
> > > > that they could get pen events just like mouse events and to handle them 
> > > > the same if they don't want to do anything smarter with the pen.
> > > 
> > > First you said that...
> > > 
> > > > Vaguely thinking of this from a Wayland client's pov it seems like what 
> > > > should happen is this:
> > > > 
> > > > - The pen moves the seat's mouse cursor, always. If more than one cursor 
> > > > is wanted the pen should be in a different seat. The user is not 
> > > > manipulating more than one device at a time and does not want to see two 
> > > > cursors.
> > > 
> > > ...and then you said the exact opposite, plus you require the
> > > broken case where the same hardware events map to multiple
> > > completely different protocols (wl_pointer *and* tablet).
> > 
> > that's not necessarily the worst thing, provided it doesn't happen at the
> > same time. with a "mode toggle" button the compositor could switch between
> > tablet events and absolute motion on-the-fly.
> 
> That I completely agree with, but Bill did write "The pen moves the
> seat's mouse cursor, always." "Always" implies that you sometimes get
> pointer and tablet events for the same action.
> 
> > This is a change to how tablets currently work in Linux, but aside from that
> > it's actually the easiest and cleanest to implement.
> 
> It sounds like it would work: just use wl_pointer protocol when the
> tablet controls the pointer, and tablet protocol when it is used as
> a... tablet, and let the compositor switch between the two modes. Still,
> never the both at the same time for same physical user action, right?
> 
> That way the tablet protocol could be exclusively for the "whole tablet
> maps to the client/window/app-custom region", and you can ignore the
> "tablet maps to whole/all outputs" case as that would be handled by the
> wl_pointer mode alone. How would that sound?

Some more thoughts...

Switching is just one option, but it has the benefit that if the
compositor uses wl_pointer appropriately, you can still use
toolkits that do not explicitly support the tablet protocol.
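To make the switching idea concrete, here is a minimal sketch of the
compositor-side routing it implies. All names here are hypothetical,
invented for illustration, and are not part of any draft protocol; the
point is only that every physical event goes out on exactly one
protocol, never both at once.

```python
# Hypothetical sketch of compositor-side mode switching: each hardware
# event is routed to exactly one protocol, depending on the current mode.

from enum import Enum, auto

class TabletMode(Enum):
    POINTER = auto()   # tablet drives the seat's wl_pointer cursor
    TABLET = auto()    # tablet delivers dedicated tablet-protocol events

class TabletRouter:
    def __init__(self):
        self.mode = TabletMode.POINTER

    def toggle(self):
        """Called when the user presses the hypothetical mode-toggle button."""
        self.mode = (TabletMode.TABLET if self.mode is TabletMode.POINTER
                     else TabletMode.POINTER)

    def route(self, event):
        """Return which protocol this event goes out on, plus the event."""
        if self.mode is TabletMode.POINTER:
            return ("wl_pointer", event)
        return ("tablet", event)

router = TabletRouter()
assert router.route("motion")[0] == "wl_pointer"
router.toggle()
assert router.route("motion")[0] == "tablet"
```

Because the toggle lives in the compositor, clients never see the same
physical action twice, which avoids the "broken case" mentioned above.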

Another option is to make the tablet protocol completely separate, the
way wl_touch (touchscreen) is. Explicit support would be required in
toolkits, but at least there would be no difficult interaction
problems with wl_pointer et al., so it seems safer.
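For illustration only, a fully separate interface could mirror the
shape of wl_touch. Every name in this fragment is invented for this
sketch; it is not from any actual draft of the specification:

```xml
<!-- Illustrative only: invented names, not from any actual draft. -->
<interface name="example_tablet_tool" version="1">
  <event name="proximity_in">
    <arg name="serial" type="uint"/>
    <arg name="surface" type="object" interface="wl_surface"/>
  </event>
  <event name="motion">
    <arg name="time" type="uint"/>
    <arg name="x" type="fixed" summary="surface-local x"/>
    <arg name="y" type="fixed" summary="surface-local y"/>
  </event>
  <event name="down">
    <arg name="serial" type="uint"/>
  </event>
  <event name="up">
    <arg name="serial" type="uint"/>
  </event>
</interface>
```

Like wl_touch, such an interface would carry absolute surface-local
coordinates and need no cursor, which is exactly why the separation is
clean.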

One reason wl_touch is completely separate from wl_pointer is that
wl_touch is absolute while wl_pointer uses relative motion. Also,
wl_pointer needs a cursor while wl_touch does not.

I suppose tablets are usually absolute-position devices, but the
question there is what area they cover on a desktop. Maybe they need
the two modes that have been discussed: whole-screen and
client-drawing-area.
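The two modes differ only in which rectangle the tablet's absolute
axes are mapped onto: the bounding box of all outputs, or the client's
drawing area. A sketch of that mapping, with a hypothetical helper
(not protocol code):

```python
# Hypothetical sketch: map normalized absolute tablet coordinates onto a
# target rectangle. The rectangle is the whole desktop in whole-screen
# mode, or the client's drawing area in client-drawing-area mode.

def map_tablet(x_norm, y_norm, rect):
    """x_norm, y_norm in [0, 1]; rect is (x, y, width, height) in pixels."""
    rx, ry, rw, rh = rect
    return (rx + x_norm * rw, ry + y_norm * rh)

# Whole-screen mode: map onto the bounding box of all outputs.
assert map_tablet(0.5, 0.5, (0, 0, 1920, 1080)) == (960.0, 540.0)

# Client-drawing-area mode: map onto a 400x300 canvas at (100, 50).
assert map_tablet(1.0, 0.0, (100, 50, 400, 300)) == (500.0, 50.0)
```

Note that when the rectangle's aspect ratio differs from the tablet's,
this mapping is exactly the "non-square" distortion complained about
earlier in the thread.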

Another issue is whether the tablet is actually a tabletscreen (like a
touchscreen with a pen or tools) or just an input surface
physically separate from the monitors. The need for a cursor may
partially depend on that. With a tabletscreen, client-drawing-area
mode might not make sense.

If you have a tabletscreen, and your tool is a set of fingers, is it
any different from a touchscreen? Could we even cover touchscreens
with the tablet protocol and deprecate wl_touch?

One more question is how input event parameters get mapped for
clients. Say your window is at a 30-degree angle and squashed: what do
you report as tool angles etc.?
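If tool angles were reported in surface-local terms, the compositor
would have to push them through the surface's transform. A sketch of
the pure-rotation part (the "squashed" anisotropic case would
additionally scale the components; everything here is an assumption
for illustration, not anything from the spec draft):

```python
import math

# Hypothetical sketch: convert a tool tilt vector from global (output)
# coordinates into the local coordinates of a surface rotated by `deg`
# degrees, by applying the inverse rotation.

def tilt_to_surface_local(tx, ty, deg):
    """Rotate the global tilt vector (tx, ty) by -deg degrees."""
    r = math.radians(deg)
    return (math.cos(r) * tx + math.sin(r) * ty,
            -math.sin(r) * tx + math.cos(r) * ty)

# A pen tilted straight along global +x, over a surface rotated 30 degrees.
lx, ly = tilt_to_surface_local(1.0, 0.0, 30.0)
assert abs(lx - math.cos(math.radians(30))) < 1e-9
assert abs(ly + math.sin(math.radians(30))) < 1e-9
```

Whether the compositor or the client should do this transform is part
of the open question above.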


Thanks,
pq
