[PATCH weston 4/8] shell: Update bindings to conform to pointer axis protocol
jadahl at gmail.com
Sat Sep 29 02:31:51 PDT 2012
On Fri, Sep 28, 2012 at 9:58 PM, Scott Moreau <oreaus at gmail.com> wrote:
> On Fri, Sep 28, 2012 at 7:09 AM, Jonas Ådahl <jadahl at gmail.com> wrote:
>> On Fri, Sep 28, 2012 at 2:47 PM, Pekka Paalanen <ppaalanen at gmail.com>
>> > On Fri, 28 Sep 2012 14:30:18 +0200
>> > Jonas Ådahl <jadahl at gmail.com> wrote:
>> >> On Fri, Sep 28, 2012 at 2:13 PM, Pekka Paalanen <ppaalanen at gmail.com>
>> >> wrote:
>> >> > I have an old discrete-stepped mouse wheel, it reports:
>> >> >
>> >> > Event: time 1348834027.330811, type 2 (EV_REL), code 8 (REL_WHEEL),
>> >> > value 1
>> >> > Event: time 1348834027.330812, -------------- SYN_REPORT ------------
>> >> >
>> >> > and the other direction:
>> >> Which one did you get when scrolling downward?
>> > Negative is down, positive is up.
>> >> > Event: time 1348834027.906825, type 2 (EV_REL), code 8 (REL_WHEEL),
>> >> > value -1
>> >> > Event: time 1348834027.906827, -------------- SYN_REPORT ------------
>> >> >
>> >> > So it reports literally the number of "steps" it rotates. In urxvt on
>> >> > X,
>> >> > each step seems to scroll 5 lines.
>> >> Ok, so to emulate the axis movement, each event needs to move a
>> >> number of pixels. For the X11 compositor I made it move 10 pixel
>> >> units per step; maybe we can do the same with evdev. Is it worth
>> >> handling discrete scroll events separately in another way (or
>> >> even both ways), since it could be useful to be able to detect
>> >> discrete scroll movements correctly as well?
>> > Do you mean adding more protocol for discrete axis motion? I don't
>> > know. At first thought it seems just emitting motion in steps of 10 or
>> > whatever is good enough.
>> > Btw. the "clarify pointer axis event" commit didn't make too much
>> > sense until I thought of touchpads.
> Yes, I was also confused by this until considering touchpads. It still does
> make sense when considering both, however.
>> > On mice, the motion and wheel are
>> > inherently in different, arbitrary units. One might even argue, that
>> > for wheels, the motion is an angle instead of a length. I don't have
>> > any strong opinions here, and I don't know how existing smooth
>> > scrolling works.
>> It makes the most sense on touchpads indeed, but it is more or less
>> the only coordinate space we can relate to, as I see it.
> Axis events have nothing to do with a coordinate system really.
>> The other way to
>> make it more like the traditional protocols is to have 1 "step" be 1
>> unit, and have "smooth" scroll wheels or touchpads use fractions of 1
>> to step smaller steps. This would however make the coordinate space of
>> the axis event harder to relate to a surface, and I don't think it's
> Can you please explain in more detail, why the current code needs changing,
> so that people without a touch device can understand the problem better?
Ok, so what I'm trying to do is to enable what people call "smooth
scrolling" at the input level, meaning that scrolling is not based on
discrete, arbitrary "steps" but on a more fluid motion. These types of
events make the most sense for certain kinds of step-less scroll wheels
and touchpads, and I'll try to explain why.
When axis events are discrete steps, there is indeed little need to
relate them to any kind of coordinate space beyond knowing what is "up"
and what is "down". A step can only be 1 or -1, that's it. This is how
it traditionally works in X11 (except XI2, which I think supports
non-discrete scrolling).
If one wants axis events that more closely resemble smooth motion,
such as those emitted by step-less scroll wheels or touchpads, one
needs to specify what the events actually mean, since they are no
longer limited to 1 and -1. To do this, if we specify an axis event to
be a vector along an axis in a coordinate space identical to that of
motion events, we can create axis events that relate to a measurement
already known to both the compositor and the client. A step-less
scroll wheel would transform its scroll events into a motion vector
measured in pixels, and a touchpad would simply emit an axis event the
same way it would emit a motion event when scrolling. A client could
then read these events and scroll its view by the number of pixels
given by the value parameter.
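To make the client side concrete, here is a minimal sketch, assuming the axis value arrives in 24.8 fixed point (as wl_fixed_t is encoded) and a toy view struct; the names are illustrative only, not an existing client API.

```c
#include <stdint.h>

/* Toy client-side view state; scroll_offset is in surface-local pixels. */
struct view {
	double scroll_offset;
};

/* Hypothetical axis handler: the value is a vector component along the
 * scrolled axis, in the same units as motion events, 24.8 fixed point. */
static void handle_axis(struct view *view, int32_t fixed_value)
{
	double pixels = fixed_value / 256.0;	/* what wl_fixed_to_double() does */
	view->scroll_offset += pixels;
}
```

A discrete wheel would thus arrive as jumps of, say, ±10 pixels, while a touchpad two-finger scroll produces a stream of small, possibly fractional values; the handler above treats both identically, which is the point of sharing the motion event coordinate space.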