Comment # 3 on bug 99800 - Some way to inject automated libinput events? (status: NEW)
From: rhendric <ryan.hendrickson@alum.mit.edu>
https://bugs.freedesktop.org/show_bug.cgi?id=99800#c3

So libinput is a hardware abstraction layer, yes, and an automation source
isn't hardware. But as an abstraction layer, libinput owns those abstractions.
libinput defines what it means to be a LIBINPUT_EVENT_POINTER_AXIS, and why
that's not the same thing as an evdev REL_WHEEL or REL_HWHEEL. You might be
saying that you want libinput's interface to be effectively promoted to a
standard, so that non-libinput producers like, say, libautomation, can produce
the same events and compositors can reasonably be expected to combine the two.
But it seems to me like a bad technical solution to have a producer of
LIBINPUT_EVENT_POINTER_AXIS events and a producer of
LIBAUTOMATION_EVENT_POINTER_AXIS events and for mutual clients to be forever
anxious about or adapting to divergences between them--am I misinterpreting
your proposal?
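
To be concrete about what I mean by "libinput owns those abstractions": this is
roughly what the consumer side looks like today, as far as I understand the
libinput 1.x API (the function name handle_axis is just mine for illustration).
The compositor never sees REL_WHEEL click counts, only the normalized double
value that libinput chose to expose:

    #include <libinput.h>
    #include <stdio.h>

    /* Sketch of a compositor-side handler for libinput's scroll abstraction. */
    static void handle_axis(struct libinput_event *ev)
    {
        if (libinput_event_get_type(ev) != LIBINPUT_EVENT_POINTER_AXIS)
            return;

        struct libinput_event_pointer *p = libinput_event_get_pointer_event(ev);

        if (libinput_event_pointer_has_axis(p,
                LIBINPUT_POINTER_AXIS_SCROLL_VERTICAL))
            printf("vertical scroll: %f\n",
                   libinput_event_pointer_get_axis_value(p,
                       LIBINPUT_POINTER_AXIS_SCROLL_VERTICAL));
    }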

If instead you allow that libinput's purpose is to take diverse *input sources*
and process them into a known set of events, then it looks like the right place
to inject additional LIBINPUT_EVENT_POINTER_AXIS events--yes, because all users
would get them, but also because there would be a single source of libinput
events from the perspective of the compositor, which keeps the burden of
writing compositors low, which I thought was one of the goals.
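
To make the shape of that concrete, I'm imagining an entry point along these
lines--completely hypothetical, the name and signature are made up and nothing
like this exists in libinput today:

    /* Hypothetical: an automation client asks libinput to synthesize a
     * scroll event; the compositor then receives it through the same
     * libinput_get_event() loop as events from real hardware. */
    int libinput_inject_pointer_axis(struct libinput *li,
                                     enum libinput_pointer_axis axis,
                                     double value);

The point isn't this exact signature, just that there is one event stream and
one set of event definitions on the compositor side.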

Unless, of course, you also want to make the argument that this functionality
isn't part of the core set of what users expect from a graphical desktop
environment, and that's why you want me to convince each individual compositor
to include it instead of bundling it with libinput. If that's your position,
then I don't know--sure, it isn't necessary, but with xdotool, LDTP, and
whatever other automation tools and frameworks exist for X out there,
it seems like people want to use them, which means that if Wayland compositors
are going to take over from X, they're going to want to have a way to do that
sort of thing too... which they can implement themselves, or get from libinput.

---

Since you didn't insta-WONTFIX this (thanks!), I'll ask some technical
questions too, just in case the above sways you a little. :-)

Thanks for your comments about the backends; I agree, adding a new backend
doesn't sound like it would help. It's not quite as clear to me why creating a
new device would be messy--as for the one issue you pointed out, it looks like
libinput_device_get_udev_device is already permitted to return NULL for some
devices, so are there other reasons why creating a non-udev-backed automation
device wouldn't play well with things?
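
For reference, this is the pattern I'd expect a compositor to already have for
the documented NULL case (the function name log_devnode is mine; a sketch, not
taken from any real compositor):

    #include <libinput.h>
    #include <libudev.h>
    #include <stdio.h>

    /* A device without a udev representation already has to be handled:
     * libinput_device_get_udev_device() is documented to return NULL. */
    static void log_devnode(struct libinput_device *dev)
    {
        struct udev_device *udev_dev = libinput_device_get_udev_device(dev);
        if (!udev_dev)
            return;  /* no udev backing; nothing to log */

        printf("device node: %s\n", udev_device_get_devnode(udev_dev));
        udev_device_unref(udev_dev);
    }

So an automation device that simply returned NULL there would, at least on this
axis, look like any other udev-less device.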

About libinput events not being good enough: why would GUI testing/automation
tools have to use relative pointer events? Isn't
LIBINPUT_EVENT_POINTER_MOTION_ABSOLUTE available, or does that not work for
some reason? (For my personal needs, evdev is too low-level for the very
specific reason that I want to generate fractional POINTER_AXIS events, not
integral REL_WHEEL events, and I definitely don't want to do it by emitting a
string of ABS_MT_POSITION_Y and ABS_PRESSURE and whatever else I'd have to
provide to spoof libinput into just giving me the POINTER_AXIS I want.
Otherwise, it's fine.)
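
Concretely, the mismatch I mean is this (just an illustration of the two value
types, not working injection code):

    #include <linux/input.h>   /* struct input_event, EV_REL, REL_WHEEL */

    /* What I could inject at the evdev level: a wheel event whose value
     * is a signed integer click count -- there is no way to say 0.25. */
    struct input_event ie = {
        .type  = EV_REL,
        .code  = REL_WHEEL,
        .value = 1,
    };

    /* What I actually want to produce is what libinput already hands to
     * compositors: the double returned by
     * libinput_event_pointer_get_axis_value(), which can be fractional. */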