Input and games.

Pekka Paalanen ppaalanen at gmail.com
Fri Apr 19 02:18:12 PDT 2013


Hi Todd,

I am going to reply from the Wayland protocol point of view, and what
Wayland explicitly can (and must) do for you. This is likely much lower
level than what a game programmer would like to use. How SDL or some
other higher level library exposes input is a different matter, and I
will not comment on that. We just want to make everything possible on
the Wayland protocol level.


On Thu, 18 Apr 2013 18:22:11 -0400
Todd Showalter <todd at electronjump.com> wrote:

> On Thu, Apr 18, 2013 at 5:29 PM, Jonas Kulla <nyocurio at gmail.com>
> wrote:
> 
> > What exactly do you mean by "unique requirements", can you be a
> > little bit more specific? In general I think the current consensus
> > (correct me if I'm wrong) is that using the default wayland pointer
> > and keyboard events plus Joypad support via SDL is sufficient for
> > most purposes.
> 
>     In general we can work with anything as long as we can get the
> right events and process them; it's perhaps more a matter of
> convenience.
> 
>     There are a few things of note that are somewhat specific to
> games:
> 
> Permissions
> 
>     We're often running things like mods, user-generated scripts, and
> in general lots of untrusted content, so the fewer privileges we need
> to handle things like input, the better.

I do not think we can happily let client applications open input devices
themselves, so this is clearly a thing we need to improve on. In other
words, I believe we should come up with a protocol extension where the
server opens the input devices, and either passes the file descriptor to
a client, or the server translates evdev events into Wayland protocol
events. "How" and "what" are still open questions, as is every other
detail of input devices that are not keyboards, mice, or touchscreens.

There was once some talk about a "raw input event protocol", but there
is not even a sketch of it, AFAIK.
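As a rough sketch of what the fd-passing variant might look like from the client side, assuming the server simply hands over an evdev-style descriptor; the record layout below is a simplified stand-in for struct input_event from <linux/input.h>, and all names are mine:

```c
/* Hypothetical client-side decoding of a raw input event stream, as
 * if a protocol extension had passed us an evdev fd. The fd-passing
 * mechanism does not exist yet; this only shows the shape of the
 * dispatch a client would do on the records it read(). */
#include <stddef.h>
#include <stdint.h>

#define EV_KEY_SK 0x01          /* key/button state change */
#define EV_ABS_SK 0x03          /* absolute axis motion */

struct raw_event {              /* simplified stand-in for input_event */
    uint16_t type;              /* EV_KEY_SK, EV_ABS_SK, ... */
    uint16_t code;              /* which button or axis */
    int32_t  value;             /* 1 = press, 0 = release, or axis pos */
};

/* Count the button presses in one decoded batch of events. */
int count_presses(const struct raw_event *ev, size_t n)
{
    size_t i;
    int presses = 0;

    for (i = 0; i < n; i++)
        if (ev[i].type == EV_KEY_SK && ev[i].value == 1)
            presses++;
    return presses;
}
```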

> Hooking Things Up
> 
>     This may be beyond the scope of Wayland, but at least in the past
> I've found that in particular joysticks/gamepads are a bit of a
> guessing game for the developer.  You can usually assume that the
> first stick is the first couple of axis values in the axis array, but
> after that, it's a tossup whether an analog axis is part of a stick,
> a trigger, or a pressure-sensitive button.
> 
>     It would be really nice if there was some sort of configuration
> that could be read so we'd know how the player wanted these things
> mapped, and some sort of way for the player to set that configuration
> up outside the game.

Right, and whether this could be a Wayland thing or not depends on the
above: how to handle miscellaneous input devices in general.

Keyboards already have extensive mapping capabilities. A Wayland server
sends keycodes (I forget in which space exactly) and a keymap, and
clients feed the keymap and keycodes into libxkbcommon, which
translates them into something actually useful. Maybe something similar
could be invented for game controllers? But yes, this is off-topic for
Wayland, apart from the protocol of what event codes and other data to
pass.
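To make the analogy concrete, here is a sketch of what such a controller "keymap" could boil down to on the client side: a table, supplied by the compositor or a user config, translating raw event codes into logical controls. Every code and name in it is invented for illustration:

```c
/* Hypothetical libxkbcommon-style translation layer for game
 * controllers: raw axis indices in, logical controls out, so the
 * game no longer guesses whether axis 5 is a trigger or a stick.
 * Nothing here is an existing API. */
#include <stddef.h>
#include <stdint.h>

enum logical_control {
    CTRL_UNMAPPED = 0,
    CTRL_LEFT_STICK_X,
    CTRL_LEFT_STICK_Y,
    CTRL_RIGHT_TRIGGER,
};

struct pad_map_entry {
    uint16_t raw_axis;             /* axis index as the device reports it */
    enum logical_control control;  /* what the player mapped it to */
};

/* One player's mapping, as it might be loaded from a config file. */
const struct pad_map_entry example_map[] = {
    { 0, CTRL_LEFT_STICK_X },
    { 1, CTRL_LEFT_STICK_Y },
    { 5, CTRL_RIGHT_TRIGGER },
};

enum logical_control translate_axis(const struct pad_map_entry *map,
                                    size_t n, uint16_t raw_axis)
{
    size_t i;

    for (i = 0; i < n; i++)
        if (map[i].raw_axis == raw_axis)
            return map[i].control;
    return CTRL_UNMAPPED;
}
```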

> Event Driven vs. Polling
> 
>     Modern gui applications tend to be event-driven, which makes
> sense; most modern desktop applications spend most of their time doing
> nothing and waiting for the user to generate input.  Games are
> different, in that they tend to be simulation-based, and things are
> happening regardless of whether the player is providing input.
> 
>     In most games, you have to poll input between simulation ticks.
> If you accept and process an input event in the middle of a simulation
> tick, your simulation will likely be internally inconsistent.  Input
> in games typically moves or changes in-game objects, and if input
> affects an object mid-update, part of the simulation tick will have
> been calculated based on the old state of the object, and the rest
> will be based on the new state.
> 
>     To deal with this on event-driven systems, games must either
> directly poll the input system, or else accumulate events and process
> them between simulation ticks.  Either works, but being able to poll
> means the game needs to do less work.

The Wayland protocol is event driven. Polling does not make sense,
since it would mean a synchronous round-trip to the server, which for
something like this is far too expensive, and (IMHO) easily worked
around.

So you have to maintain input state yourself, or have a library do it
for you. It could even be off-loaded to another thread.

There is also a huge advantage over polling: in an event driven design,
it is impossible to miss very fast, transient actions, which polling
would never notice. And whether you need to know that such a transient
happened, how many times it happened, or how long each transient took
between two game ticks, is all up to you and available.
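A minimal sketch of that client-side state keeping, with per-tick edge counters so transients survive until the next simulation tick; the names are mine and nothing here is protocol:

```c
/* Each incoming event updates both the current state and per-tick
 * edge counters; the game reads them between simulation ticks and
 * then clears the counters. A press-and-release that both happen
 * within one tick still shows up as presses == 1, which a poll of
 * the current state would have missed entirely. */
struct button {
    int down;       /* current up/down state */
    int presses;    /* down edges since the last tick */
    int releases;   /* up edges since the last tick */
};

void button_event(struct button *b, int pressed)
{
    if (pressed && !b->down)
        b->presses++;
    else if (!pressed && b->down)
        b->releases++;
    b->down = pressed;
}

/* Called once per simulation tick, after the game has read the state. */
void button_end_tick(struct button *b)
{
    b->presses = 0;
    b->releases = 0;
}
```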

I once heard a hardcore gamer complain that on some systems, or under
some conditions, probably related to the ridiculously high frame rates
gamers usually demand, a button sequence he hit within a fraction of a
second was not registered properly, and I wondered how it could fail
to register. Now I realise a possible cause: polling.

Event driven is a little more work for "simple" games, but it gives
you guarantees. Would you not agree?

> Input Sources & Use
> 
>     Sometimes games want desktop-style input (clicking buttons,
> entering a name with the keyboard), but often games want to treat all
> the available input data as either digital values (mouse buttons,
> keyboard keys, gamepad buttons...), constrained-axis "analog" (gamepad
> triggers, joysticks) or unconstrained axis "analog" (mouse/trackball).
>  Touch input is a bit of a special case, since it's nearly without
> context.

Is this referring to the problem of "oops, my mouse left the Quake
window when I tried to turn"? Or maybe more of "oops, the pointer hit
the monitor edge and I cannot turn any more?" I.e. absolute vs.
relative input events?

There is a relative motion events proposal for mice:
http://lists.freedesktop.org/archives/wayland-devel/2013-February/007635.html

Clients cannot warp the pointer, so there is no way to hack around it.
We need to explicitly support it.

>     Games usually care about all of:
> 
> - the state of buttons/keys -- whether they are currently down or up
> -- think WASD here
> - edge detection of buttons/keys -- trigger, release and state change
> - the value of each input axis -- joystick deflection, screen position
> of the cursor, etc
> - the delta of each input axis
> 
>     From what I've seen, SDL does not give us the button/key state
> without building a layer on top of it; we only get edge detection.
> Likewise, as far as I understand nothing does deltas.

Ah yes, deltas are the relative motion events, see above.
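For completeness, a sketch of deriving axis state and per-tick deltas the same way, again purely client-side bookkeeping with invented names:

```c
/* Absolute axis state is just the last value an event carried; the
 * per-tick delta is the difference against the value at the start of
 * the tick. This is the small layer a game (or a library) would keep
 * on top of motion events. */
struct axis {
    double value;       /* current absolute position */
    double tick_start;  /* position when the current tick began */
};

void axis_motion(struct axis *a, double v)
{
    a->value = v;
}

/* Between ticks: return the delta and start a new tick. */
double axis_take_delta(struct axis *a)
{
    double d = a->value - a->tick_start;

    a->tick_start = a->value;
    return d;
}
```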

> Input Capture
> 
>     It would be very helpful to have an input capture mechanism that
> could be turned on and off easily; I'd like to be able to have mouse
> input captured when a game is playing, but be able to shut off the
> mouse capture if the player brings up the pause menu.  I'd also like
> it to deactivate if the game crashes, because at least in development
> that can happen a lot.

Aah, reading this the third time, I finally understood what you meant
by input capture. The URL above for the relative motion events should
be exactly this. We are more accustomed to the term "pointer grab" or
"grabbing", meaning that during the grab, all input events go to this
particular window, until the grab is ended.

> > Personally, I'd be interested in seeing joypads become first class
> > input devices on wayland (as a capability of wl_seat alongside
> > mice/keyboard etc.),
> 
>     Hear hear!
> 
> > seeing that there are already evdev drivers existing for most
> > gamepads. But I'm unfortunately lacking experience and knowledge in
> > that field, otherwise I'd give it a hacking attempt myself.
> >
> > So yeah, for now I think SDL should serve you perfectly well =)
> 
>     SDL works, but it's not ideal; SDL maintains a lot of the desktop
> impedance mismatch with games that desktop environments have without
> it.
> 
>                                           Todd.

One thing you didn't list is input latency. In Wayland, every input
event from a user action has a timestamp corresponding to when it
occurred, but the events may not be relayed to clients as soon as
possible. Weston, for instance, relays input only once per refresh
cycle, I think. That might be a problem for games wanting to minimize
input latency, since it limits the input state update rate to the
monitor refresh rate.

What do you think, is it an issue?

Depending on the game and physics engine, of course, is it possible to
make use of the input event timestamps to integrate the effect of, say,
a button going down some time in the past, instead of assuming it went
down when this game tick started?
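Something like this is what I have in mind, assuming millisecond timestamps and a toy constant-acceleration model; both assumptions are only for illustration:

```c
/* Instead of snapping a button press to the tick boundary, apply its
 * acceleration only for the fraction of the tick it was actually
 * held, using the event timestamp. */
#include <stdint.h>

double integrate_tick(double velocity, double accel,
                      uint32_t tick_start_ms, uint32_t tick_len_ms,
                      uint32_t press_ms)
{
    uint32_t held_ms;

    if (press_ms <= tick_start_ms)
        held_ms = tick_len_ms;                  /* held the whole tick */
    else if (press_ms >= tick_start_ms + tick_len_ms)
        held_ms = 0;                            /* pressed after the tick */
    else
        held_ms = tick_start_ms + tick_len_ms - press_ms;

    return velocity + accel * (held_ms / 1000.0);
}
```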

What I'm trying to ask is, are the timestamps useful at all for games,
and/or would you really need a minimum latency input event delivery
regardless of the computational and power cost?

Keep in mind that event-based input delivery, unlike polling, does not
rely on high update rates to avoid missing anything.

There is also one more catch with the timestamps. Their base is
arbitrary, and a client does not know which clock produces them.
Therefore they are only useful relative to other input event
timestamps. Would you need a way to get the current time in the input
clock to be able to use them properly?
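For illustration, assuming 32-bit millisecond timestamps, differences between them work out even across wrap-around:

```c
/* With an arbitrary base, only differences between input timestamps
 * mean anything. Unsigned 32-bit subtraction computes the difference
 * modulo 2^32, so a wrapped counter is handled for free. */
#include <stdint.h>

uint32_t elapsed_ms(uint32_t earlier, uint32_t later)
{
    return later - earlier;   /* modulo 2^32, so wrap is harmless */
}
```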


Thanks,
pq

