Who is working on the non-drawing parts?
Mohamed Ikbel Boulabiar
boulabiar at gmail.com
Wed Nov 24 18:55:47 PST 2010
2010/11/24 Marty Jack <martyj19 at comcast.net>:
> I notice you are planning on using the keymapping library, which is good, but the keydefinition compiler is another thing that could stand to be reimplemented from a blank sheet of paper. I will get to that eventually. I only wrote four production quality compilers, so it should be easy if I get to break compatibility on some parts of the language.
All of the input handling should be reimplemented from a blank sheet of paper.
X treats every input device as some sort of keyboard+mouse, and this
needs to change.
Multimedia keys are just a small part of the problem; sensors are just
input devices, and many other input devices are arriving (the Kinect,
for example: it would be possible to attach code blocks that read the
raw stream and generate whatever events are needed).
Multi-touch gestures are a form of computed input derived from multi-touch events.
It would be better to treat input in a more generalized way and not
design everything to be used only for keyboard/mouse.
I think there is a need to be able to easily:
1. Add "filtering" blocks to input from the kernel, which may come from
many different kinds of devices.
2. Map the filtered events to whatever virtual input device is seen by
applications (usual apps see the keyboard/mouse; new, more advanced
apps can see and use the other devices as well).
I imagine these 2 points supported by an input sub-layer/library
which Wayland can use.
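The two points above can be sketched as a chain of filter blocks that rewrite or drop raw events before they reach a virtual device. This is only an illustration of the idea, not a real API: the event struct, type/code numbers, and function names are all made up (real code would start from struct input_event in <linux/input.h>):

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical generic event, standing in for struct input_event. */
struct in_event { int type; int code; int value; };

/* A "filtering" block: may rewrite the event in place, and returns
 * nonzero if the event should be passed on to the next block. */
typedef int (*filter_fn)(struct in_event *ev);

/* Example block: remap a hardware multimedia key code to the virtual
 * key code that applications see (numbers are invented). */
static int map_multimedia_key(struct in_event *ev)
{
    if (ev->type == 1 /* key */ && ev->code == 200 /* hw volume-up */)
        ev->code = 115; /* virtual volume-up code */
    return 1; /* always pass through */
}

/* Example block: hide a sensor stream from ordinary keyboard/mouse
 * applications by dropping its events. */
static int drop_sensor_events(struct in_event *ev)
{
    return ev->type != 3 /* sensor */;
}

/* Run one event through the chain; returns nonzero if the event
 * survives and should be delivered to the virtual input device. */
static int run_chain(filter_fn *chain, size_t n, struct in_event *ev)
{
    for (size_t i = 0; i < n; i++)
        if (!chain[i](ev))
            return 0;
    return 1;
}
```

A compositor-side library could let users plug arbitrary blocks (gesture recognizers, Kinect decoders, sensor translators) into such a chain, while ordinary applications keep seeing a plain virtual keyboard/mouse at the end of it.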
I really wish to hear some feedback/brainstorming for the handling of input.
More information about the wayland-devel mailing list