Input Method Support for Wayland
Kristian Høgsberg
hoegsberg at gmail.com
Thu Jun 21 14:33:22 PDT 2012
On Thu, Jun 21, 2012 at 09:52:16PM +0200, Jan Arne Petersen wrote:
> Last week, I was implementing a little bit from the Input Method System
> proposal (https://wiki.maliit.org/Wayland_Input_Method_System_Proposal). See
> my blog post:
> http://blog.jpetersen.org/2012/06/20/text-input-method-support-in-wayland/
>
> It does not contain the really interesting stuff like integration with hardware
> keyboard events, keyboard focus and wl_seat yet, but it could still serve as a
> baseline for further work on Wayland input method system integration.
I like it! I've gone ahead and committed it all, and I put a couple
of fixes on top to prevent moving/resizing/rotating of the keyboard
surface and to prevent it from getting keyboard focus itself. The
protocol is all in weston now, which means it's not official and is a
weston-internal thing at this point. We can play around with it a bit
and figure out how it should work, and then move the relevant parts to
wayland eventually. Mainly I think that the text_model interface
needs to go there, while the specifics of integrating the OSK can stay
a weston-specific thing for a little longer.
The protocol looks good, with the caveat that I don't fully understand
all the preedit stuff. Most crucial is the text_model interface,
since that's what all apps are going to interface with. A few thoughts:
- Should text_model.activate take a surface so we can hide the OSK if
that surface disappears or we focus another surface that doesn't
have an active text_model? Maybe after a slight timeout, in case
we give focus to another surface that also needs the OSK. Or is
that what the surface arg is for in input_method.create_text_model,
and those things are just not hooked up? What about enter/leave
type events for when another surface is focused (and the text_model
deactivated by the compositor)? There's a rough sketch of what I
have in mind after this list.
- input_method.commit_string should only be available to input method
providers. As it is now, any client can bind to input_method and
start injecting text into whoever has the active text_model. Could
we move it to the input_panel interface?
- The text_model is specific to a wl_seat, isn't it? A wl_seat is
roughly (at least for the purposes of this discussion) a keyboard
focus, and if we support multiple keyboard foci, they'd each need a
text_model, no?
- Is set_micro_focus a way to indicate the cursor location in the
text? We have a simple "text_cursor_position" protocol so that the
screen zoom can track the text cursor; it sounds like we should use
this request instead?
- How would it work if an OSK wants to hide itself (like a
discard/close button in the keyboard)? I guess we can just either
destroy the surface or attach a NULL buffer; see the second sketch
after this list.
- We don't use the output arg in input_panel.set_surface.
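To make the activate/enter/leave idea a bit more concrete, here's
roughly what the client-side stubs could look like if activate named a
seat and a surface and the model got focus events. None of these names
exist in the protocol today and wayland-scanner would generate the
real stubs, so treat this purely as a sketch:

/* Hypothetical client-side API for a revised text_model interface;
 * every name below is made up for illustration. */

struct wl_seat;
struct wl_surface;
struct text_model;

struct text_model_listener {
        /* the compositor gives this text_model text focus on 'surface' */
        void (*enter)(void *data, struct text_model *text_model,
                      struct wl_surface *surface);
        /* focus left the surface; the compositor could hide the OSK
         * after a short timeout unless another text_model activates */
        void (*leave)(void *data, struct text_model *text_model);
};

/* activate names the seat and the surface the model is tied to, so
 * the compositor can track keyboard focus per seat and hide the OSK
 * when the surface goes away */
void text_model_activate(struct text_model *text_model,
                         struct wl_seat *seat,
                         struct wl_surface *surface);

Something along those lines would also cover the wl_seat question
above.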
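For the hide-itself case, on the client side it's just the usual
unmap-or-destroy choice, something like this (an illustrative helper,
not actual weston-keyboard code):

#include <wayland-client.h>

/* Dismiss the keyboard when its close button is hit: either unmap the
 * surface by attaching a NULL buffer, or destroy it outright.  The
 * keep_surface flag is only here to show both options. */
static void
dismiss_keyboard(struct wl_surface *surface, int keep_surface)
{
        if (keep_surface)
                wl_surface_attach(surface, NULL, 0, 0); /* unmaps */
        else
                wl_surface_destroy(surface);
}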
Regarding the implementation:
- Add keyboard to libexec_PROGRAMS, call it weston-keyboard and fork
it on demand from shell.c (similar to how we launch the screensaver;
look for weston_client_launch). Verify that the right client binds,
like we do for desktop-shell. A rough sketch of what I mean is below.
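Roughly something along these lines; only weston_client_launch() and
the screensaver pattern it copies are real, the struct and helper
names below are made up and the callback signature is from memory:

/* compositor.h is weston's internal header; it declares
 * weston_client_launch() and struct weston_process */
#include "compositor.h"

/* Hypothetical shell-side state; in shell.c this would hang off
 * struct desktop_shell next to the screensaver bits. */
struct input_panel_child {
        struct wl_client *client;
        struct weston_process process;
};

static void
keyboard_sigchld(struct weston_process *process, int status)
{
        /* the child went away; clear the pid so we relaunch on demand */
        process->pid = 0;
}

/* Fork weston-keyboard on demand, same pattern as launch_screensaver() */
static void
launch_keyboard(struct weston_compositor *ec,
                struct input_panel_child *child)
{
        if (child->process.pid != 0)
                return;

        child->client =
                weston_client_launch(ec, &child->process,
                                     "/usr/libexec/weston-keyboard",
                                     keyboard_sigchld);
        /* real code would build the path from the configured libexec
         * directory instead of hardcoding it */
}

Then in the bind handler for input_panel we'd only accept the client
we launched and post an error for anybody else, just like
bind_desktop_shell() does for the desktop-shell client. That would
also take care of the commit_string concern above.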