Multiprocess rendering in wayland - webkitgtk+
Kristian Høgsberg
hoegsberg at gmail.com
Mon Jul 15 21:48:01 PDT 2013
On Mon, Jul 08, 2013 at 09:38:26AM +0200, Jonas Ådahl wrote:
> On Mon, Jul 8, 2013 at 8:05 AM, Iago Toral <itoral at igalia.com> wrote:
> > Hi,
> >
> > I am working on porting WebKitGTK+ to Wayland and we are having some
> > difficulties figuring out the proper way to deal with the multiprocess
> > architecture introduced with WebKit2.
> >
> > In WebKit2 we have two processes that are responsible for rendering the
> > contents of a webpage. The WebProcess takes care of parsing the HTML,
> > identifying the various layers that are part of it (each of which will be
> > rendered separately), and compositing all these layers to create the
> > final view of the page. This composition stage is done with OpenGL.
> > Once the composition is done, the other process (the UIProcess) needs a
> > way to access the results of the composition and paint them on the screen.
> >
> > In X11, this is achieved by having the WebProcess render the composition
> > results to an offscreen XWindow and sharing the XWindow ID between the two
> > processes. XComposite is used to redirect the XWindow to a pixmap. The
> > pixmap is painted in the UIProcess.
> >
> > As far as we know, there is no API in Wayland to allow two different
> > processes to share a surface, so we are not sure if the architecture I
> > describe above is currently supported in Wayland.
> >
> > So I guess my questions are:
> > - Is there a way to share a surface between two processes?
> > - If not, is there a way to implement this architecture in Wayland as it is
> > now?
> > - Would it be possible/interesting to add surface sharing API to Wayland so
> > that it supports this type of architecture naturally?
>
> I proposed an extension[0] for solving this a while back, but since
> then as far as I know the general consensus has been to use nested
> compositing[1] for sharing surfaces between processes. The nested
> compositing is possible now, but if I remember correctly, it will
> require an extra draw, as there is no Wayland EGL API for directly
> providing buffers from a nested client to a surface of the host
> client.
It's actually almost identical to the sharing that WebKit2 does under
X - same number of copies. The WebProcess will render into an
EGLSurface (just like under X, where it renders to an EGLSurface for a
redirected window). On eglSwapBuffers(), the compositor is notified
that new content is available and can bind the color buffer as a
texture, similar to how a compositor is notified of damage for a
window under X.
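
To make that last step concrete, here's roughly what the binding can
look like on the UIProcess side. This is a sketch, not code lifted from
the example: the helper name is mine, and it assumes
eglBindWaylandDisplayWL() has already been called on the nested
wl_display so that the EGL_WL_bind_wayland_display path is available.

#include <wayland-server.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

/* Wrap the wl_buffer the WebProcess attached in an EGLImage and use it
 * as the backing store of a GL texture.  No pixels are copied here; the
 * texture samples the same memory the WebProcess rendered into. */
static GLuint
texture_from_client_buffer(EGLDisplay egl_display,
                           struct wl_resource *buffer_resource)
{
        static PFNEGLCREATEIMAGEKHRPROC create_image;
        static PFNGLEGLIMAGETARGETTEXTURE2DOESPROC image_target_texture_2d;
        EGLImageKHR image;
        GLuint tex;

        if (!create_image) {
                create_image = (PFNEGLCREATEIMAGEKHRPROC)
                        eglGetProcAddress("eglCreateImageKHR");
                image_target_texture_2d =
                        (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)
                        eglGetProcAddress("glEGLImageTargetTexture2DOES");
        }

        image = create_image(egl_display, EGL_NO_CONTEXT,
                             EGL_WAYLAND_BUFFER_WL,
                             (EGLClientBuffer) buffer_resource, NULL);
        if (image == EGL_NO_IMAGE_KHR)
                return 0;

        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        image_target_texture_2d(GL_TEXTURE_2D, image);

        return tex;
}
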
There are a few important things to notice in the nested example:
- We establish a direct connection between WebProcess and UIProcess,
so it's even a little more efficient under Wayland since you don't
have to go through the X server to notify the UIProcess when the
content is changed.
- We explicitly avoid setting up a listen socket and instead pass the
fd to nested-client (the WebProcess) by leaving it open across exec
and setting the WAYLAND_SOCKET environment variable to the fd number.
wl_display_connect() in the client will look for this and use that
fd when possible instead of trying to locate a listen socket (see
the sketch after this list).
- We're using the same mechanism for buffer sharing that weston and
other full compositors use (Wayland EGL and the
EGL_WL_bind_wayland_display extension). This way, we can reuse the
hw-enablement that we did to enable weston in the first place.
- nested.c is a toytoolkit client, so it uses window.c and cairo-egl;
nested-client.c is a standalone EGL/GLES2 wayland client.
- There's a lot of code in the example that is required to get it up
and running, but isn't directly related to the nested functionality:
all the toytoolkit code in nested.c and all the EGL/GLES2
boilerplate in nested-client.c. In something like WebKit2, that
boilerplate is already in place.
- There is some kind of issue in there where nested appears to run at
only half the frame rate. I suspect we're waiting for the frame
events or sending them out too late or something - it's not that
the nested compositor model is inherently slow.
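
For reference, here's a minimal sketch of the WAYLAND_SOCKET hand-off
mentioned above (error handling and the surrounding compositor code are
omitted, and the function name is mine, not taken from nested.c): the
nested compositor creates a socketpair, keeps one end as a regular
wl_client, and exposes the other end to the child through the
environment.

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/socket.h>
#include <wayland-server.h>

static pid_t
launch_nested_client(struct wl_display *display, const char *path)
{
        int sv[2], client_fd;
        char fd_str[16];
        pid_t pid;

        /* One end stays in the nested compositor, the other is
         * inherited by the child across exec. */
        socketpair(AF_UNIX, SOCK_STREAM | SOCK_CLOEXEC, 0, sv);

        pid = fork();
        if (pid == 0) {
                /* SOCK_CLOEXEC marked both ends close-on-exec; dup()
                 * gives us a descriptor without that flag so it
                 * survives exec. */
                client_fd = dup(sv[1]);
                snprintf(fd_str, sizeof fd_str, "%d", client_fd);
                setenv("WAYLAND_SOCKET", fd_str, 1);
                execl(path, path, (char *) NULL);
                _exit(EXIT_FAILURE);
        }

        close(sv[1]);
        /* The child shows up as an ordinary client on our display. */
        wl_client_create(display, sv[0]);

        return pid;
}

On the other side, wl_display_connect(NULL) notices WAYLAND_SOCKET in
the environment and talks to the nested compositor over that fd, so no
socket in the filesystem is ever involved.
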
> Regarding the mentioned extension, I had a hacked-up
> proof-of-concept working, but have not continued working on it,
> considering that nested compositing and the added EGL API are supposed
> to be the way forward. If I have understood the situation wrong, I'd
> be happy to continue with the previously proposed protocol extension.
Your extension made a lot of sense in that it allowed an application
to forward buffers to the main compositor to let it do the final
compositing (and/or color conversion) and save a copy. However, in
too many cases you want to also clip or scale or otherwise further
process the content. So I think it makes sense to first go with the
nested approach to let the client (UIProcess) use the buffer as a
texture, and then later explore a way to create a subsurface and
attach the buffer as contents.
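
Purely as an illustration of that later direction (nothing like this is
wired up today, and the forwarded wl_buffer below is hypothetical), the
UIProcess side could end up looking something like this, built on the
existing sub-surface protocol:

#include <stdint.h>
#include <wayland-client.h>

/* Hypothetical: 'web_buffer' stands in for a wl_buffer that some future
 * mechanism lets the UIProcess obtain for the WebProcess output on its
 * own connection to the main compositor. */
static struct wl_subsurface *
embed_web_view(struct wl_compositor *compositor,
               struct wl_subcompositor *subcompositor,
               struct wl_surface *parent,
               struct wl_buffer *web_buffer,
               int32_t x, int32_t y)
{
        struct wl_surface *surface;
        struct wl_subsurface *subsurface;

        surface = wl_compositor_create_surface(compositor);
        subsurface = wl_subcompositor_get_subsurface(subcompositor,
                                                     surface, parent);
        wl_subsurface_set_position(subsurface, x, y);

        /* The main compositor composites the WebProcess output
         * directly; the UIProcess never has to texture from it.  The
         * position and the new content take effect once the parent
         * surface is committed as well. */
        wl_surface_attach(surface, web_buffer, 0, 0);
        wl_surface_damage(surface, 0, 0, INT32_MAX, INT32_MAX);
        wl_surface_commit(surface);

        return subsurface;
}
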
Kristian
> [0] http://lists.freedesktop.org/archives/wayland-devel/2013-March/008093.html
> [1] http://cgit.freedesktop.org/wayland/weston/tree/clients/nested.c
>
> Jonas
>
> >
> > We would really love to see WebKitGTK+ fully ported to Wayland and this is
> > probably the major obstacle we have at the moment, so we are really looking
> > forward to seeing the options we have available to make this happen.
> >
> > Also, notice I am adding two colleagues of mine to the CC who are not
> > subscribed to this list. If you reply, please remember to keep them in the
> > loop.
> >
> > Iago
> _______________________________________________
> wayland-devel mailing list
> wayland-devel at lists.freedesktop.org
> http://lists.freedesktop.org/mailman/listinfo/wayland-devel