Composite redraw speedup?

Carsten Haitzler raster at rasterman.com
Wed Feb 12 13:25:49 UTC 2020


On Wed, 12 Feb 2020 12:36:31 +0100 Egil Möller <egil at innovationgarage.no> said:

> 
> > hint: don't glXCreatePixmap() every time you render. do it only when you
> > first start compositing a window (i.e. on map) and if the window
> > resizes. ...
> I tried this at first, but couldn't get it to work under Xephyr, i.e.
> the image in the texture didn't update when the window changed. Any ideas?

Don't use Xephyr for real testing of anything involving GL + compositing and
expect it to reflect how it'll work outside. :) I never even bother with GL in
Xephyr; I go back to the software path of XGetImage fun, do all the rendering
on the CPU, then push it all back when done.
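
To spell the earlier hint out, it's roughly something like the below. This is
an untested sketch - CompWin and the comp_win_* names are made up for
illustration, and fbconfig selection, error handling and the cleanup on
unmap/resize (glXDestroyPixmap() + XFreePixmap()) are left out:

#include <GL/glx.h>
#include <X11/extensions/Xcomposite.h>

/* assume these were fetched once at startup with glXGetProcAddress() */
static PFNGLXBINDTEXIMAGEEXTPROC    glXBindTexImageEXT;
static PFNGLXRELEASETEXIMAGEEXTPROC glXReleaseTexImageEXT;

typedef struct {
   Window    win;
   Pixmap    pixmap;     /* named window pixmap */
   GLXPixmap glxpixmap;  /* created once, reused every frame */
   GLuint    tex;
} CompWin;

static const int pixmap_attrs[] = {
   GLX_TEXTURE_TARGET_EXT, GLX_TEXTURE_2D_EXT,
   GLX_TEXTURE_FORMAT_EXT, GLX_TEXTURE_FORMAT_RGBA_EXT,
   None
};

/* on MapNotify (and again if the window resizes) */
void comp_win_bind(Display *dpy, GLXFBConfig fbc, CompWin *cw)
{
   cw->pixmap    = XCompositeNameWindowPixmap(dpy, cw->win);
   cw->glxpixmap = glXCreatePixmap(dpy, fbc, cw->pixmap, pixmap_attrs);
}

/* per frame: only bind, draw, release - no pixmap creation here */
void comp_win_draw(Display *dpy, CompWin *cw)
{
   glBindTexture(GL_TEXTURE_2D, cw->tex);
   glXBindTexImageEXT(dpy, cw->glxpixmap, GLX_FRONT_LEFT_EXT, NULL);
   /* ... draw the textured quad for this window ... */
   glXReleaseTexImageEXT(dpy, cw->glxpixmap, GLX_FRONT_LEFT_EXT);
}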

> > also... don't handle every damage event as a draw. accumulate them for some
> > time (until the next frame tick/draw) and then draw it all in one go.
> 
> I do this, sort of. When I get a new damage event, I check if I'm
> already drawing every frame, and if so, just wait till next frame. If
> not, I start drawing every frame for a while...
> 
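
What I mean by accumulating is just something like this rough sketch -
repaint() stands in for your actual drawing, and a real compositor would keep
a proper region rather than a single bounding box:

#include <X11/extensions/Xdamage.h>

typedef struct { int x, y, w, h; int dirty; } DirtyRegion;

static DirtyRegion pending;    /* union of all damage since the last draw */

extern void repaint(const DirtyRegion *damaged);  /* your drawing code */

/* grow the pending bounding box with a new damage rect */
static void pending_add(int x, int y, int w, int h)
{
   if (!pending.dirty)
     {
        pending.x = x; pending.y = y; pending.w = w; pending.h = h;
        pending.dirty = 1;
        return;
     }
   int x2 = pending.x + pending.w, y2 = pending.y + pending.h;
   if (x < pending.x) pending.x = x;
   if (y < pending.y) pending.y = y;
   if (x + w > x2) x2 = x + w;
   if (y + h > y2) y2 = y + h;
   pending.w = x2 - pending.x;
   pending.h = y2 - pending.y;
}

/* on XDamageNotify: just remember the rect, don't draw */
void on_damage(XDamageNotifyEvent *ev)
{
   pending_add(ev->area.x, ev->area.y, ev->area.width, ev->area.height);
}

/* on the frame tick (vsync, or a clock-based select() timeout) */
void on_frame_tick(void)
{
   if (!pending.dirty) return;
   repaint(&pending);          /* one draw covering everything damaged */
   pending.dirty = 0;
}
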
> > follow the model of "i shadow copy/store all the server state locally
> > whenever i can and use what i have stored locally on my side until i am
> > told it has changed or circumstances are that what i have is invalid and
> > needs to be thrown out or updated". 
> 
> Already do this (this is what the property* files in the sources are
> for). But since I control animations with another program updating e.g.
> coordinate/size/transform matrix properties on windows to animate, I end
> up having to read some select properties fairly often.

Don't do this. The moment you do this kind of thing "with another program" you
have the nasties of latency and scheduling wakeups/hiccups to worry about, and
you also now have multiple ping-pongs through the X server. The other app has
to send protocol to X to change the property, you have to get the event, then
you have to do a round trip to read it. It's all asking for pain. Do the
control inside your compositor and only control at a high level from
elsewhere, e.g. "begin this animation sequence from state X to state Y over
time Z now", and then have the compositor drive the animation itself from
there. If you absolutely must do this from another program frame by frame,
then have a direct connection, e.g. via stdin/out or a socket, between that
program and the compositor to avoid bouncing through the X server.
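
The compositor's existing select() loop can just watch that extra fd next to
the X connection fd - a rough sketch only, where read_command() and
handle_x_event() stand in for whatever you already have:

#include <sys/select.h>
#include <X11/Xlib.h>

extern void read_command(int fd);          /* "start anim X -> Y over Z"... */
extern void handle_x_event(Display *dpy);  /* damage, map/unmap, ... */

void main_loop(Display *dpy, int cmd_fd)   /* cmd_fd: stdin or a socket */
{
   int xfd = ConnectionNumber(dpy);

   for (;;)
     {
        fd_set rfds;

        FD_ZERO(&rfds);
        FD_SET(xfd, &rfds);
        FD_SET(cmd_fd, &rfds);
        /* a timeout here (instead of NULL) is where the clock-based
         * animation/frame ticks would come from */
        select((xfd > cmd_fd ? xfd : cmd_fd) + 1, &rfds, NULL, NULL, NULL);

        if (FD_ISSET(cmd_fd, &rfds)) read_command(cmd_fd);
        while (XPending(dpy)) handle_x_event(dpy);
     }
}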

> > just an aside. you want to make a "minimal compositing wm" and also want it
> > fast and efficient and also want to do smooth animation and so on. this is
> > why toolkits exist that have already solved these problems for you. :) they
> > have gone and implemented these select based main loops and have glued in
> > xlib for you and have implemented all the pixmap compositing glue to do it
> > efficiently and have handled "only redraw the regions that update" and
> > glued in vsync if available (or otherwise use the clock-based select
> > timeouts) for animation. as you do more and more you will re-implement more
> > and more of this. you might want to look at how toolkits do this or just
> > use one. :)
> 
> Which toolkits for making compositors are there? I haven't even heard of
> one... I've tried to collect the blog posts and documentation I've found
> on the subject here:
> https://github.com/redhog/InfiniteGlass/blob/master/docs/REFERENCES.md -
> it hasn't exactly been the easiest info to find, compared to other APIs
> and problem spaces in Linux userland development...

EFL was written to do just that. It's the set of libraries behind
Enlightenment. It also allows building applications, like GTK+ and friends do.
It's basically large chunks of E put into shared libs as opposed to living in
the compositor itself. Of course the WM has a lot of code too, but the stuff
that is more generally useful is in the libraries.

> I did think about using the Gtk/GObject mainloop for a while, since I
> already drag in rsvg, but decided so far that the overhead in
> assumptions compared to smaller code wouldn't be worth it. But it
> doesn't do the composition stuff from what I know, just ordinary window
> event management, timeouts etc.

EFL does. Evas (the canvas) does all the rendering (either in software or GL)
and handles "native surfaces", which is its term for adopting things like
pixmaps into the scene graph as objects. It glues in the EGL or GLX pixmaps
for you, handles partial updates (buffer age...), etc. It may not be your cup
of tea, but it does these kinds of things.
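
e.g. adopting a window's pixmap into the canvas is roughly this - a
from-memory sketch, so check the Evas headers for the exact fields before
trusting it:

#include <Evas.h>

/* wrap an X pixmap as an Evas image object; Evas then handles the
 * GLX/EGL binding and partial updates for you */
Evas_Object *adopt_pixmap(Evas *canvas, void *visual, unsigned long pixmap,
                          int w, int h)
{
   Evas_Object *o = evas_object_image_filled_add(canvas);
   Evas_Native_Surface ns;

   ns.version = EVAS_NATIVE_SURFACE_VERSION;
   ns.type = EVAS_NATIVE_SURFACE_X11;
   ns.data.x11.visual = visual;
   ns.data.x11.pixmap = pixmap;

   evas_object_image_size_set(o, w, h);
   evas_object_image_native_surface_set(o, &ns);
   evas_object_show(o);
   return o;
}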

Unfortunately it's big and the sample of "how do I write a WM/compositor using
it" is "look at Enlightenment" which is absolutely not small or simple.

> Well... minimal and minimal. I guess I've given up a bit on the code
> being so minimal :( I have however tried as much as possible to not put

You've accepted reality at least :) You're getting into a rabbit-hole of "It'll
grow and grow and the more you try and polish it up the more it will grow". :)

> any policy stuff in the renderer, but instead separate that all out to a
> python program that handles keyboard/mouse bindings, since that's much
> lower bandwidth, but is nice if it's easier to change for me/users.

My suggestion above: have that drive animation not in detail but at a higher
level, with source/destination geometry/state info, timelines, and some kind
of transition/interpolation path (linear, ease in/out, or just provide a
parameterized curve, etc.), then let the compositor "run it" once told what it
needs to do. Of course allow that to be interrupted and some new change
initiated as needed by input etc.
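
The per-frame work inside the compositor then boils down to evaluating the
timeline, roughly like this sketch - the struct and function names here are
made up for illustration:

typedef struct { double x, y, w, h; } Geom;

typedef struct {
   Geom   from, to;
   double start, duration;      /* seconds */
   double (*curve)(double t);   /* maps 0..1 -> 0..1; NULL = linear */
} Anim;

/* one possible ease in/out curve (smoothstep) */
static double ease_in_out(double t) { return t * t * (3.0 - 2.0 * t); }

/* call once per frame with the current time; apply the result to the window */
Geom anim_eval(const Anim *a, double now)
{
   double t = (now - a->start) / a->duration;
   Geom g;

   if (t < 0.0) t = 0.0;
   if (t > 1.0) t = 1.0;
   if (a->curve) t = a->curve(t);
   g.x = a->from.x + (a->to.x - a->from.x) * t;
   g.y = a->from.y + (a->to.y - a->from.y) * t;
   g.w = a->from.w + (a->to.w - a->from.w) * t;
   g.h = a->from.h + (a->to.h - a->from.h) * t;
   return g;
}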

-- 
------------- Codito, ergo sum - "I code, therefore I am" --------------
Carsten Haitzler - raster at rasterman.com
