xserver on OpenGL

Keith Packard keithp@keithp.com
Fri, 05 Dec 2003 13:57:13 -0800


Around 13 o'clock on Dec 5, Jon Smirl wrote:

> Can you simulate a framebuffer environment on X/DRI by wrapping an existing X
> API? It wouldn't matter how slow it is since the ultimate goal would be to
> remove it.

Sure, given sufficient time.  The hard part is not knowing when to fetch
and put pixels, but computing a reasonably tight bound on the area each
operation affects.  With the new Damage infrastructure, that might even
be reasonably easy to implement.  Without this optimization, performance
of the resulting system will be abysmal, as each operation fetches the
entire destination...
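
Something like this sketch is what I have in mind for the fetch/put path
(box_union, fetch_rect and put_rect are made-up names; the real Damage
code tracks full regions rather than a single box, and GL's
bottom-to-top row order still has to be flipped into X's):

    #include <GL/gl.h>

    typedef struct { int x1, y1, x2, y2; } Box;

    /* Grow "damage" to cover "op"; an empty box has x1 >= x2 */
    static void
    box_union (Box *damage, const Box *op)
    {
        if (damage->x1 >= damage->x2) { *damage = *op; return; }
        if (op->x1 < damage->x1) damage->x1 = op->x1;
        if (op->y1 < damage->y1) damage->y1 = op->y1;
        if (op->x2 > damage->x2) damage->x2 = op->x2;
        if (op->y2 > damage->y2) damage->y2 = op->y2;
    }

    /* Pull just the damaged rectangle out of the GL color buffer so
     * the software code can render into it */
    static void
    fetch_rect (const Box *b, int screen_height, GLubyte *pixels)
    {
        glReadPixels (b->x1, screen_height - b->y2,
                      b->x2 - b->x1, b->y2 - b->y1,
                      GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    }

    /* Push the rendered rectangle back; assumes an orthographic
     * projection mapping GL coordinates to window pixels */
    static void
    put_rect (const Box *b, int screen_height, const GLubyte *pixels)
    {
        glRasterPos2i (b->x1, screen_height - b->y2);
        glDrawPixels (b->x2 - b->x1, b->y2 - b->y1,
                      GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    }

Even this single-box version keeps each operation from dragging the
whole screen across the bus.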

But, a useless X server is not what I'm after here.  My short-term goal
is an X server with very little acceleration implemented in OpenGL, with
the bulk of the rendering code running against the frame buffer.  That
way I can *use* the X server while working on the OpenGL optimizations,
instead of just playing with it.
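
The "very little acceleration" could be as small as catching solid
fills; something like this sketch (fill_rect_gl is a made-up name, and
it assumes the same pixel-aligned orthographic projection as above),
with everything else falling back to the fb code:

    #include <GL/gl.h>

    /* Hypothetical accelerated path: solid fills go straight to GL;
     * anything fancier falls through to the software fb rendering */
    static void
    fill_rect_gl (int x, int y, int w, int h,
                  GLubyte red, GLubyte green, GLubyte blue,
                  int screen_height)
    {
        glColor3ub (red, green, blue);
        glRecti (x, screen_height - (y + h),
                 x + w, screen_height - y);
    }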

While my long-term goal is to move to a complete OpenGL backend, I don't
know how long that will take, and I've got a lot of other work to do in
the meantime.  For that work, I need a functional X server.

I think the issue we keep skirting is where it will be easier to hack in 
this transient code. For me, it's obviously less work if the kludges are 
in OpenGL, while for you it's obviously easier to push work into the X 
server.  In either case, there will be some scaffolding code written that 
we expect to throw away, which seems like a waste of time to both of us.

I guess my two arguments are:

 1)	I'm not likely to start a pure OpenGL X server in the next few
	months.  I would gladly jump to help such an implementation if
	others had the time and energy to get it working well enough for
	daily use.

 2)	Mapping the frame buffer provides a system with usable performance
	immediately; we know what dumb frame buffers are capable of, and
	with only a tiny amount of acceleration they make quite usable 2D
	environments.

From your perspective, I see two strong counter-arguments:

 1)	It's a horrible, unnatural kludge (probably prohibited by law
	in many states).

 2)	An X server that doesn't map the frame buffer can use *any*
	existing OpenGL implementation for output, including the nVidia
	and ATI binary drivers.

-keith