Proposed break in libGL / DRI driver ABI
Brian Paul
brian.paul at tungstengraphics.com
Tue Apr 5 13:11:24 PDT 2005
Roland Mainz wrote:
> Ian Romanick wrote:
>
>>For X.org 6.9 / 7.0 I would like to break the existing libGL / DRI
>>driver interface. There is a *LOT* of crap hanging around in both libGL
>>and in the DRI drivers that exists *only* to maintain backwards
>>compatibility with older versions of the interface. Since it's crap, I
>>would very much like to flush it.
>>
>>I'd like to cut this stuff out for 7.0 for several main reasons:
>>
>>- A major release is a logical time to make breaks like this.
>>
>>- Bit rot. Sure, we /assume/ libGL and the DRI drivers still actually
>>work with older versions, but how often does it actually get tested?
>>
>>- Code aesthetics. Because of the backwards compatibility mechanisms
>>that are in place, especially in libGL, the code can be a bit hard to
>>follow. Removing that code would, in a WAG estimate, eliminate at least
>>a couple hundred lines of code. It would also eliminate a number of
>>'#ifdef DRI_NEW_INTERFACE_ONLY' blocks.
>>
>>What I'm proposing goes a bit beyond '-DDRI_NEW_INTERFACE_ONLY=1', but
>>that is a start. In include/GL/internal/dri_interface.h (in the Mesa
>>tree) there are a number of methods that get converted to 'void *' if
>>DRI_NEW_INTERFACE_ONLY is defined. I propose that we completely remove
>>them from the structures and rename some of the remaining methods. For
>>example, __DRIcontextRec::bindContext and __DRIcontextRec::bindContext2
>>would be removed, and __DRIcontextRec::bindContext3 would be renamed to
>>__DRIcontextRec::bindContext.
>>
>>Additionally, there are a few libGL-private structures in
>>src/glx/x11/glxclient.h that, due to binary compatibility issues with
>>older versions of the interface, can't be changed. Eliminating support
>>for those older interfaces would allow some significant cleaning in
>>those structures. Basically, all of the stuff in glxclient.h with
>>DEPRECATED in the name would be removed. Other, less important, changes
>>could also be made to __GLXcontextRec.
>
>
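To make the proposed cleanup concrete, here is a rough before/after sketch of the context structure described above; the argument lists and helper typedefs are hypothetical stand-ins, not the actual dri_interface.h declarations.

    #include <GL/gl.h>

    /* Hypothetical stand-ins for the real DRI types; illustration only. */
    typedef struct __DRIdrawableRec __DRIdrawable;
    typedef struct __DRIcontextRec  __DRIcontext;

    /* Before the break: deprecated slots survive (or are stubbed to
     * void * under DRI_NEW_INTERFACE_ONLY) purely for binary
     * compatibility with older libGL versions. */
    struct __DRIcontextRec_before {
        void *bindContext;               /* deprecated entry point */
        void *bindContext2;              /* deprecated entry point */
        GLboolean (*bindContext3)(__DRIcontext *ctx,
                                  __DRIdrawable *draw,
                                  __DRIdrawable *read);
    };

    /* After the break: the deprecated slots are gone and the newest
     * entry point takes back the plain name. */
    struct __DRIcontextRec_after {
        GLboolean (*bindContext)(__DRIcontext *ctx,
                                 __DRIdrawable *draw,
                                 __DRIdrawable *read);
    };
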
> Another item would be to look into what's required to support visuals
> beyond 24-bit RGB (like 30-bit TrueColor visuals) ... someone on IRC
> (AFAIK ajax (if I don't mix up the nicks again :)) said that this may
> require an ABI change, too...
I doubt an ABI change would be needed for that.
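For what it's worth, on the client side a deep-color configuration is just a matter of asking GLX for 10 bits per channel; whether the server, libGL and the driver can actually deliver one is the real question. A minimal sketch (assumes GLX 1.3, error handling omitted):

    #include <X11/Xlib.h>
    #include <GL/glx.h>
    #include <stdio.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;

        /* Ask for 10 bits each of red, green and blue (a "30-bit" visual). */
        static const int attribs[] = {
            GLX_RENDER_TYPE,   GLX_RGBA_BIT,
            GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
            GLX_RED_SIZE,      10,
            GLX_GREEN_SIZE,    10,
            GLX_BLUE_SIZE,     10,
            None
        };

        int count = 0;
        GLXFBConfig *configs =
            glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &count);

        printf("%d fbconfigs with >= 10 bits per channel\n", count);

        if (configs)
            XFree(configs);
        XCloseDisplay(dpy);
        return 0;
    }
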
> When I look at xc/extras/Mesa/src/mesa/main/config.h I see more items on
> my wishlist: Would it be possible to increase |MAX_WIDTH| and
> |MAX_HEIGHT| (and the matching texture limits of the software
> rasterizer) to 8192 to support larger displays (DMX, Xinerama and Xprint
> come to mind)?
If you increase MAX_WIDTH/HEIGHT too far, you'll start to see
interpolation errors in triangle rasterization (the software
routines). The full explanation is long, but basically there needs to
be enough fractional bits in the GLfixed datatype to accommodate
interpolation across the full viewport width/height.
In fact, we may have already gone too far by setting MAX_WIDTH/HEIGHT
to 4096 while the GLfixed type has only 11 fractional bits. I haven't
heard any reports of bad triangles so far, though.
But there probably aren't too many people generating 4Kx4K images.
Before increasing MAX_WIDTH/HEIGHT, someone should do an analysis of
the interpolation issues to see what side-effects might pop up.
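To put rough numbers on it: when a per-pixel delta is converted to fixed
point it is rounded to the nearest multiple of 1/2048 (11 fractional
bits), and that rounding error is multiplied by the span width by the
time the last pixel is reached. A back-of-the-envelope sketch, assuming
GLfixed keeps its current 11-bit fractional part:

    #include <stdio.h>

    /* Assumed to match Mesa's GLfixed layout: 11 fractional bits. */
    #define FIXED_SHIFT 11
    #define FIXED_ONE   (1 << FIXED_SHIFT)

    int main(void)
    {
        /* Worst-case rounding error of a per-pixel delta is half a
         * fixed-point step; across a full span it accumulates to
         * width / (2 * FIXED_ONE) units of whatever is interpolated. */
        const int widths[] = { 2048, 4096, 8192, 16384 };

        for (int i = 0; i < 4; i++) {
            int    width     = widths[i];
            double max_error = (double)width / (2.0 * FIXED_ONE);
            printf("MAX_WIDTH %5d: worst-case interpolation error "
                   "~%.2f units at the far edge\n", width, max_error);
        }
        return 0;
    }

At 4096 that worst case is already about one unit of the interpolated
quantity (e.g. one 8-bit color step); at 8192 it doubles.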
Finally, Mesa has a number of scratch arrays that get dimensioned to
[MAX_WIDTH]. Some of those arrays/structs are rather large already.
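Just to illustrate the scale (the structure below is hypothetical, only
mimicking the shape of those scratch arrays, not actual Mesa code):

    #include <stdio.h>
    #include <GL/gl.h>

    #define MAX_WIDTH 4096   /* try 8192 to see the effect */

    /* Hypothetical per-span scratch storage in the spirit of the
     * software rasterizer's span arrays: one entry per pixel. */
    struct span_scratch {
        GLfloat rgba[MAX_WIDTH][4];      /* colors         */
        GLfloat z[MAX_WIDTH];            /* depth          */
        GLfloat texcoord[MAX_WIDTH][4];  /* texture coords */
    };

    int main(void)
    {
        printf("MAX_WIDTH = %d -> %zu KiB of scratch per span structure\n",
               MAX_WIDTH, sizeof(struct span_scratch) / 1024);
        return 0;
    }

Bumping MAX_WIDTH to 8192 roughly doubles every such buffer.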
-Brian