2D antialiased graphics using OpenGL

Keith Packard keithp@keithp.com
Thu, 11 Dec 2003 16:45:31 -0800


Around 23 o'clock on Dec 11, "Martijn Sipkema" wrote:

> Then the framebuffer resolution should be increased if this is a problem...
> There already is consumer hardware that can do more than 8 bits per
> channel.

That's a great plan, but we can't simply mandate that and have it 
magically happen.  A majority of current hardware supports only 8 bits per 
component, which is (barely) sufficient when storing a non-linear image.
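
To put rough numbers on that (my own illustration, assuming a simple 2.2 
power-law encoding rather than the exact sRGB curve): the jump between the 
two darkest non-zero codes is far larger in linear light when the channel 
is stored linearly, and that dark end is exactly where banding shows up.

    /* Compare the linear-light step between the two darkest non-zero
     * 8-bit codes for linear storage vs. gamma-encoded storage.
     * Assumes a plain 2.2 power law; compile with -lm. */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double g = 2.2;
        double lin_step = 2.0/255 - 1.0/255;                /* codes already linear */
        double enc_step = pow(2.0/255, g) - pow(1.0/255, g); /* decode, then diff   */

        printf("linear 8-bit step near black:        %g\n", lin_step);
        printf("gamma-encoded 8-bit step near black: %g\n", enc_step);
        /* The gamma-encoded steps are hundreds of times smaller in linear
         * light near black, where the eye is most sensitive to banding. */
        return 0;
    }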

I know of hardware that can run at 10 bits per component, though that has 
the undesired side effect of eliminating the destination alpha bits needed 
for the beloved SRC_ALPHA_SATURATE rendering mode.
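
For reference, the classic setup that depends on those destination alpha 
bits looks roughly like this (the standard "red book" polygon-antialiasing 
recipe, not anything specific to the code under discussion):

    /* SRC_ALPHA_SATURATE blends with the factor min(As, 1 - Ad), so it
     * reads destination alpha back; a 10-10-10-2 framebuffer leaves only
     * 2 bits for that.  Assumes a current GL context. */
    #include <GL/gl.h>

    void setup_polygon_antialiasing(void)
    {
        glEnable(GL_POLYGON_SMOOTH);
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA_SATURATE, GL_ONE);
        /* polygons must then be drawn sorted front to back
         * for the coverage accumulation to come out right */
    }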

Sometime soon, we'll have floating point pixel values which can easily be 
made linear.

> I feel that if 8 bits per channel is not good enough, then one should use
> more, but I doubt it is really _that_ bad...

This sounds more like a full-employment plan for graphics card vendors 
than a practical solution.  For 2D graphics, the reality is that geometry 
is such an insignificant part of the current user experience that minor 
errors in compositing results are not as important as reducing banding in 
continuous tone images.  We can improve the appearance of geometric 
objects by taking advantage of hardware accelerated gamma-correct 
compositing, or by doing that operation in software (Jim Blinn even 
provides nice code snippets for that...)
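
For the software path, the shape of the operation is roughly this (a 
minimal sketch assuming a plain 2.2 power curve, not Blinn's actual 
snippet; real sRGB uses a slightly different curve with a linear toe):

    /* Gamma-correct OVER of one non-premultiplied source component onto a
     * destination component: decode to linear light, blend, re-encode. */
    #include <math.h>
    #include <stdint.h>

    static double  decode(uint8_t c) { return pow(c / 255.0, 2.2); }
    static uint8_t encode(double l)  { return (uint8_t)(pow(l, 1.0 / 2.2) * 255.0 + 0.5); }

    uint8_t over_gamma_correct(uint8_t src, uint8_t dst, double alpha)
    {
        double s   = decode(src);
        double d   = decode(dst);
        double out = s * alpha + d * (1.0 - alpha);  /* blend in linear light */
        return encode(out);
    }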

> I think a slightly too high gamma value is also used to compensate for
> the room lighting conditions.

Yeah, that's probably true.  Most color scientists I know would like to 
see the knobs on the front of the monitor ripped off, since those knobs 
frob the response curves of the CRT quite badly, making color calibration 
essentially impossible.

-keith