2D antialiased graphics using OpenGL

Billy Biggs vektor@dumbterm.net
Thu, 11 Dec 2003 16:02:22 -0600


Martijn Sipkema (msipkema@sipkema-digital.com):

> [...]
> > > If your CRT monitor has a gamma of 2.2 then setting the
> > > framebuffer 'gamma' to 2.2 using xgamma will give correct output
> > > of the framebuffer if the pixels in it are to be interpreted as
> > > linear, i.e.  describing the light intensity. This means that
> > > blending and smooth shading in OpenGL will work as expected.
> > 
> > Yes, that would be true, but nobody would use a gamma value of 2.2
> > with xgamma; that would be ridiculous.  xgamma is a simple way of
> > correcting for monitors so that they appear more like some common
> > standard though.  For example, if I determine that my monitor is
> > more like 2.5 rather than 2.2, then I can use an xgamma value of 0.9
> > or 1.1, whichever is correct, and things work out nicely.
> 
> I think Windows doesn't normally use a corrected framebuffer, I'm
> not sure though, but that doesn't mean we shouldn't use one...
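
  [Editorial aside: the "0.9 or 1.1, whichever is correct" question
quoted above can be settled with a line of arithmetic, assuming the
usual xgamma convention that a gamma value g installs a LUT of
x**(1/g), so the net display response becomes x**(monitor_gamma/g).]

```python
# Which xgamma value makes a gamma-2.5 monitor behave like 2.2?
# Assumes the standard X convention: xgamma value g installs a LUT
# of x**(1/g), so the net response is x**(monitor/g).

monitor = 2.5
target = 2.2

g = monitor / target   # ~1.14, i.e. the "1.1" option, not 0.9
net = monitor / g      # resulting effective gamma

print(round(g, 2))     # 1.14
print(round(net, 2))   # 2.2
```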

  You're trying to politicize gamma correction?

> > > If you need to display an image that is in the sRGB color space on
> > > a gamma corrected framebuffer you'd need to convert it to linear
> > > RGB color space.
> > > 
> > > That's the way I understand it, please correct me if I'm wrong...
> > 
> > Sounds reasonable, but I don't think you should assume that people
> > set up their framebuffers to be linear.  That's just nonsensical.
> > All images from digital cameras, images on the web, etc., are
> > almost always corrected for a gamma of about 2.2, on the
> > assumption that only minor corrections will be applied from then
> > on.
> 
> Well, those images can easily be converted to linear gamma, but when
> rendering in the framebuffer using OpenGL the result will only be
> correct when the framebuffer uses linear color values...
> 
> Also, IIRC most workstations _do_ set up the framebuffer to be
> (nearly) linear, i.e. somewhere between 1 and 1.5 (instead of 2.2).
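
  [Editorial sketch of the point under discussion: blending gives
different results depending on whether framebuffer values are linear
or gamma-encoded, so an image must be decoded to linear light before
OpenGL-style blending is correct.  Gamma 2.2 is used here as an
approximation of the sRGB transfer curve.]

```python
# Blend 100% white over black at 50% alpha, two ways.

GAMMA = 2.2

def to_linear(v):
    """Decode a gamma-2.2 encoded value (0..1) to linear light."""
    return v ** GAMMA

def to_encoded(v):
    """Encode a linear-light value (0..1) with gamma 2.2."""
    return v ** (1.0 / GAMMA)

def blend(a, b, alpha):
    """Standard alpha blend: alpha*a + (1-alpha)*b."""
    return alpha * a + (1.0 - alpha) * b

white, black, alpha = 1.0, 0.0, 0.5

# Naive: blend the encoded values directly (what blending into a
# gamma-encoded framebuffer does).
naive = blend(white, black, alpha)

# Correct: decode to linear, blend, re-encode for display.
correct = to_encoded(blend(to_linear(white), to_linear(black), alpha))

print(naive)    # 0.5
print(correct)  # ~0.73 -- noticeably brighter
```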

  As has been mentioned before, using a linear framebuffer causes
visible banding.  Furthermore, modern graphics cards can do
gamma-correct compositing and other colour interpolation operations;
this is part of DX9 (I think it's required for DX9) and is in GDI+.
Finally, most references about SGIs that I have seen use gamma values
of around 1.7, and I've never heard of an OS API using less than 1.4,
which is still something that you compensate for.
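
  [Editorial sketch of the banding claim: with only 8 bits per
channel, a linear framebuffer spends almost all of its code values on
the highlights and starves the shadows.  Counting how many of the 256
codes land in the darkest tenth of the displayed (gamma-2.2) range
makes the difference concrete.]

```python
# How many 8-bit codes cover the darkest 10% of display output?

GAMMA = 2.2

# Display output of 0.1 corresponds to linear light below:
threshold_linear = 0.1 ** GAMMA   # ~0.0063

# Codes in that region when the framebuffer stores linear values:
linear_codes = sum(1 for i in range(256)
                   if i / 255.0 <= threshold_linear)

# Codes in that region when it stores gamma-2.2 encoded values:
gamma_codes = sum(1 for i in range(256)
                  if (i / 255.0) ** GAMMA <= threshold_linear)

print(linear_codes)  # 2  -- hence visible banding in dark gradients
print(gamma_codes)   # 26 -- ~13x more steps in the shadows
```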

  -Billy