2D antialiased graphics using OpenGL

Martijn Sipkema msipkema@sipkema-digital.com
Thu, 11 Dec 2003 22:42:55 +0100


[...]
> > If your CRT monitor has a gamma of 2.2 then setting the framebuffer
> > 'gamma' to 2.2 using xgamma will give correct output of the
> > framebuffer if the pixels in it are to be interpreted as linear, i.e.
> > describing the light intensity. This means that blending and smooth
> > shading in OpenGL will work as expected.
> 
>   Yes, that would be true, but nobody would use a gamma value of 2.2 with
> xgamma; that would be ridiculous.  xgamma is a simple way of correcting
> for monitors so that they appear more like some common standard, though.
> For example, if I determine that my monitor is more like 2.5 rather than
> 2.2, then I can use an xgamma value of 0.9 or 1.1, whichever is correct,
> and things work out nicely.

I think Windows doesn't normally use a gamma-corrected framebuffer (I'm not
sure, though), but that doesn't mean we shouldn't...
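
To make the arithmetic concrete: as I understand it (this is an assumption
about the implementation), xgamma loads a ramp of the form out = in^(1/gamma),
and the CRT then responds roughly as intensity = out^monitor_gamma. A rough
sketch:

#include <math.h>
#include <stdio.h>

/* framebuffer value -> xgamma ramp -> CRT -> light intensity,
 * assuming the ramp is out = in^(1/gamma) */
static double net_response(double v, double xgamma, double monitor_gamma)
{
    return pow(pow(v, 1.0 / xgamma), monitor_gamma);
}

int main(void)
{
    /* xgamma equal to the monitor gamma: the net response is linear,
     * so 0.5 in the framebuffer really is half intensity */
    printf("%.3f\n", net_response(0.5, 2.2, 2.2));       /* ~0.500 */

    /* a 2.5 monitor that should merely *look like* a 2.2 one: the
     * residual correction is 2.5/2.2, i.e. about 1.14, which is where
     * the "1.1" above comes from */
    printf("%.3f\n", net_response(0.5, 2.5 / 2.2, 2.5)); /* ~0.218 */
    return 0;
}

So setting 2.2 isn't ridiculous if what you want is a linear framebuffer; the
~1.1 setting only makes a 2.5 monitor behave like a 2.2 one.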

> > If you need to display an image that is in the sRGB color space on a
> > gamma corrected framebuffer you'd need to convert it to linear RGB
> > color space.
> > 
> > That's the way I understand it, please correct me if I'm wrong...
> 
>   Sounds reasonable, but I don't think you should assume that people set
> up their framebuffers to be linear.  That's just nonsensical.  All
> images from digital cameras, images on the web, etc., are almost always
> corrected for a gamma of about 2.2, with the assumption that only minor
> corrections will be done from then on.

Well, those images can easily be converted to linear RGB, but when rendering
into the framebuffer using OpenGL the result will only be correct if the
framebuffer holds linear color values...
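
For what it's worth, the conversion is cheap; something like this (the
standard sRGB decoding curve, applied per component before handing the image
to OpenGL) should do:

#include <math.h>

/* decode one 8-bit sRGB component to a linear intensity in [0,1]:
 * a short linear segment near black, a 2.4 power law above it */
static float srgb_to_linear(unsigned char c8)
{
    float c = c8 / 255.0f;
    return (c <= 0.04045f) ? c / 12.92f
                           : powf((c + 0.055f) / 1.055f, 2.4f);
}

Once the image data is linear, the arithmetic OpenGL does for blending and
smooth shading operates on actual light intensities, which is the whole point.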

Also, IIRC most workstations _do_ set up the framebuffer to be (nearly)
linear, i.e. with an effective gamma somewhere between 1 and 1.5 (instead
of 2.2).
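
And for antialiasing that makes a real difference: a pixel half covered by a
white primitive on black ends up with 0.5 in the framebuffer, and the monitor
emits roughly 0.5^gamma of full intensity. Just plugging in numbers:

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* displayed intensity of a half-covered white-on-black pixel
     * for a few effective framebuffer gammas */
    const double gammas[] = { 1.0, 1.5, 2.2 };
    int i;

    for (i = 0; i < 3; i++)
        printf("gamma %.1f -> intensity %.2f\n",
               gammas[i], pow(0.5, gammas[i]));
    /* prints about 0.50, 0.35 and 0.22: at 2.2 the edge pixels come
     * out far too dark, between 1 and 1.5 the error is much smaller */
    return 0;
}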

--ms