2D antialiased graphics using OpenGL
Billy Biggs
vektor@dumbterm.net
Thu, 11 Dec 2003 15:16:56 -0600
Martijn Sipkema (msipkema@sipkema-digital.com):
> > > But how can one do compositing when gamma is not 1? AFAIK graphics
> > > hardware expects a framebuffer with a gamma of 1, i.e. a smooth
> > > shading will not render gamma corrected in hardware, right?
> > >
> > > I'd rather have convenient, fast rendering and suffer slightly
> > > worse resolution in some color range. If 8 bit per channel is not
> > > enough, then the hardware should provide more. OpenGL rendering
> > > clearly expects a gamma == 1 framebuffer I think...
> >
> > What do you mean when you say 'a gamma of 1'? I don't think I
> > understand your post.
> >
> > I render pixels in my application for sRGB, which uses a gamma of
> > 2.2 and a funky power function as a model of modern LCD panels and
> > CRTs. This standard seems well accepted. How does this relate to
> > 'a gamma of 1'? Are you saying that when I give pixels to OpenGL,
> > that I should populate texture pixels as sRGB values, or that I
> > should reverse my sRGB transform?
> >
> > Like, consider the case of taking any standard PNG file and loading
> > it into a texture. What do you see happening ideally?
>
> From what I understand, values in the sRGB color space should be
> interpreted as voltages for a CRT monitor, which has a gamma of 2.2.
> Thus if you copy an sRGB image to the framebuffer it will only be
> displayed correctly if the graphics hardware interprets framebuffer
> RGB values as voltage. This is the default for XFree86 I think, but
> you can adjust the framebuffer 'gamma' when the hardware supports this
> using xgamma, and most hardware does.
OK, we are in agreement here, but you seem to take things pretty
literally. I mean, 'interpreted as voltages' seems a bit harsh, and
sRGB is used by LCD panels as well as CRTs. But fair enough.
> If your CRT monitor has a gamma of 2.2 then setting the framebuffer
> 'gamma' to 2.2 using xgamma will give correct output of the
> framebuffer if the pixels in it are to be interpreted as linear, i.e.
> describing the light intensity. This means that blending and smooth
> shading in OpenGL will work as expected.
Yes, that would be true, but nobody would use a gamma value of 2.2 with
xgamma; that would be ridiculous. xgamma is a simple way of correcting
for monitors so that they appear more like some common standard, though.
For example, if I determine that my monitor is more like 2.5 rather than
2.2, then I can use an xgamma value of 0.9 or 1.1, whichever is correct,
and things work out nicely.
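If I have the xgamma convention right (a gamma value g installs a ramp
of roughly v**(1/g), which the monitor's native response then raises to
its own power), the correction value for the example above can be
worked out like this:

```python
# Sketch of the xgamma correction arithmetic, assuming the convention
# that an xgamma value g installs a lookup ramp of v**(1/g), which the
# monitor's native power-law response is then applied on top of.

monitor_gamma = 2.5   # measured native response of this monitor
target_gamma = 2.2    # the common standard we want to emulate

# Effective response with correction g applied: v**(monitor_gamma / g).
# Solving monitor_gamma / g == target_gamma for g:
correction = monitor_gamma / target_gamma
print(f"xgamma value needed: {correction:.2f}")   # ~1.14
```

Under this convention, of the two guesses above it is 1.1 (not 0.9)
that brightens a too-dark gamma-2.5 monitor toward 2.2.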
> If you need to display an image that is in the sRGB color space on a
> gamma corrected framebuffer you'd need to convert it to linear RGB
> color space.
>
> That's the way I understand it, please correct me if I'm wrong...
Sounds reasonable, but I don't think you should assume that people set
up their framebuffers to be linear. That's just nonsensical. Images
from digital cameras, images on the web, etc., are almost always
corrected for a gamma of about 2.2, on the assumption that only minor
corrections will be applied from then on.
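For completeness, the exact sRGB transfer functions (per the sRGB
specification, including the small linear toe near black) look like
this, for anyone who does want to convert an image to linear values
before handing it to OpenGL:

```python
# The sRGB <-> linear conversions, per the sRGB specification.
# Each channel value is assumed normalized to [0, 1].

def srgb_to_linear(c):
    """Decode an sRGB-encoded channel to linear light intensity."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    """Encode a linear light intensity as an sRGB channel value."""
    if l <= 0.0031308:
        return l * 12.92
    return 1.055 * (l ** (1.0 / 2.4)) - 0.055

# A mid-grey sRGB value of 0.5 is only about 21% of full light intensity:
print(f"{srgb_to_linear(0.5):.3f}")
```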
-Billy