2D antialiased graphics using OpenGL
Martijn Sipkema
msipkema@sipkema-digital.com
Thu, 11 Dec 2003 23:49:28 +0100
> > > > I think windows doesn't use a corrected framebuffer normally, I'm
> > > > not sure though, but that doesn't mean we should not either...
> > >
> > > You're trying to politicize gamma correction?
> >
> > Well, yes! :)
>
> Good, glad that's out in the open. ;-)
>
> I mean, you make a reasonable argument. If framebuffers were linear
> intensity values rather than voltage values, a lot of things would be
> easier, including allowing 'naive' hardware implementations of
> algorithms such as compositing to be correct. All of this is true.
>
> But I don't buy the argument that OpenGL expects a linear framebuffer.
> When I specify a colour as an RGB triple in OpenGL, I am pretty sure
> that it expects gamma corrected values.
Actually, I don't think OpenGL really cares about gamma, but the way
blending and smooth shading are defined is only correct for a gamma
corrected framebuffer. It is _impossible_ to do rendering in a nonlinear
color space without a loss in performance, e.g. write-only algorithms
would need to become read-modify-write. It is IMHO better to just use a
higher resolution framebuffer.
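To make the point concrete, here is a small sketch (not from the original
post; gamma 2.2 is assumed as a typical CRT value) of a 50/50 blend of
black and white done the naive way on the stored gamma-encoded values,
versus correctly in linear light. Note that the correct version must
decode the value already in the framebuffer, i.e. a write becomes a
read-modify-write:

```python
GAMMA = 2.2  # assumed display gamma; typical CRT-era value

def to_linear(v):
    """Decode a gamma-encoded value in [0, 1] to linear intensity."""
    return v ** GAMMA

def to_gamma(v):
    """Encode a linear intensity in [0, 1] for the display."""
    return v ** (1.0 / GAMMA)

def blend_naive(a, b, t):
    # What fixed-function hardware does: interpolate the stored
    # (gamma-encoded) values directly.
    return (1 - t) * a + t * b

def blend_linear(a, b, t):
    # Correct: decode both operands, interpolate in linear light,
    # re-encode for the display. The read of `a` from the framebuffer
    # is what makes this read-modify-write.
    return to_gamma((1 - t) * to_linear(a) + t * to_linear(b))

naive = blend_naive(0.0, 1.0, 0.5)    # stores 0.5 -> ~0.22 linear: too dark
correct = blend_linear(0.0, 1.0, 0.5) # stores ~0.73 -> 0.5 linear
```

With a linear framebuffer the naive hardware blend and the correct blend
coincide, which is exactly why the write-only path stays valid there.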
> I will try and find some better
> references. I mean, the difference between rendering gradients and
> compositing in a non-gamma corrected space and doing the correct
> transform are visible and annoying, but not to the casual observer,
Most likely just as noticeable as the perceived loss in resolution from
having the framebuffer gamma corrected...
> and
> it's expensive to correct in hardware. Hence, we have the situation we
> are currently in, which is not as bad as you make it out to be.
Still, I don't see what is wrong with expecting the framebuffer to be
in linear color space. If 8 bits of color is not enough, then use hardware
with a higher resolution framebuffer; don't throw out all hardware
acceleration for slightly better perceived resolution.
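A quick sketch of why more bits address the objection (gamma 2.2 assumed,
as above): count how many of the 256 gamma-encoded display codes an
N-bit linear framebuffer can reach. The count grows with bit depth, so
a deeper linear buffer recovers the dark levels that 8 bits lose:

```python
GAMMA = 2.2  # assumed display gamma

def reachable_codes(bits):
    """Distinct 8-bit display codes reachable from an N-bit linear buffer."""
    top = (1 << bits) - 1
    return len({round((i / top) ** (1.0 / GAMMA) * 255)
                for i in range(top + 1)})

# 8-bit linear storage misses many dark codes; 12 and 16 bits
# progressively close the gap while keeping blending linear.
```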
--ms