[Bug 16001] XVideo gamma curve is wrong at least for r300 chips

bugzilla-daemon at freedesktop.org bugzilla-daemon at freedesktop.org
Tue May 20 04:44:20 PDT 2008


http://bugs.freedesktop.org/show_bug.cgi?id=16001

--- Comment #6 from Roland Scheidegger <sroland at tungstengraphics.com>  2008-05-20 04:44:19 PST ---
(In reply to comment #5)
> > I know there were some earlier complaints about gamma tables being wrong, but
> > IIRC no one really investigated this in much depth.
> 
> I never noticed the problem on my r200 and the r300 with my old TFT (6-bit S-IPS
> panel + dither), but on my new Samsung (8-bit S-PVA panel) it is much easier to
> spot.
How did you generate the test image to see this? I've got an r200 here (and the
panel should be 8-bit S-PVA too) and could look at it.
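
For reference, something like this simple grayscale ramp is probably what I'd
use to look at it here (just a sketch of one way to make such an image, not
necessarily what was used in comment #5):

#include <stdio.h>

/* Write a 1024x256 horizontal grayscale ramp as a binary PGM.  Playing this
 * back through Xv and comparing it against a non-Xv display of the same ramp
 * should make a wrong gamma curve (too dark/too bright midtones, banding)
 * fairly easy to spot. */
int main(void)
{
    const int w = 1024, h = 256;
    int x, y;
    FILE *f = fopen("ramp.pgm", "wb");

    if (!f)
        return 1;
    fprintf(f, "P5\n%d %d\n255\n", w, h);
    for (y = 0; y < h; y++)
        for (x = 0; x < w; x++)
            fputc(x * 255 / (w - 1), f);
    fclose(f);
    return 0;
}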

> > Maybe we should try to get rid of the tables entirely for
> > r200/r300 and just calculate all the segment values as needed, with arbitrary
> > gamma value.
> 
> Something like this?  (Still my debugging stuff, but works for me :))
Yes, that's what I had in mind (of course without the debug xvattr values...).
I'm not sure it's entirely correct, though - where does that divide by 1000 come
from? Shouldn't it be 1023.5 (the range of the values)? To get rid of the tables
completely, however, the OvGammaCont value would also need to be calculated - I
wouldn't know how, although it looks almost linear (with the maximum reached at
gamma 1.0 and the minimum at about gamma 2.1).
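
Just to illustrate what I mean by calculating the values (this is only a
sketch with made-up names, not the code from the attachment), a single curve
entry for an arbitrary gamma could be computed over the full 10-bit range
like this:

#include <math.h>
#include <stdint.h>

/* Compute one 10-bit gamma curve entry for an arbitrary gamma value,
 * normalizing over the 10-bit range (1023.5) instead of 1000.  Whether the
 * exponent has to be gamma or 1.0/gamma depends on which direction the
 * hardware applies the curve. */
static uint32_t
gamma_entry(uint32_t in, double gamma)
{
    double v = pow((double)in / 1023.5, 1.0 / gamma);
    uint32_t out = (uint32_t)(v * 1023.5 + 0.5);

    return out > 0x3ff ? 0x3ff : out;
}

The per-segment register values (offset/slope) could then be derived from
consecutive entries like this, assuming that's what the current table entries
encode - but as said, OvGammaCont would still need to come from somewhere.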


-- 
Configure bugmail: http://bugs.freedesktop.org/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are the assignee for the bug.
