[Intel-gfx] 8-bit vs. 10-bit palette mode, and LVDS dithering
Peter Clifton
pcjc2 at cam.ac.uk
Sat Apr 24 18:25:10 CEST 2010
Hi guys,
I noticed an anomaly in the register settings on my GM45: according to
PIPEBCONF, the pipe is in 10-bit palette mode, yet the KMS code programs
the palette registers as if it were in 8-bit mode, and never touches the
PIPEBGCMAX{RED,GREEN,BLUE} entries used for the final interpolation step.
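For reference, here is a user-space sketch of how I understand the two modes differ. The 10-bit behaviour is my reading of it (128 segment start points, with the GCMAX registers supplying the final endpoint, and linear interpolation within each segment); the segment width and fraction handling here are illustrative assumptions, not verified against the docs:

```c
#include <stdint.h>

/* 8-bit legacy mode: 256 direct entries, one per framebuffer value. */
static uint16_t lut8_lookup(const uint16_t lut[256], uint8_t in)
{
	return lut[in];
}

/*
 * 10-bit interpolated mode, as I understand it: 128 segment start
 * points plus a final GCMAX endpoint; the hardware interpolates
 * linearly within each segment.  Here an 8-bit input is assumed to
 * split into 128 two-value segments.
 */
static uint16_t lut10_lookup(const uint16_t seg[128], uint16_t gcmax,
			     uint8_t in)
{
	unsigned int i = in >> 1;	/* segment index, 0..127 */
	int lo = seg[i];
	int hi = (i == 127) ? gcmax : seg[i + 1];
	int frac = in & 1;		/* position within the segment */

	return (uint16_t)(lo + (hi - lo) * frac / 2);
}
```

With an identity ramp this gives the expected straight line, with the last segment interpolating up towards the GCMAX value rather than a 129th palette entry.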
I've played with it, and I *think* my screen looks nicer with the
palette reset to 8-bit mode. The KMS driver never normally touches this
register, and I think intel_crtc_load_lut in intel_display.c should
reset it.
Ideally, it would be nice to have higher-resolution gamma correction.
Would it work to call drm_mode_crtc_set_gamma_size(&intel_crtc->base, 129);
and feed that into the 10-bit linearly interpolated lookup table?
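The split I have in mind, again sketched in user space with register writes replaced by array stores (the 129th entry being the interpolation endpoint that would land in the GCMAX registers; this is my guess at the programming model, not something I've confirmed):

```c
#include <stdint.h>

#define GAMMA_SIZE 129	/* 128 segment start points + final endpoint */

/* Stand-ins for the real pipe registers. */
struct fake_pipe {
	uint16_t palette[128];	/* 10-bit interpolated palette entries */
	uint16_t gcmax;		/* PIPEBGCMAX-style final endpoint */
};

/*
 * Program a 129-entry gamma ramp: the first 128 entries go into the
 * interpolated palette, the last one into GCMAX.
 */
static void load_gamma_129(struct fake_pipe *pipe,
			   const uint16_t gamma[GAMMA_SIZE])
{
	for (int i = 0; i < 128; i++)
		pipe->palette[i] = gamma[i] & 0x3ff;	/* 10-bit entries */
	pipe->gcmax = gamma[128];	/* final interpolation endpoint */
}
```

If drm_mode_crtc_set_gamma_size accepts 129, the gamma_set hook could split the table like this rather than truncating to 256 8-bit entries.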
At what stage is the LVDS dithering applied here? Gamma correction
presumably affects whether the dither works well or not, and I'm
wondering whether I can tune the dithering process at all.
(I've got an HP laptop with a really bad colour profile, so it needs a
fair bit of correction to make colours appear properly. The hardware
hacker in me wonders if there is an EEPROM of gamma correction constants
on the panel which could be "fixed", but sadly little data is available
from the panel manufacturer.)
Regards,
--
Peter Clifton
Electrical Engineering Division,
Engineering Department,
University of Cambridge,
9, JJ Thomson Avenue,
Cambridge
CB3 0FA
Tel: +44 (0)7729 980173 - (No signal in the lab!)
Tel: +44 (0)1223 748328 - (Shared lab phone, ask for me)