[Intel-gfx] Fwd: State of 10 bits/channel?
Andrew Lutomirski
luto at mit.edu
Thu Jul 22 23:34:09 CEST 2010
[resend b/c I used the wrong from address last time]
I have a 10 bit/channel monitor (DisplayPort) which works quite nicely
in 8 bit mode. I saw some patches from Peter Clifton related to 10
bit support go in a while ago.
There are (at least) three modes that would be nice:
(1) 8 bit/channel framebuffer, 8 bit outputs, but a 10 bit LUT with dithering.
(2) 8 bit/channel framebuffer, 10 bit outputs and LUT.
(3) 10 bit/channel framebuffer, outputs, and LUT.
(1) would be nice with any hardware -- color calibration would look
better. (2) would be a good start for 10 bit displays -- I could
calibrate without banding and userspace would be none the wiser
(except for a different-looking gamma ramp). (3) would be really cool
and would differentiate us nicely from Windows, which AFAICT doesn't
really support 10 bit outputs on most (all?) hardware.
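For reference, this is roughly how a calibration client feeds a ramp
through XRandR today (just a sketch against libX11/libXrandr; the identity
ramp is only a placeholder for a real calibration curve). The point is that
the wire format is already 16 bits per entry, so a deeper hardware LUT
shouldn't need any client-side changes:

/* gamma-sketch.c -- push an identity gamma ramp to the first CRTC.
 * Illustrative only; a real tool would compute the curve from a profile.
 * Build: cc gamma-sketch.c -o gamma-sketch -lX11 -lXrandr
 */
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    XRRScreenResources *res;
    XRRCrtcGamma *gamma;
    RRCrtc crtc;
    int i, size;

    if (!dpy)
        return 1;

    res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    if (!res || res->ncrtc < 1)
        return 1;
    crtc = res->crtcs[0];

    /* Entries are always 16-bit on the wire; the driver quantizes
     * them down to however many LUT bits the hardware really has. */
    size = XRRGetCrtcGammaSize(dpy, crtc);
    if (size < 2)
        return 1;
    gamma = XRRAllocGamma(size);
    for (i = 0; i < size; i++) {
        unsigned short v = (unsigned short)((65535L * i) / (size - 1));
        gamma->red[i] = gamma->green[i] = gamma->blue[i] = v;
    }
    XRRSetCrtcGamma(dpy, crtc, gamma);

    XRRFreeGamma(gamma);
    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}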
What is the hardware capable of, and what is the state of affairs
right now? I'm running 2.6.35-rc4+ with a hacked up xf86-video-intel
with this patch:
diff --git a/src/intel_driver.c b/src/intel_driver.c
index 7761ccf..d0d1a37 100644
--- a/src/intel_driver.c
+++ b/src/intel_driver.c
@@ -570,6 +570,7 @@ static Bool I830PreInit(ScrnInfoPtr scrn, int flags)
case 15:
case 16:
case 24:
+ case 30:
break;
default:
xf86DrvMsg(scrn->scrnIndex, X_ERROR,
(Otherwise, the server won't start at all with DefaultDepth 30.)
With that patch and DefaultDepth 30, I get a mostly working system,
but there's no direct rendering (it seems to be disabled because "DRI is
disabled because it runs only at depths 16 and 24"), and title bars on
gnome-terminal draw incorrectly.
Do any of you know how to ask the system what depth the output is
configured at and what depth the framebuffer is configured at?
Currently, XRRGetCrtcGammaSize returns 256, which IIRC should be 129 if
10 bit gamma ramps are being used. (That's on both CRTCs, one of
which is the DP output connected to the 10 bit device.)
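For completeness, this is the sort of query I've been using to get those
numbers (a quick sketch; it only reports what the server claims, which may
or may not match what the hardware actually drives to the panel):

/* depth-sketch.c -- print the root depth and each CRTC's gamma ramp size.
 * Build: cc depth-sketch.c -o depth-sketch -lX11 -lXrandr
 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    XRRScreenResources *res;
    int i;

    if (!dpy)
        return 1;

    /* Depth of the default screen's root window, i.e. the framebuffer
     * depth as far as the server is concerned. */
    printf("root depth: %d\n", DefaultDepth(dpy, DefaultScreen(dpy)));

    res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    if (!res)
        return 1;
    for (i = 0; i < res->ncrtc; i++)
        printf("crtc 0x%lx: gamma size %d\n",
               (unsigned long)res->crtcs[i],
               XRRGetCrtcGammaSize(dpy, res->crtcs[i]));

    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}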
--Andy