[Intel-gfx] rfc: breaking old userspace gamma for 10-bit support
luto at mit.edu
Wed Apr 20 12:14:50 PDT 2011
On Wed, Apr 20, 2011 at 3:05 PM, Jesse Barnes <jbarnes at virtuousgeek.org> wrote:
> On Tue, 27 Jul 2010 11:03:56 -0400
> Adam Jackson <ajax at redhat.com> wrote:
>> On Fri, 2010-07-23 at 16:29 -0400, Andrew Lutomirski wrote:
>> > Does that include not breaking DirectColor? If we program the gamma
>> > ramp to 129 slots, old userspace submits 256 entries that are not
>> > monotonic, and we decimate the gamma ramp, we'll display the wrong
>> > thing. I have no idea if there are any programs *at all* that do
>> > that, though. (If they did, presumably they'd make the entire screen
>> > look rather odd.)
>> If I'm remembering this right, it's like this:
>> GM45 and earlier can only do 10bpc with a 10-bit, 129-stop interpolated
>> LUT. There is no way to support DirectColor with this, the hardware
>> assumes that each step is monotonically increasing and will do very
>> weird things if it's not. So the DDX driver needs to simply not set up
>> any DC visuals when run at 10bpc on this hardware. That's fine though:
>> DC is a pretty rarely-used feature, and it brings with it all the
>> colormap-flashing nightmares you remember from pseudocolor.
>> Ironlake and later can do 30bpc as either 10-bit 1024-stop, or 12-bit
>> 512-stop interpolated. For the 12-bit ramp you'd need to do the same as
>> for the GM45 10-bit ramp: DDX driver doesn't set up DC visuals.
>> Once you've done this, the DRM driver can unambiguously determine what
>> gamma ramp size to use based on what DDX passes down. At least in
>> current RANDR you only get to pick one ramp size. You could in
>> principle wire up an output property to change this, but I suggest that
>> you don't bother.
>> There are some places in the server where we assume 256 stops, and some
>> other places where we make the weaker assumption that all CRTCs have the
>> same ramp size.
>> Having done all that you'd need to go out to gnome-color-manager and
>> friends and make sure they don't assume they have 2^n stops of gamma for
>> an n bpc display.
> [Resurrecting this ancient thread now that 30bpp has been posted]
> I think we'll also want a kernel getparam flag to indicate whether the
> kernel can handle the larger ramp sizes, otherwise a new DDX and old
> kernel will silently break horribly.
> Current DDX always assumes a 256 entry ramp, and we'd want to preserve
> that for old kernels w/o support for other sizes. For new DDX and new
> kernel, we should be able to key the palette load based on the size
> passed in to the gamma_set callback. For some reason that callback
> includes a 'start' value, but fortunately the ioctl makes you load the
> whole thing every time, so the start value is always 0.
> Sounds like dealing with chipset variations will be a bit of a pain
> though, so we'll want per-chipset gamma_set functions at least.
> Andrew, do you have anything hacked together for this yet?
Nope. I gave up because I couldn't even get the mode to set. :)
One issue was that the RandR apis aren't really designed for cards
that can accept more than one gamma ramp size. Someone (I forget who)
suggested adding a display property to control it. It might be
possible to kill two birds with one stone by adding a property with two modes:
- Low depth: the logic you implemented, where the bit depth is set to
match the framebuffer when possible and the gamma ramp size is set
according to the framebuffer depth.
- High depth: the bit depth is set to the maximum that the encoder,
connector, and monitor support at the requested resolution, and the
gamma ramp size is set to whatever gives the highest precision in each
case.
Presumably, people would prefer the low depth mode if they care about
DirectColor (does anyone care about DirectColor?) or if they want to
save a little bit of power, and people would use high depth if they
plan to use the gamma ramp for calibration.
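For the kernel side, keying the palette load on the size passed to the
gamma_set callback, with per-chipset handlers as Jesse suggests, might look
roughly like the sketch below. Every name here is hypothetical, not the real
i915 code; it only shows the dispatch-on-size idea.

```c
#include <errno.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical per-chipset handlers; a real driver would program the
 * hardware LUT registers here. */
static int load_lut_8bit_256(const uint16_t *r, const uint16_t *g,
			     const uint16_t *b)
{
	return 0;
}

static int load_lut_10bit_129(const uint16_t *r, const uint16_t *g,
			      const uint16_t *b)
{
	return 0;
}

static int load_lut_10bit_1024(const uint16_t *r, const uint16_t *g,
			       const uint16_t *b)
{
	return 0;
}

/* Sketch of a gamma_set keyed on the ramp size userspace passed in.
 * Old userspace always sends 256 entries and keeps working; a new DDX
 * on newer hardware could send 129 or 1024 and get the matching LUT
 * mode.  Sizes the chipset can't represent are rejected. */
static int gamma_set(const uint16_t *r, const uint16_t *g,
		     const uint16_t *b, size_t size)
{
	switch (size) {
	case 256:
		return load_lut_8bit_256(r, g, b);
	case 129:
		return load_lut_10bit_129(r, g, b);
	case 1024:
		return load_lut_10bit_1024(r, g, b);
	default:
		return -EINVAL;
	}
}
```

The getparam flag Jesse mentions would then let a new DDX probe whether the
kernel accepts sizes other than 256 before trying to use them.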
> Jesse Barnes, Intel Open Source Technology Center