[PATCH xserver 1/5] modesetting: Don't call xf86HandleColormaps() at screen depth 30.

Michel Dänzer michel at daenzer.net
Thu Feb 8 14:55:53 UTC 2018


On 2018-02-08 12:14 PM, Mario Kleiner wrote:
> As it turns out, doing so will make any gamma table updates
> silently fail, because xf86HandleColormaps() hooks the
> .LoadPalette function to xf86RandR12LoadPalette() if the
> .gamma_set function is supported by the ddx, as it is in our
> case.
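
(For illustration, roughly the kind of ScreenInit-time call this refers to.
xf86HandleColormaps() and the CMAP_* flags are the real colormap-layer API,
but the numeric arguments and the callback name below are placeholders, not
copied from the modesetting driver:)

/* Sketch only: when the ddx also implements .gamma_set, the colormap layer
 * routes palette loads through xf86RandR12LoadPalette() instead of the
 * driver's own callback, as described above. */
if (!xf86HandleColormaps(pScreen, 256 /* maxColors */, 8 /* sigRGBbits */,
                         drmmode_load_palette /* placeholder name */,
                         NULL /* setOverscan */,
                         CMAP_PALETTED_TRUECOLOR | CMAP_RELOAD_ON_MODE_SWITCH))
    return FALSE;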
> 
> Once xf86RandR12LoadPalette() has been called during server
> startup, all palette and/or gamma table updates go through
> xf86RandR12CrtcComputeGamma(), which combines color palette
> updates with gamma table updates into a proper hardware LUT
> and uploads it to the hardware via the .gamma_set function/ioctl.
> The palette passed in has palette_red/green/blue_size == the
> size given by the visual's red/green/blueMask, which is 1024
> for a depth 30 screen.
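
(For illustration, a standalone sketch of where the 1024 comes from, using
the typical red masks of depth 24 and depth 30 visuals; my own simplification,
not server code:)

#include <stdio.h>

/* Count the set bits of a channel mask; the per-channel palette size is
 * 1 << bits.  A depth 30 visual has 10-bit R/G/B masks, so 1024 entries. */
static unsigned int mask_bits(unsigned long mask)
{
    unsigned int bits = 0;
    while (mask) {
        bits += mask & 1;
        mask >>= 1;
    }
    return bits;
}

int main(void)
{
    unsigned long depth24_red_mask = 0x00ff0000;  /*  8 bits ->  256 entries */
    unsigned long depth30_red_mask = 0x3ff00000;  /* 10 bits -> 1024 entries */

    printf("depth 24 palette size: %u\n", 1u << mask_bits(depth24_red_mask));
    printf("depth 30 palette size: %u\n", 1u << mask_bits(depth30_red_mask));
    return 0;
}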
> 
> That in turn is a problem: the size of the hardware LUT,
> crtc->gamma_size, is fixed at 256 slots on all KMS drivers
> when the legacy gamma_set ioctl is used, but
> xf86RandR12CrtcComputeGamma() can only handle palettes whose
> size is <= the hardware LUT size. As the palette size of 1024
> is greater than the hardware LUT size of 256, the code silently
> fails (gamma_slots == 0 in xf86RandR12CrtcComputeGamma()).
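
(The failure mode reduces to an integer division; a standalone sketch, with
the slot computation simplified from what xf86RandR12CrtcComputeGamma()
actually does:)

#include <stdio.h>

/* The RandR code works out how many HW LUT slots each palette entry may
 * occupy by dividing the HW LUT size by the palette size.  With the legacy
 * gamma_set ioctl the HW LUT has 256 slots, so a 1024-entry depth 30
 * palette yields 0 slots and the update is silently dropped. */
int main(void)
{
    unsigned int hw_lut_size = 256;  /* crtc->gamma_size with legacy gamma_set */

    printf("depth 24 palette (256):  gamma_slots = %u\n", hw_lut_size / 256);   /* 1 */
    printf("depth 30 palette (1024): gamma_slots = %u\n", hw_lut_size / 1024);  /* 0 */
    return 0;
}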
> 
> Skipping xf86HandleColormaps() on a depth > 24 screen disables
> color palette handling, but at least keeps gamma table updates
> via the xf86vidmode extension and RandR working.
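
(For reference, roughly the shape of change the quoted paragraph describes;
the guard and the wrapper name are my guess, not the literal patch hunk:)

/* Sketch only: skip colormap setup on depth > 24 screens so gamma updates
 * keep going straight to the driver's .gamma_set hook. */
if (pScrn->depth <= 24 && !drmmode_setup_colormap(pScreen, pScrn))
    return FALSE;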

Sort of... It means xf86VidMode and RandR call directly into the driver
to set their LUTs, with no coordination between them, so whichever calls
into the driver last wins and clobbers the HW LUT.

It would be better to fix xf86RandR12CrtcComputeGamma instead.
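
(One possible shape such a fix could take, sketched standalone and heavily
simplified: single channel, hypothetical helper name, not a claim about the
actual solution. The idea is to sample the palette down to the HW LUT size
instead of giving up when it is larger:)

#include <stdint.h>
#include <stddef.h>

/* Standalone sketch of handling palettes larger than the HW LUT by
 * downsampling, then pushing each sampled value through the CRTC gamma
 * ramp, mirroring the palette+gamma combination described above. */
static void
compute_gamma_downsample(uint16_t *hw_lut, size_t hw_lut_size,      /* e.g. 256  */
                         const uint16_t *palette, size_t pal_size,  /* e.g. 1024 */
                         const uint16_t *gamma_ramp, size_t ramp_size)
{
    for (size_t i = 0; i < hw_lut_size; i++) {
        /* Pick the palette entry that covers this HW LUT slot... */
        size_t pal_idx = i * pal_size / hw_lut_size;
        /* ...then map its 16-bit value to an index into the gamma ramp. */
        size_t ramp_idx = (size_t)palette[pal_idx] * (ramp_size - 1) / 65535;
        hw_lut[i] = gamma_ramp[ramp_idx];
    }
}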


-- 
Earthling Michel Dänzer               |               http://www.amd.com
Libre software enthusiast             |             Mesa and X developer