color calibration and xvideo (xv)

Hal V. Engel hvengel at astound.net
Wed Aug 22 11:09:03 PDT 2007


On Wednesday 22 August 2007 08:06:55 Alex Deucher wrote:
> There are several problems.  The first is that IIRC, Xv attributes can
> only be integers, and secondly, there were no standardized attributes.
> As such the implementations tend to vary by driver.  Most modern
> overlays have extensive configurability.  radeon has a fully
> adjustable linear transform engine for color space conversion and a
> multi-point gamma curve.  The problem is there's not really a good
> common way to expose the full capabilities as attributes.  Take a look at
> RADEONSetTransform() in radeon_video.c if you want to get an idea of
> how the hardware works.

It would appear that the ref parameter to RADEONSetTransform() is basically 
for setting a formula-based transfer curve.  But there are several issues as 
this relates to proper calibration of the display output.  First, the same 
formula is used for all channels.  When calibrating a display, all three 
display channels need individualized tone curves (either formula based or 
table based).  The reason is to get all three channels to be neutral (i.e. on 
the blackbody locus - most users will select 6500K) when R=G=B through as 
much of the tone curve as possible.  Second, the tables (there are only two 
of these) are hardcoded and cannot be changed by the user, so there is no 
way for display calibration software to alter these curves.

The gamma setting code comes a little closer, in that it appears to be table 
based with slope and offset pairs.  But again the gamma tables are hard 
coded, so user software cannot alter the values, and there is no way to set 
the gamma tables for individual color channels (i.e. there is only one table 
for all three channels).
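
To illustrate what per-channel control would buy us, here is a rough sketch 
(in Python, not driver code) of the kind of per-channel correction LUT a 
calibration tool would want to load.  The measured native gammas below are 
made-up example numbers; a real tool derives them from measurement data.

```python
# Sketch: building per-channel correction LUTs of the kind a calibration
# tool would load into the video card.  The per-channel native gammas are
# hypothetical example values, not measurements.
TARGET_GAMMA = 2.2
measured = {"R": 2.48, "G": 2.55, "B": 2.35}  # hypothetical native gammas

def correction_lut(native_gamma, target_gamma, size=256):
    """256-entry LUT such that the display's native response composed
    with the LUT yields the target gamma for that channel."""
    lut = []
    for i in range(size):
        x = i / (size - 1)
        # We want display(lut(x)) = x**target with display(v) = v**native,
        # so lut(x) = x**(target/native).
        y = x ** (target_gamma / native_gamma)
        lut.append(round(y * 65535))  # 16-bit entries, as X gamma ramps use
    return lut

luts = {ch: correction_lut(g, TARGET_GAMMA) for ch, g in measured.items()}
```

The point is simply that each channel ends up with a different table; a 
single shared table cannot make all three channels hit the target curve.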

In addition, the user_gamma parameter to RADEONSetTransform() is converted 
from (logical) values in the range of 0.85 to 2.5 into an index that runs 
from 0 to 7 (i.e. 8 gamma settings for a parameter that should be 
continuously variable over its valid range).  Calibration algorithms in 
software like LProf (and I think ArgyllCMS has similar goals) try to get the 
LUT-corrected measured device gamma to within 0.03 of the user specified 
gamma.  That is, if the user wants an actual gamma of 2.2 he/she should get 
something between 2.17 and 2.23 on all three channels after calibration.  
Usually this type of software will get to within +/- 0.01 of the desired 
gamma on all three channels.  A typical uncalibrated display will have the 
gamma of the three channels vary by 0.4 or more and will typically have an 
overall gamma around 2.5, which is too high.  The gamma code in the RADEON 
driver has gaps in its range of gamma values as large as 0.5, which is 
almost two orders of magnitude larger than the error level typically 
achieved when calibrating a display using software like LProf or ArgyllCMS.  
Also, is this like the gamma setting in X where 1.0 means don't change the 
default display gamma, or does it have some other meaning?
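
To put numbers on the quantization problem: I don't know the driver's actual 
8-entry table, so the sketch below assumes evenly spaced settings over the 
0.85 to 2.5 range, which is enough to show the scale of the problem.

```python
# Sketch of the quantization error from mapping a continuous gamma value
# onto 8 discrete settings.  Evenly spaced settings are an assumption for
# illustration; the real driver uses a hand-picked table.
GAMMA_MIN, GAMMA_MAX, STEPS = 0.85, 2.5, 8
settings = [GAMMA_MIN + i * (GAMMA_MAX - GAMMA_MIN) / (STEPS - 1)
            for i in range(STEPS)]

def nearest_setting(requested):
    """Map a requested gamma onto the closest of the 8 available settings."""
    return min(settings, key=lambda s: abs(s - requested))

# Worst-case error is half the spacing between adjacent settings.  With 8
# evenly spaced settings the spacing is ~0.236, so the worst case (~0.12)
# is roughly four times the 0.03 calibration tolerance discussed above.
worst = max(b - a for a, b in zip(settings, settings[1:])) / 2
```

Even under this generous evenly-spaced assumption, a request for 2.2 lands 
more than 0.03 away from the nearest available setting.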

Some of the replies to this thread indicated that this might at some point be 
changed (i.e. "Legacy XV overlays usually ignores LUT...").  If that is the 
case then great (are there any current examples of a non-legacy XV driver 
that uses the LUTs?).  If not, then there is a real problem with this.  
Users who spend money on calibration hardware and go to the trouble to 
keep their displays calibrated have a very reasonable expectation that 
software will not bypass their calibration, at least not without them 
explicitly telling it to do so.  Users with hardware-calibrated displays 
will become much more common on X server machines as we move forward, since 
software with the needed measurement instrument support is now available 
from at least two sources and the cost of the measurement hardware is now 
very reasonable.

Also, to clarify an apparent point of confusion: profiling and calibration 
are two different processes.  Calibration involves making adjustments to the 
device to bring it into a known or defined state (white point, white level, 
black point, black level, gamma...).  For a display those adjustments include 
adjusting the front panel or on-screen controls as well as creating a custom 
video card LUT.  Profiling is a characterization process that results in the 
creation of a file (an ICC profile in this case) containing a detailed 
description of the calibrated display's color and tonal characteristics.

Part of the reason for the confusion is that most display profiling software 
also performs display calibration, and it normally embeds the LUT 
calibration data in the profile for use by a LUT loader such as xcalib or 
dispwin.
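
For anyone unfamiliar with the loaders, this is roughly what xcalib or 
dispwin does conceptually: take the per-channel calibration curves stored in 
the profile (the vcgt tag) and resample them to the size of the video card's 
gamma ramp.  The vcgt data below is a made-up example, not parsed from a 
real profile.

```python
# Conceptual sketch of a LUT loader: resample per-channel vcgt curves
# onto the hardware gamma ramp size.  Example data is hypothetical.

def resample(curve, ramp_size):
    """Linearly interpolate a calibration curve onto the hardware ramp size."""
    n = len(curve)
    ramp = []
    for i in range(ramp_size):
        pos = i * (n - 1) / (ramp_size - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        ramp.append(round(curve[lo] * (1 - frac) + curve[hi] * frac))
    return ramp

# Hypothetical 16-entry vcgt curves (16-bit values), one per channel.
vcgt = {
    "R": [round(65535 * (i / 15) ** 1.05) for i in range(16)],
    "G": [round(65535 * (i / 15)) for i in range(16)],
    "B": [round(65535 * (i / 15) ** 0.95) for i in range(16)],
}
ramps = {ch: resample(curve, 256) for ch, curve in vcgt.items()}
# A real loader would now hand these ramps to the server (e.g. via
# XF86VidModeSetGammaRamp or the RandR equivalent).
```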

In the long run video playback should not bypass the display calibration, 
and it should also be using the correct display profile so that it is 
correctly transforming its images into the display's color space.  But again 
these are two separate issues, and for now I suspect that most users with 
calibrated displays would be happy if the display's calibration settings 
were respected.

Hal



