color calibration and xvideo (xv)

Alex Deucher alexdeucher at gmail.com
Wed Aug 22 12:02:00 PDT 2007


On 8/22/07, Hal V. Engel <hvengel at astound.net> wrote:
> On Wednesday 22 August 2007 08:06:55 Alex Deucher wrote:
> > There are several problems. The first is that IIRC, Xv attributes can
> > only be integers, and secondly, there were no standardized attributes.
> > As such, the implementations tend to vary by driver. Most modern
> > overlays have extensive configurability. radeon has a fully
> > adjustable linear transform engine for color space conversion and a
> > multi-point gamma curve. The problem is there's not really a good
> > common way to expose the full capabilities as attributes. Take a look
> > at RADEONSetTransform() in radeon_video.c if you want to get an idea
> > of how the hardware works.
>
> It would appear that the ref parameter to RADEONSetTransform() is basically
> for setting a formula-based transfer curve.  But there are several issues as
> this relates to proper calibration of the display output.  First, the same
> formula is used for all channels.  When calibrating a display, all three
> display channels need individualized tone curves (either formula based or
> table based).  The reason for this is to get all three channels to be
> neutral (i.e. on the blackbody locus - most users will select 6500K) when
> R=G=B through as much of the tone curve as possible.  Second, the tables
> (there are only two of these) are hardcoded and cannot be changed by the
> user.  So there is no way for display calibration software to alter these
> curves.
>
> The gamma setting code seems a little closer, in that it appears to be table
> based with slope and offset pairs.  But again, the gamma tables are hardcoded
> so user software cannot alter the values, and there is no way to set the
> gamma tables for individual color channels (i.e. there is only one table
> for all three channels).
>

Heh... patches welcome.  I'm not really much of an expert when it
comes to color calibration.  I'm not sure what the best method is for
calibrating overlays.  I think most (all?) vendors keep the graphics
plane and the overlay plane(s) separate so they can be adjusted
individually.  Plus the overlay can be sourced to either crtc (or
directly to an output in the radeon case) so you might have different
calibrations on each output.
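
To make the first point above concrete, here is a rough sketch of what the
Xv attribute interface looks like from the client side: every knob the
driver exposes is just a named integer with a min/max range, which is why a
per-channel table or curve does not map onto it cleanly.  This is only an
illustration: the port ID is assumed to come from the command line (xvinfo
will list the ports your driver exposes), and XV_BRIGHTNESS is just an
example attribute name; which attributes exist is entirely up to the driver.

/* attr.c -- list and set integer Xv port attributes (sketch).
 * Build with: cc attr.c -o attr -lXv -lXext -lX11
 */
#include <stdio.h>
#include <stdlib.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xvlib.h>

int main(int argc, char **argv)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy || argc < 2) {
        fprintf(stderr, "usage: attr <xv-port-id>\n");
        return 1;
    }
    XvPortID port = (XvPortID)strtoul(argv[1], NULL, 0);

    /* Every attribute is a named integer with a min/max range. */
    int num = 0;
    XvAttribute *attrs = XvQueryPortAttributes(dpy, port, &num);
    for (int i = 0; attrs && i < num; i++)
        printf("%-20s range [%d, %d]%s%s\n", attrs[i].name,
               attrs[i].min_value, attrs[i].max_value,
               (attrs[i].flags & XvGettable) ? " gettable" : "",
               (attrs[i].flags & XvSettable) ? " settable" : "");

    /* Setting one is a single integer write; no curves, no tables. */
    Atom brightness = XInternAtom(dpy, "XV_BRIGHTNESS", True);
    if (brightness != None)
        XvSetPortAttribute(dpy, port, brightness, 0);

    if (attrs)
        XFree(attrs);
    XCloseDisplay(dpy);
    return 0;
}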

> In addition, the user_gamma parameter to RADEONSetTransform() is converted
> from (logical) values that are in the range of 0.85 to 2.5 into an index that
> goes from 0 to 7 (i.e. 8 gamma settings for a parameter that should be
> continuously variable over its valid range).  Calibration algorithms in
> software like LProf (and I think ArgyllCMS has similar goals) try to get the
> LUT-corrected measured device gamma to be within 0.03 of the user-specified
> gamma.  That is, if the user wants an actual gamma of 2.2 he/she should get
> something between 2.17 and 2.23 on all three channels after calibration.
> Usually this type of software will get to within +/- 0.01 of the desired
> gamma on all three channels.  A typical uncalibrated display will have the
> gamma for the three channels vary by 0.4 or more and will typically have an
> overall gamma around 2.5, which is too high.  The gamma code in the RADEON
> driver has gaps in the range of gamma values as large as 0.5, which is almost
> two orders of magnitude larger than the error level typically achieved when
> calibrating a display using software like LProf or ArgyllCMS.  Also, is this
> like the gamma setting in X, where 1.0 means don't change the default
> display gamma, or does it have some other meaning?
>

Generally the Xv attributes for overlays are just reflections of the
hw and the algorithms needed to program the hw.  The defaults are
set at a general level that looks decent.
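
For reference, the per-channel correction that calibration software is after
is simple to state, even if the overlay hardware has no good place to put it.
The sketch below assumes an idealized power-law display response and one
measured gamma number per channel; real tools like LProf or ArgyllCMS build
these curves from instrument measurements rather than a single exponent, but
the shape of the interface is the same: one table per channel, not one shared
formula or an 8-step index.

#include <math.h>

/* Build one channel's correction curve: we want
 * (ramp(v))^measured == v^target, so ramp(v) = v^(target/measured).
 * Output is 16-bit, as the X gamma ramp interfaces expect.
 */
static void build_channel_ramp(double measured_gamma, double target_gamma,
                               unsigned short *ramp, int size)
{
    double e = target_gamma / measured_gamma;
    for (int i = 0; i < size; i++) {
        double v = (double)i / (double)(size - 1);
        ramp[i] = (unsigned short)(pow(v, e) * 65535.0 + 0.5);
    }
}

Each of R, G and B would get its own curve (say measured gammas of 2.6, 2.4
and 2.5 all corrected toward a target of 2.2), which is exactly what a single
shared table or a gamma index quantized to 8 steps cannot express.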

> Some of the replies to this thread indicated that this might at some point be
> changed (i.e. "Legacy XV overlays usually ignores LUT...").  If that is the
> case then great (are there any current examples of a non-legacy XV driver
> that uses the LUTs?).  If not, then there is a real problem with this.
> Users who spend money on calibration hardware and go to the trouble to
> keep their displays calibrated have a very reasonable expectation that
> software will not bypass their calibration, at least not without them
> explicitly telling the software to do so.  Users with hardware-calibrated
> displays will become much more common on X server machines as we move forward,
> since software with the needed measurement instrument support is now
> available from at least two sources and the cost of the measurement hardware
> is now very reasonable.
>

All older video hardware uses a separately controlled overlay at this
point for Xv.  Some of the newer hardware uses shaders or YUV textures
to implement Xv.  These will respect the crtc's LUT setting because
the data actually gets written to the main framebuffer and scanned
out along with the rest of the desktop.  Overlays are stored in a
separate framebuffer and the hw switches between the graphics plane
and the overlay during scanout.
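
If it helps, a client can see which paths a driver offers by listing its Xv
adaptors; the names are driver-specific, but drivers that implement a
textured/shader path typically expose it as a separate adaptor alongside the
overlay one.  A rough sketch:

/* adaptors.c -- list the Xv adaptors a driver exposes.
 * Build with: cc adaptors.c -o adaptors -lXv -lXext -lX11
 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xvlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    unsigned int nadaptors = 0;
    XvAdaptorInfo *ai = NULL;
    if (XvQueryAdaptors(dpy, DefaultRootWindow(dpy), &nadaptors, &ai) != Success)
        return 1;

    /* Which of these are overlay vs. textured is only apparent from the
     * driver-chosen name; the protocol itself does not distinguish them. */
    for (unsigned int i = 0; i < nadaptors; i++)
        printf("adaptor %u: \"%s\", %lu port(s), base port %lu\n",
               i, ai[i].name, ai[i].num_ports,
               (unsigned long)ai[i].base_id);

    XvFreeAdaptorInfo(ai);
    XCloseDisplay(dpy);
    return 0;
}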


> Also, to clarify an apparent point of confusion: profiling and calibration are
> two different processes.  Calibration involves making adjustments to the
> device to bring it into a known or defined state (white point, white level,
> black point, black level, gamma...).  For a display, those adjustments include
> adjusting the front panel or on-screen controls as well as creating a custom
> video card LUT.  Profiling is a characterization process that results in the
> creation of a file (an ICC profile in this case) that has a detailed
> description of the calibrated display's color and tonal characteristics.
>
> Part of the reason for the confusion is that most display profiling software
> also allows for display calibration, and the profiling software normally
> embeds the LUT calibration data in the profile for use by a LUT loader such
> as xcalib or dispwin.
>
> In the long run video playback should not bypass the display calibration, and
> it should also be using the correct display profile so that it is correctly
> transforming its images into the display's color space.  But again, these are
> two separate issues, and for now I suspect that most users with calibrated
> displays would be happy if the display's calibration settings were respected.

Once again, patches welcome.  While I can see the need for setting the
overlay to match the crtc LUT, I also think users would like to be
able to tweak the overlay separately (for example, if they are viewing
crappy source material that is too dim or too bright) without
affecting their whole desktop.  You guys who are working on the color
management stuff need to let us know what you need from the windowing
system.
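
For what it's worth, the piece that already works today is the screen LUT
itself.  Below is a rough sketch of what a LUT loader does once it has the
per-channel calibration curves; in a real tool such as xcalib or dispwin
those curves come from the vcgt tag of the ICC profile rather than being
synthesized from a power law as they are here, and the XF86VidMode gamma
ramp is per screen rather than per output, which is part of what would need
to improve for the multi-crtc cases mentioned above.

/* loadlut.c -- push per-channel calibration curves into the video LUT
 * (sketch).  Build with: cc loadlut.c -o loadlut -lXxf86vm -lXext -lX11 -lm
 */
#include <stdlib.h>
#include <math.h>
#include <X11/Xlib.h>
#include <X11/extensions/xf86vmode.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;
    int screen = DefaultScreen(dpy);

    int size = 0;
    if (!XF86VidModeGetGammaRampSize(dpy, screen, &size) || size <= 0)
        return 1;

    unsigned short *r = malloc(size * sizeof *r);
    unsigned short *g = malloc(size * sizeof *g);
    unsigned short *b = malloc(size * sizeof *b);
    if (!r || !g || !b)
        return 1;

    /* Hypothetical per-channel corrections: measured gammas of 2.6/2.4/2.5
     * corrected toward a target of 2.2.  A real loader would read the
     * curves from the display profile instead. */
    for (int i = 0; i < size; i++) {
        double v = (double)i / (size - 1);
        r[i] = (unsigned short)(pow(v, 2.2 / 2.6) * 65535.0 + 0.5);
        g[i] = (unsigned short)(pow(v, 2.2 / 2.4) * 65535.0 + 0.5);
        b[i] = (unsigned short)(pow(v, 2.2 / 2.5) * 65535.0 + 0.5);
    }

    /* This is the LUT that the textured paths respect and the overlay
     * currently bypasses. */
    XF86VidModeSetGammaRamp(dpy, screen, size, r, g, b);

    free(r); free(g); free(b);
    XCloseDisplay(dpy);
    return 0;
}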

Alex


