[Openicc] new version of xcalib
graeme at argyllcms.com
Fri Mar 4 14:39:14 EST 2005
Stefan Döhla wrote:
> Additionally, a Win32 version is available. It can be used as an
> alternative to commercial gamma loaders. You may place it in Autostart
> to load the vcgt-content to your video-card's LUT on start-up.
I was having a look around at various OS's, trying to see how each
copes with setting up RAMDAC values, from the point of view
of setting these curves, and being able to manipulate them to
do calibration. If anyone has any comments, corrections, additions
or further pointers to my summary, please add them.
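As background to Stefan's note about loading vcgt content: the vcgt tag is an Apple-originated tag that isn't in the base ICC spec, so the layout below is what tools like xcalib appear to implement, as far as I can tell, not something I can point to a public spec for. A minimal parser sketch for the table form:

```python
import struct

def parse_vcgt_table(blob):
    """Parse the table form of an ICC 'vcgt' tag.

    Assumed layout (as implemented by xcalib/Argyll, to the best of
    my understanding; vcgt is not part of the formal ICC spec):
      bytes 0-3   tag signature 'vcgt'
      bytes 4-7   reserved (zero)
      bytes 8-11  gamma type: 0 = table, 1 = formula
    then, for the table form:
      uint16 channel count, uint16 entries per channel,
      uint16 entry size in bytes (1 or 2),
      followed by channels * entries big-endian values.
    Returns a list of per-channel lists of integer LUT entries.
    """
    if blob[0:4] != b'vcgt':
        raise ValueError("not a vcgt tag")
    gamma_type, = struct.unpack_from('>I', blob, 8)
    if gamma_type != 0:
        raise ValueError("only the table form is handled here")
    channels, entries, size = struct.unpack_from('>HHH', blob, 12)
    fmt = '>%d%s' % (entries, 'H' if size == 2 else 'B')
    curves, offset = [], 18
    for _ in range(channels):
        curves.append(list(struct.unpack_from(fmt, blob, offset)))
        offset += entries * size
    return curves
```

A loader like xcalib would then feed these per-channel curves to whatever per-OS LUT interface is available (see the survey below).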
Monitor calibration interfaces on various OS's
Windows NT4 doesn't seem to have support for loading RAMDAC
values. Get/SetDeviceGammaRamp() don't do anything under NT4.
There seems to be no standard way around this, even if the underlying
video drivers support it, since you can't open the video driver
directly from user or even kernel space to send it an ioctl.
Somehow the Nvidia GeForce4 tools are able to communicate
with the NVidia drivers, since its user tools have a "Color correction"
pane that sets the Red, Green and Blue curves on my NT4 system. How it
does this is a bit of a mystery.
Presumably on Windows 2000 and XP, Get/SetDeviceGammaRamp() works.
These functions cope with up to 16 bit
table entries (allowing for 10 or more bit RAMDACs).
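The documented signature is SetDeviceGammaRamp(HDC, LPVOID), where the buffer is a WORD[3][256] array (256 16-bit entries each for R, G, B). A sketch of building such a table; the actual call at the end is only an illustration of the call shape, guarded so it is skipped off Windows:

```python
import ctypes
import sys

def gamma_ramp(gamma):
    """Build the WORD[3][256] table SetDeviceGammaRamp() expects:
    three consecutive blocks of 256 16-bit entries (R, then G, then B),
    each entry in the range 0-65535."""
    curve = [round(65535 * (i / 255.0) ** (1.0 / gamma)) for i in range(256)]
    return (ctypes.c_ushort * (3 * 256))(*(curve * 3))

# Identity curve: entry i works out to exactly i * 257 (65535 / 255 == 257).
ramp = gamma_ramp(1.0)

if sys.platform == 'win32':
    # Illustrative call only, not tested here.
    user32 = ctypes.windll.user32
    gdi32 = ctypes.windll.gdi32
    hdc = user32.GetDC(None)
    gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
    user32.ReleaseDC(None, hdc)
```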
For Longhorn they are promising something more integrated with the
new Windows Color Architecture (perhaps much like OSX).
OSX has a whole interface for dealing with this stuff (Quartz Display
Services), and automatically sets the RAMDACs from the selected screen profile.
Unfortunately the CGGet/SetDisplayTransferByTable() functions only seem
to handle 8 bit entries, meaning that the RAMDAC's capabilities can't be
fully utilized, unless the correction can be represented as a color
"formula". This seems like a problem for high quality color display.
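A little arithmetic makes the limitation concrete: a table whose entries are only 8 bits deep can select at most 256 of the output codes of a 10 bit RAMDAC, so three quarters of the available output levels are unreachable. An illustrative sketch:

```python
def reachable_codes(entry_bits, ramdac_bits):
    """Set of RAMDAC output codes that a LUT whose entries are
    entry_bits deep can actually select, on a RAMDAC with
    ramdac_bits of output resolution."""
    emax = (1 << entry_bits) - 1
    rmax = (1 << ramdac_bits) - 1
    return {round(v * rmax / emax) for v in range(emax + 1)}

# 8-bit table entries into a 10-bit RAMDAC: only 256 of the 1024
# possible output levels can ever be selected; 16-bit entries
# (as the Windows and XFree86 interfaces allow) reach all 1024.
eight_bit = reachable_codes(8, 10)
sixteen_bit = reachable_codes(16, 10)
```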
Under X11, in theory a Direct Color Visual lets you set the RAMDAC values
by setting the colormap values, and can handle up to 16 bits per component.
In practice this interface seems rather oriented to individual applications,
rather than setting global screen parameters, so the XVidModeExtension seems
a more appropriate path, and XF86VidModeSetGammaRamp() apparently handles
up to 16 bits per color component.
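XF86VidModeSetGammaRamp() takes three separate unsigned-short arrays, one per channel, whose length is whatever XF86VidModeGetGammaRampSize() reports for the screen (often 256, but larger on some hardware). A sketch of computing suitable ramps; actually issuing the call would need Xlib bindings, which are left out here:

```python
def vidmode_ramps(size, gamma=(1.0, 1.0, 1.0)):
    """Build the three unsigned-short arrays that
    XF86VidModeSetGammaRamp() takes; `size` is whatever
    XF86VidModeGetGammaRampSize() reports for the screen."""
    def channel(g):
        # Simple power-law correction curve, full 16-bit range.
        return [round(65535 * (i / (size - 1)) ** (1.0 / g))
                for i in range(size)]
    return tuple(channel(g) for g in gamma)

# e.g. a 1024-entry ramp correcting for a gamma of 2.2 on all channels:
red, green, blue = vidmode_ramps(1024, gamma=(2.2, 2.2, 2.2))
```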
It's not clear how RAMDAC tables work with a DVI interface, which
natively seems to be 8 bits per component only. Are the RAMDAC
tables still applied to the DVI digital output? Is this
video card dependent behaviour? Do displays with DVI input
have their own RAMDAC lookups? If so, how can they be loaded?
The DVI standard seems to imply that two DVI interfaces can be stacked
to output 16 bits per component, but I'd imagine that video cards
and LCD monitors that support this are rare. In terms of video
levels therefore, it would seem that better control will be had
using the VGA connector, rather than the DVI connector, assuming
the video card has 10 bit or more RAMDACs, and the OS allows setting
greater than 8 bit values, and the display itself has greater than
8 bit ADCs.
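A rough illustration of why the extra bits matter: pushing a gamma 2.2 correction curve through an 8 bit per component path merges some of the 256 input grey levels (visible as banding), while a 10 bit path keeps all 256 inputs distinguishable. A back-of-the-envelope sketch:

```python
def surviving_levels(out_bits, gamma=2.2):
    """Count of distinct output codes left after applying a 1/gamma
    correction to all 256 8-bit input levels, when the output path
    is out_bits deep."""
    omax = (1 << out_bits) - 1
    return len({round(omax * (i / 255.0) ** (1.0 / gamma))
                for i in range(256)})

eight = surviving_levels(8)   # fewer than 256 distinct greys remain
ten = surviving_levels(10)    # all 256 inputs stay distinct
```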
DDC often allows various monitor parameters to be adjusted,
such as video presets, video gain, hue, saturation, white point etc.,
all parameters that may need to be adjusted to bring a monitor to
a particular calibration. Access to such DDC functions for different
operating systems seems problematic. I've only had a superficial
look, but the impression I get is that there isn't much in the way
of user level access to DDC for any of the 3 main OS environments,
and even kernel level or driver access seems patchy.
Where ACCESS.bus fits into DDC seems a little confusing too.
I also haven't discovered any reference yet indicating that it's
possible to set RAMDAC values using DDC, in monitors that have their
own secondary lookup tables (particularly if they are being driven
by DVI). Since DDC specs. are not publicly available, and can only
be obtained at considerable cost, it's hard to know exactly how it
works, or what it is capable of.
Some monitors have an additional USB connection for similar functions
to DDC. This has been a favourite of LCD monitors connected to
Apple Mac machines. Some LCD monitors in particular (e.g. the Eizo
monitors, see <http://www.eizo.com/products/lcd/l887/index.asp>)
use this method to set up their 10-14 bit lookup tables.
Although there is a USB standard for HIDs that includes monitors,
there is no mention in it of a standard interface for setting
monitor lookup tables, implying that monitors like the Eizo
are using proprietary extensions or proprietary protocols.
The standard documented USB controls seem to be based on the set
described in the VESA DDC, with few obvious extensions.
Some monitors (for instance EIZO) seem to have proprietary control
cables that allow older OS's (such as NT4) to set up and calibrate
the monitor. The electrical interface and protocol are unknown
(maybe a serial cable?).