[Openicc] new version of xcalib
Stefan Döhla
color at doehla.de
Fri Mar 4 21:42:44 EST 2005
Hi Graeme and others,
> MSWindows:
> Windows NT4 doesn't seem to have support for loading RAMDAC
> values. Get/SetDeviceGammaRamp() don't do anything under NT4.
As far as I know, these functions were introduced with ICM2. Win95 and NT
shipped with ICM1, which lacked many features.
> There seems no standard way around this, even if the underlying
> video drivers support it, since you can't open the video driver
> directly from user or even kernel space to send it an ioctl.
> Somehow the Nvidia GeForce4 tools are able to communicate
> with the NVidia drivers, since its user tools have a "Color
> correction" pane that sets the Red, Green and Blue curves on my NT4
> system. How it does this is a bit of a mystery.
>
> Presumably on Windows 2000 and XP, Get/SetDeviceGammaRamp() works
> as advertised.
These functions must work with certified video card drivers. The
corresponding function for drivers is DrvIcmSetDeviceGammaRamp:
<http://msdn.microsoft.com/library/en-us/graphics/hh/graphics/ddifncs_4f81d949-51a0-4d83-b779-e9e950c2851d.xml.asp>
> The Get/SetDeviceGammaRamp() functions cope with up to 16 bit
> table entries (allowing for 10 or more bit RAMDACs).
But the tables are limited to 256 entries per channel - always!
>
> X11:
> In theory a Direct Color Visual lets you set the RAMDAC values by
> setting the colormap values, and handle up to 16 bits per component.
> In practice this interface seems rather oriented to individual
> applications, rather than setting global screen parameters, so the
> XVidModeExtension seems a more appropriate path, and
> XF86VidModeSetGammaRamp() apparently handles up to 16 bits per color
> component.
Compared to Win32, the size of the gamma ramp is not fixed but
driver-dependent. E.g. my old Toshiba Portege notebook has only 64
entries per channel. The user (in this case xcalib) must downsample to
this size. Upsampling would be possible as well (but I don't think such
devices really exist).
>
>
> Other unknowns:
>
> DVI:
>
> It's not clear how RAMDAC tables work with a DVI interface, which
> natively seem to be 8 bit per component only. Are the RAMDAC
> tables still applied to the DVI digital output ? Is this
> Video card dependent behaviour ? Do displays with DVI input
> have their own RAMDAC lookups ? If so, how can they be loaded ?
Since the RAMDAC lookup table works in the digital domain, I don't
expect any difference.
> The DVI standard seems to imply that two DVI interfaces can be
> stacked
> to output 16 bits per component, but I'd imagine that video cards
> and LCD monitors that support this are rare. In terms of video
> levels therefore, it would seem that better control will be had
> using the VGA connector, rather than the DVI connector, assuming
> the video card has 10 bit or more RAMDACs, and the OS allows setting
> greater than 8 bit values, and the display itself has greater than
> 8 bit ADCs.
I don't really know which axis is extended for 10 bits: the number of
bits per entry or the number of entries. So far, both are 8 bit in most
cases. Extending either would allow better level resolution - with
different advantages.
>
> DDC:
>
> DDC often allows various monitor parameters to be adjusted,
> such as video presets, video gain, hue, saturation, white point etc.,
> all parameters that may need to be adjusted to bring a monitor to
> a particular calibration. Access to such DDC function for different
> operating systems seems problematic. I've only had a superficial
> look, but the impression I get is that there isn't much in the way
> of user level access to DDC for any of the 3 main OS environments,
> and even kernel level or driver access seems patchy.
> Where ACCESS.bus fits into DDC seems a little confusing too.
I don't agree on this. NEC published DDC/CI, which is an add-on to the
DDC standard. Even for Linux we have rudimentary apps which support some
DDC/CI features (contrast and other unnecessary items).
>
> I also haven't discovered any reference yet indicating that it's
> possible to set RAMDAC values using DDC, in monitors that have their
> own secondary lookup tables (particularly if they are being driven
> by DVI). Since DDC specs. are not publicly available, and can only
> be obtained at considerable cost, it's hard to know exactly how it
> works, or what it is capable of.
DDC can be seen as I2C over VGA and DVI cables; an otherwise unused line
is used in both. On Linux one would need the i2c kernel modules loaded
and a userspace app that knows which memory addresses have to be written
for the display's LUTs. If someone has a NEC UX**80 and some electronic
equipment, he could possibly sniff this data. I also don't like that the
specs aren't published for free.
>
> USB:
> Some monitors have an additional USB connection for similar functions
> to DDC. This has been a favourite of LCD monitors connected to
> Apple Mac machines. Some LCD monitors in particular (e.g. the Eizo
> monitors, see <http://www.eizo.com/products/lcd/l887/index.asp>)
> use this method to set up their 10-14 bit lookup tables.
> Although there is a USB standard for HID's that includes monitors
> (see <http://www.usb.org/developers/devclass_docs/HID1_11.pdf>
> and <http://www.usb.org/developers/devclass_docs/usbmon10.pdf>),
> there is no mention in it of a standard interface for setting
> monitor lookup tables, implying that monitors like the EIZO
> are using a proprietary extension or proprietary protocols.
> The standard documented USB controls seem to be based on the set
> described in the VESA DDC, with few obvious extensions.
It's proprietary for sure - I wouldn't bother with them, because DDC/CI
is definitely the better way to do it.
Stefan