[OpenICC] monitor/videocard identification (fwd)

Kai-Uwe Behrmann ku.b at gmx.de
Tue Dec 7 09:34:14 EST 2004


As long as DVI sticks to 8 bit, a gamma table transported via DDC to
the LCD should produce better quality.
Nevertheless Xlib provides the XF86VidModeSetGammaRamp function, which
makes a basic grey-scale correction in the graphics card possible. I have
seen a dE of 1-2 with this approach. The price is increased quantisation.
The remaining matrix transformation must then be done by the CMM - but where?
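
A minimal sketch of that approach, assuming the XFree86-VidModeExtension
headers are available; the per-channel gamma values below are only
placeholders for what a calibration tool would really derive:

/* build: gcc setramp.c -o setramp -lX11 -lXxf86vm */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>
#include <X11/Xlib.h>
#include <X11/extensions/xf86vmode.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int screen, size = 0, i;
    unsigned short *red, *green, *blue;
    /* example per-channel gammas; a real tool would take them from
       measurements or from the monitor profile */
    double gamma[3] = { 1.1, 1.0, 0.95 };

    if (!dpy) return 1;
    screen = DefaultScreen(dpy);

    /* ask the driver how many entries the hardware LUT has */
    if (!XF86VidModeGetGammaRampSize(dpy, screen, &size) || size <= 0)
        return 1;

    red   = malloc(size * sizeof(unsigned short));
    green = malloc(size * sizeof(unsigned short));
    blue  = malloc(size * sizeof(unsigned short));
    if (!red || !green || !blue) return 1;

    for (i = 0; i < size; ++i) {
        double in = (double)i / (size - 1);
        red[i]   = (unsigned short)(65535.0 * pow(in, 1.0 / gamma[0]) + 0.5);
        green[i] = (unsigned short)(65535.0 * pow(in, 1.0 / gamma[1]) + 0.5);
        blue[i]  = (unsigned short)(65535.0 * pow(in, 1.0 / gamma[2]) + 0.5);
    }

    /* load the grey-scale correction into the card */
    XF86VidModeSetGammaRamp(dpy, screen, size, red, green, blue);

    free(red); free(green); free(blue);
    XCloseDisplay(dpy);
    return 0;
}

The ramps are 16 bit on the Xlib side, but an 8-bit path to the display
truncates them - this is the quantisation loss mentioned above.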

Of course a complete switch to sending an ICC monitor profile to X would
simplify colour handling for most applications. It should then be possible
to specify the input colour space for each window separately.
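One conceivable way to hand a profile to X is a property on the root
window, so applications and their CMMs can fetch it from the server they
are actually displaying on. The following is only a sketch; the atom name
_ICC_PROFILE and the use of a plain window property are assumptions here,
not an agreed convention:

/* build: gcc setprofile.c -o setprofile -lX11 */
#include <stdio.h>
#include <stdlib.h>
#include <X11/Xlib.h>
#include <X11/Xatom.h>

/* read the whole ICC profile file into memory */
static unsigned char *read_file(const char *name, long *len)
{
    FILE *f = fopen(name, "rb");
    unsigned char *buf;
    if (!f) return NULL;
    fseek(f, 0, SEEK_END);
    *len = ftell(f);
    fseek(f, 0, SEEK_SET);
    buf = malloc(*len);
    if (buf && fread(buf, 1, *len, f) != (size_t)*len) { free(buf); buf = NULL; }
    fclose(f);
    return buf;
}

int main(int argc, char **argv)
{
    Display *dpy = XOpenDisplay(NULL);
    long len = 0;
    unsigned char *icc;
    Atom prop;

    if (argc < 2 || !dpy) return 1;
    icc = read_file(argv[1], &len);
    if (!icc) return 1;

    /* the atom name is only an illustration; a shared convention
       would have to be agreed upon first */
    prop = XInternAtom(dpy, "_ICC_PROFILE", False);
    XChangeProperty(dpy, DefaultRootWindow(dpy), prop, XA_CARDINAL, 8,
                    PropModeReplace, icc, (int)len);

    XCloseDisplay(dpy);
    free(icc);
    return 0;
}

An application would read the property back with XGetWindowProperty and
pass the data to its CMM; per-window input colour spaces could use the
same mechanism with properties set on the application windows.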

With different backends working in parallel, it becomes more complex to
control colour appearance. From the viewpoint of a CMS, a common ground
level for colour correcting the different approaches (GL, X ...) is
desirable.

regards
Kai-Uwe Behrmann
                                + imaging development / panoramas
                                + color management
                                + email: ku.b at gmx.de
                                + CMS proposal <www.behrmann.name>


On 06.12.04, 14:02 -0600, Bob Friesenhahn wrote:

> On Sun, 5 Dec 2004, Sven Neumann wrote:
>
> > Hi,
> >
> > Kai-Uwe Behrmann <ku.b at gmx.de> writes:
> >
> >> I suspect that we will end up being stuck with associating a profile
> >> with a simple DISPLAY ID, with unqualified DISPLAY IDs being
> >> prefixed with the current host's IP address or hostname.
> >
> > A display may have multiple screens and each screen may have multiple
> > monitors. How would you deal with that?
>
> The only solution which is assured to work requires extension of the
> X11 APIs, X11 protocol (to support the APIs), and the frame buffer
> device driver interface.
>
> The application would tell X11 which profile describes its colorspace.
> The X11 server would tell the device driver which profile describes
> the colorspace of the data provided to it.  The frame buffer device
> driver would generate mapping tables (possibly loaded into the
> hardware for performance) to convert between the application's
> colorspace, and the colorspace of the attached display.  If the frame
> buffer device driver controls multiple display frame buffers, then it
> will need to manage a translation table for each frame buffer.
>
> Given that X11 servers do not yet support the necessary capabilities,
> there still needs to be a simple mode (likely tied to the DISPLAY
> identifier) to help X11 applications deal with existing servers.  In
> this case, the X11 application (or X11 support library) needs to
> perform the colorspace transformations, or use an existing X11 server
> extension to load tables to perform the colorspace transformations.
> This approach does not properly support virtualized multi-headed
> displays.
>
> The assumption is always that there is one display per frame buffer.
> If there is more than one display per frame buffer, then the displays
> need to be internally calibrated to match each other.
>
> Given that the world is transitioning from analog to digital (e.g.
> DVI) connection to displays, it should be expected that at some point
> all of the translation related to the display will move to within the
> display itself in order to achieve the best color resolution.  Modern
> LCD displays already contain translation tables (typically 10 bit) in
> order to emulate CRT gamma and support user-adjustments, but they
> can't usually be updated from the host computer.
>
> Bob
> ======================================
> Bob Friesenhahn
> bfriesen at simple.dallas.tx.us
> http://www.simplesystems.org/users/bfriesen



