[Openicc] [Gimp-print-devel] Drop size calibration

Kai-Uwe Behrmann ku.b at gmx.de
Mon Jan 28 23:20:37 PST 2008


Am 27.01.08, 22:29 -0500 schrieb Robert Krawitz:

>    Date: Sun, 27 Jan 2008 19:18:40 -0800
>    From: Michael Sweet <mike at easysw.com>
> 
>    Robert Krawitz wrote:
>    > ...
>    > That's assuming that your driver is perfectly linear, and that there's
>    > no precision loss along the way.  In practice, you want some excess
>    > precision to make up for those losses.
> 
>    Right, but those losses typically happen in the color management/
>    conversion code, not the dither code which is reducing the input
>    to 1 or 2 bits.
> 
> Sure, but if there are linearization problems in the driver that have
> to be corrected in external color management, you can lose some of
> that precision.
 
I have seen banding in 8-bit RGB grey-scale gradients printed on a 
colour-managed E2100. The bands disappeared, within the limits of 
Gutenprint's dithering, when printing with 16-bit data. I would expect a 
printout from a more recent device (2400) to give a better idea of 
whether more is needed. The question is also where to draw the line of 
sufficient precision, if anywhere.
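A minimal sketch of why the banding appears, assuming (hypothetically) that the linearization step is a plain gamma-2.2 curve: counting how many distinct output levels survive the curve at 8-bit versus 16-bit precision shows the loss directly.

```python
def levels_after_gamma(bits, gamma=2.2):
    """Apply a gamma curve at the given integer bit depth and count
    how many distinct quantized output levels remain."""
    maxval = (1 << bits) - 1
    out = {round(((v / maxval) ** gamma) * maxval) for v in range(maxval + 1)}
    return len(out)

# At 8 bits, well under 256 distinct levels survive the curve,
# which shows up as visible banding in smooth gradients.
print(levels_after_gamma(8))

# At 16 bits, tens of thousands of levels survive, far more than
# an 8-bit display or print path can distinguish.
print(levels_after_gamma(16))
```

The real driver pipeline applies measured linearization curves rather than a fixed gamma, but the quantization effect is the same in kind.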

Some say 8-bit is theoretically enough (because of the magic 1 dE myth).
Some say 16-bit; that would include me.
Who else says 32-bit (probably as a reference implementation)?

Given that input devices with 16 and more bits already exist, the source 
material should already be in place.


kind regards
Kai-Uwe Behrmann
--
developing for colour management 
www.behrmann.name + www.oyranos.org
