[Openicc] [Gimp-print-devel] Drop size calibration

Robert Krawitz rlk at alum.mit.edu
Sun Jan 27 18:30:36 PST 2008


   Date: Sun, 27 Jan 2008 18:24:24 -0800
   From: Michael Sweet <mike at easysw.com>

   Robert Krawitz wrote:
   > ...
   > Another question: in the long run, do you think 16 bits of input
   > precision are sufficient, or should we be moving to 31 or 32 bits?
   > We have a lot of 16-bit assumptions in the data path, and if we should
   > be moving to higher bit depths, it's something we'd need to look at
   > closely.

   Mathematically you need a patch of at least 128x128 dots (for 3 drop
   sizes) to represent 16-bits worth of levels and 32768x32768 dots (or
   about 11x11 inches at 2880 DPI!) for 32-bits worth.  In practice you
   need more than that thanks to dot gain effects, making 32-bit
   accuracy impractical.  Even 16-bit accuracy is only achievable with
   lower image resolutions (about 22.5 PPI for a 2880 DPI printer)...
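
The arithmetic behind those patch sizes is easy to sketch.  Below is a
rough worked example (mine, not Mike's); it assumes each dot is either
blank or one of the three drop sizes and counts the levels in an n x n
patch as simply (drops + 1) * n * n, ignoring dot gain.  That crude
counting reproduces the 128x128, 32768x32768, and 22.5 PPI figures
quoted above:

/*
 * patch_side: minimum patch side (in dots) needed for 2^bits levels,
 * under the crude linear counting described above.  Compile with -lm.
 */
#include <math.h>
#include <stdio.h>

static double
patch_side(int bits, int drop_sizes)
{
  double levels = ldexp(1.0, bits);        /* 2^bits target levels   */
  return sqrt(levels / (drop_sizes + 1));  /* dots per side of patch */
}

int
main(void)
{
  const int dpi = 2880;
  int bits;

  for (bits = 8; bits <= 32; bits += 8)
    {
      double side = patch_side(bits, 3);
      printf("%2d bits: %6.0f x %-6.0f dots (%7.2f PPI at %d DPI)\n",
             bits, side, side, dpi / side, dpi);
    }
  return 0;
}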

That's assuming that your driver is perfectly linear, and that there's
no precision loss along the way.  In practice, you want some excess
precision to make up for those losses.

8 bits of perfectly linear precision might be OK for the final output,
but the intermediate processing inside the driver needs more.
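
As a toy illustration (my own, not anything from Mike's message), here
is what an 8-bit versus a 16-bit intermediate stage does to a simple
gamma round trip; the gamma-2.2 curve is just a stand-in for whatever
chain of corrections a real driver applies:

/*
 * count_distinct: how many of the 256 8-bit input levels remain
 * distinct after a forward/inverse gamma round trip whose intermediate
 * result is quantized to `bits` bits.  With an 8-bit intermediate many
 * of the dark levels collapse together; with a 16-bit intermediate
 * almost all 256 survive.  Compile with -lm.
 */
#include <math.h>
#include <stdio.h>

static int
count_distinct(int bits, double gamma_val)
{
  int levels = 1 << bits;
  int distinct = 0;
  int last = -1;
  int i;

  for (i = 0; i < 256; i++)
    {
      double x = i / 255.0;
      /* linearize, quantize to `bits`, then re-encode to 8 bits */
      int mid = (int) lround(pow(x, gamma_val) * (levels - 1));
      int out = (int) lround(pow((double) mid / (levels - 1),
                                 1.0 / gamma_val) * 255.0);
      if (out != last)   /* outputs are non-decreasing, so changes count */
        distinct++;
      last = out;
    }
  return distinct;
}

int
main(void)
{
  printf(" 8-bit intermediate: %d of 256 levels survive\n",
         count_distinct(8, 2.2));
  printf("16-bit intermediate: %d of 256 levels survive\n",
         count_distinct(16, 2.2));
  return 0;
}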

