[Openicc] New options on the mainline

Graeme Gill graeme at argyllcms.com
Tue Jan 22 10:00:15 PST 2008


edmund ronald wrote:
> On Jan 21, 2008 4:31 AM, Graeme Gill <graeme at argyllcms.com> wrote:
> 
>> My current favourite idea for linearizing each channel would
>> be to measure the L*a*b* for a step wedge test chart, and then
>> linearize with respect to the distance along the response locus.
>> This evens out the delta E change for a change in the channel input.
>> Such an approach may be a little hard to compute, and makes it
>> hard to clip non-monotonicity, so a slight compromise
>> would be to fit a straight line to the response points (least
>> squares), and then measure the delta E projected onto that
>> line.
> I'm not quite sure why one shouldn't just work with densities here,
> leaving the colorimetric issues for the profile engine ? 

There's no doubt that density readings can serve as very good
process control measurements (which is essentially what calibration
is all about), but there are practical reasons why an open
system may be better off with colorimetric readings.

One is that density readings are process (ie. colorant
formulation) specific. That's why there are all those different
Status standards. So it's not easy to adapt a specific density
set to arbitrary N-color ink sets, nor to a particular
manufacturer's ink formulation.

A second issue is that getting density readings requires either
a dedicated densitometer or a spectrometer. Cheaper colorimeter
instruments can't be used for density measurements, yet such
instruments would be far more desirable at the low end, where
a user might want to both calibrate and profile.

A third issue is signal to noise ratio of the readings. Density
measurements are narrow band measurements, which means that
only a small portion of the wavelengths available are used
for creating a reading. This is throwing away information,
and makes the readings less repeatable under the same circumstances.
[I would guess that density measurements are formulated this way
  to facilitate reading ink density values from overprints, where
  the maximum separation of the different ink densities is required,
  as well as trying to make the reading as insensitive to slight
  hue variation in the inks as possible.]
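To illustrate the narrow-band point, a band density might be computed
from a measured spectrum roughly as below. This is a hypothetical
simplification, not any actual Status weighting — the real Status A/T/E
densities use specific published spectral weighting functions:

```python
# Illustrative narrow-band density: -log10 of the mean reflectance
# inside a single band. Real Status densities use published spectral
# weightings; the flat band here is a hypothetical simplification.
import numpy as np

def band_density(wavelengths, reflectance, centre, width):
    """Density computed from only the wavelengths within the band."""
    mask = np.abs(wavelengths - centre) <= width / 2.0
    return -np.log10(reflectance[mask].mean())
```

Only the samples inside the band contribute, so measurement noise is
averaged over far fewer readings than a full-spectrum colorimetric
integration would use — which is the repeatability cost mentioned above.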

A fourth reason is that (I think) the most desirable result
is one where an even change in the device channel value results
in an equal change in perceived color. This really requires
perceptually uniform color values, and L*a*b* is a step in this direction.
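The least-squares variant of this idea (quoted at the top of the
message) might be sketched as follows. The function name and data are
hypothetical, and this is not ArgyllCMS code — just an illustration of
projecting wedge readings onto a fitted line and linearizing by
distance along it:

```python
# Sketch: fit a least-squares line through the L*a*b* readings of a
# step wedge, project each reading onto it, and build a per-channel
# correction curve that evens out the delta E spacing. Hypothetical.
import numpy as np

def linearize_channel(inputs, lab):
    """inputs: (N,) device values in 0..1, lab: (N,3) measured L*a*b*.
    Returns corrected device values giving even perceptual spacing."""
    lab = np.asarray(lab, dtype=float)
    # Least-squares line through the responses: mean + principal direction.
    centre = lab.mean(axis=0)
    _, _, vt = np.linalg.svd(lab - centre)
    direction = vt[0]                     # first right singular vector
    # Signed distance of each reading along the fitted line.
    t = (lab - centre) @ direction
    if t[-1] < t[0]:                      # orient from paper white to full ink
        t = -t
    t = np.maximum.accumulate(t)          # crude clip of non-monotonicity
    pos = (t - t[0]) / (t[-1] - t[0])     # normalised 0..1 position (delta E)
    # Invert: which device value lands at each evenly spaced position?
    return np.interp(inputs, pos, inputs)
```

Feeding the returned curve in as the per-channel lookup then makes an
even step in channel input correspond to an even step along the fitted
response line.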

> Define a set
> of target densities for the steps, measure the device with ink
> limitation already applied,  and see how the desired values can be
> achieved by inverting the measured density curves, display the
> measurements and the suggested corrections to the user and allow him
> to edit if desired ?
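For completeness, the density-based workflow described here — hitting a
set of target densities by inverting the measured curve — could be
sketched as below. The names and data are hypothetical; real RIP
calibration tools differ in detail:

```python
# Sketch: invert a measured density-vs-input curve to find the device
# inputs that should produce a set of target densities. Hypothetical.
import numpy as np

def invert_density_curve(inputs, measured_density, target_density):
    """Interpolate the inverse of the measured curve at the targets."""
    d = np.maximum.accumulate(measured_density)   # enforce monotonicity
    return np.interp(target_density, d, inputs)
```

The resulting inputs (and the measured curve) are what one would then
display to the user for optional hand editing.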

Some degree of flexibility may be desirable for high end users,
but it really shouldn't be required most of the time. Remember
that the calibration should have no direct effect on the
end result, since it's within the closed loop of the profiling
system. It will have secondary effects on the quality (ie.
accuracy and smoothness) of the resulting profiles, but not
on the color "look" that is being targeted. The latter is
primarily determined by the input colorspace profile and the
gamut mapping.

> At this point we're really working in ink-space
> more than in color space. Sorry if this approach is very primitive,
> but I think this is what existing RIP engines do.

I think I've got a reasonable handle on what commercial
RIP engines do :-)

Graeme Gill.
