Monitor profiling (Re: [RFC wayland-protocols] Color management protocol)
Pekka Paalanen
ppaalanen at gmail.com
Fri Jan 13 14:17:53 UTC 2017
Hi,
there is some controversy about whether this belongs in Wayland or not,
but if we assume that it does...
On Thu, 5 Jan 2017 12:40:08 +1100
Graeme Gill <graeme2 at argyllcms.com> wrote:
> Pekka Paalanen wrote:
>
> > Designing that is trivial:
>
> I'm not so sure.
>
> > GLOBAL cms_calibrator
> > - request: create_calibration_surface(wl_surface, new cms_calibration_surface)
> > # Assigns wl_surface role.
> >
> > INTERFACE cms_calibration_surface
> > # Surfaces with this role will only be shown on the set output,
> > # with direct color path bypassing all color-management, and
> > # the hardware has been reset to neutral/identity settings.
> > # (or whatever requirements are appropriate, you can decide
> > # what to write here)
>
> Why does this have to be made a special case? The normal
> machinery used to manage color is capable of
> configuring things to be in a proper state for calibration
> and profiling (if this were not the case, then it would not
> truly be able to do the color management!)
So you say, but then you continue...
> Due to the different bit depths of the VideoLUT entries and the
> frame buffer (classically a 10-bit VideoLUT entry depth in an
> 8-bit frame buffer), it is expected to be possible to set the
> VideoLUT value for the entry that corresponds to the values set
> in the frame buffer, so that the test patch values can have the
> same precision as the resulting VideoLUT entries that get created
> from them.
...which is actually a very important detail. In other words, the
normal pixel path cannot be used for calibration, because it won't
usually have enough precision: the VideoLUT output has more bits than
the buffer pixels have.
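To make the precision gap concrete, here is a small standalone
illustration (plain C; the numbers are just the classic case from
above, and nothing here is a real API):

#include <stdint.h>
#include <stdio.h>

/* The classic case: an 8-bit frame buffer feeding a 256-entry
 * VideoLUT whose entries are 10 bits deep. A buffer pixel only
 * selects WHICH entry is used; the entry itself holds the
 * full-precision value sent towards the monitor. */
enum { FB_LEVELS = 256, LUT_MAX = 1023 };

int main(void)
{
    uint16_t lut[FB_LEVELS];
    int i;

    /* Identity ramp: entry i holds round(i * 1023 / 255). */
    for (i = 0; i < FB_LEVELS; i++)
        lut[i] = (uint16_t)((i * LUT_MAX + (FB_LEVELS - 1) / 2) /
                            (FB_LEVELS - 1));

    /* Through the normal pixel path only these 256 of the 1024
     * possible 10-bit outputs are reachable. */
    printf("reachable outputs via pixels: %d of %d\n",
           FB_LEVELS, LUT_MAX + 1);

    /* To emit any other 10-bit value, the calibration tool must
     * overwrite a LUT entry and then draw the pixel value that
     * selects it; e.g. the identity entry 128 holds 513, so make
     * pixel value 128 emit exactly 514 instead: */
    lut[128] = 514;

    return 0;
}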
So this is why you keep insisting that applications need to have access
to the VideoLUT. Finally.
However, controlling the output values does not imply access to the
VideoLUT - it's just the only way you have had so far.
If I understand right, the calibrating or monitor profiling process
(are these the same thing?) needs to control the "raw" pixel values
going through the encoder/connector (DRM terminology), hence you need
access to the /last/ VideoLUT in the pipeline before the monitor. Right?
Or not even a VideoLUT per se; you just want to control the values to
the full precision the hardware has.
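For concreteness: in DRM KMS terms, that last LUT is the CRTC gamma
table, which the compositor (not a client) programs via libdrm. A
minimal sketch, assuming the compositor already has the DRM fd and
the crtc_id for the output in question:

#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* Program the CRTC gamma table (the "VideoLUT") with
 * 16-bit-per-channel entries; the hardware uses however many bits
 * it really has. This is the legacy KMS entry point; newer kernels
 * expose the same table as the GAMMA_LUT CRTC property. */
static int set_video_lut(int fd, uint32_t crtc_id, uint32_t size,
                         uint16_t *red, uint16_t *green, uint16_t *blue)
{
    return drmModeCrtcSetGamma(fd, crtc_id, size, red, green, blue);
}

The point being: this knob lives with the compositor, which is
exactly why an application cannot reach it directly on Wayland.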
How does the profiling work? I mean, what kind of patterns do you show
on the monitor? All pixels always a uniform value? Or just some varying
areas? Individual pixels? Patterns that are not of uniform color?
If it were enough to just light up all pixels of a monitor with one
specific color value at a time, we could pretty easily define a
calibration protocol where, instead of using buffers and surfaces, you
would just tell the compositor which values to emit to the monitor.
Then the compositor, which is in charge of the hardware pipeline, can
do the right thing. We could encode the values in e.g. 32 bits per
channel or whatever you like, and there could be a provision for the
compositor to report the actual number of bits used.
Plus all the needed guarantees of non-interference like we discussed in
the other email, and an ack from the compositor when the new value has
actually reached the monitor.
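To sketch it in the same notation as before (every name here is
hypothetical, just to make the idea above concrete):

GLOBAL cms_direct_output
- request: acquire(wl_output, new cms_direct_session)
  # Takes exclusive control of the raw values going to the output;
  # fails if another client already holds it.

INTERFACE cms_direct_session
- event: precision(bits_per_channel)
  # Compositor reports how many bits actually reach the monitor.
- request: set_color(red, green, blue)
  # One value per channel, encoded in 32 bits; the compositor
  # converts to the hardware precision and lights up every pixel
  # of the output with it.
- event: applied
  # The value has actually reached the monitor.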
I would argue that it is much easier to make the above work reliably
than to craft a buffer of pixels filled with certain values, tell the
compositor to program the hardware to (not) mangle the values in a
certain way, and assume the output is what you wanted. The application
would not even know what manipulation stages the compositor and the
hardware might have for the pixels, so you would still need a protocol
to say "I want everything to be identity except for the last LUT in
the pipeline". IMO that is a hell of a hard way of saying "output this
value to the monitor".
> And let me raise a fundamental point about profiling here
> (not to be confused with calibration). Profiling the display will not
> work if the color values of the pixels sent to the display are
> different during profiling from what they are for normal application
> display.
Right.
(What is the difference between calibrating and profiling?)
In the scheme above, there would indeed be very different paths for
profiling vs. normal usage. But I do think that is how it has to be;
they will always be different: normal usage will not have the
opportunity to change the VideoLUT at will.
You can still ensure the compositor works correctly. After you have
profiled the monitor and configured the compositor to use the new
profiles, you can use the normal usage path to show a test image and
verify that the colorimeter agrees.
I think one would want to do the verification step anyway, and with
various different content color... um, definitions(?) to see that the
compositor does indeed work correctly for more than one case.
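If it helps, the verification loop I have in mind looks roughly like
this (a sketch only: measure_patch and predict_patch are hypothetical
stand-ins for the instrument and profile code, and the plain XYZ
distance is a crude placeholder for a proper color difference metric
such as CIEDE2000):

#include <math.h>
#include <stdio.h>

struct xyz { double X, Y, Z; };

static struct xyz measure_patch(double r, double g, double b)
{
    /* Stand-in for: commit an r,g,b patch on a plain wl_surface
     * through the NORMAL pixel path, wait for it to hit the
     * screen, then read the colorimeter. Placeholder numbers. */
    (void)r; (void)g; (void)b;
    return (struct xyz){ 41.2, 21.3, 1.9 };
}

static struct xyz predict_patch(double r, double g, double b)
{
    /* Stand-in for evaluating the newly installed monitor
     * profile. Placeholder numbers. */
    (void)r; (void)g; (void)b;
    return (struct xyz){ 41.2, 21.3, 1.9 };
}

int main(void)
{
    struct xyz m = measure_patch(1.0, 0.0, 0.0);
    struct xyz p = predict_patch(1.0, 0.0, 0.0);

    double d = sqrt((m.X - p.X) * (m.X - p.X) +
                    (m.Y - p.Y) * (m.Y - p.Y) +
                    (m.Z - p.Z) * (m.Z - p.Z));

    printf("difference: %f -> %s\n", d, d < 1.0 ? "OK" : "FAIL");
    return 0;
}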
I recall demands from earlier that there must be a "pass-through mode"
for pixels so that calibration apps can work. I think the design
described above provides that even better. "Pass-through mode" by
definition is a path different from the normal usage, too.
If you would agree to all this, then normal usage and profiling would
really be separate things and could be designed independently and to
the point.
Thanks,
pq