[RFC wayland-protocols v2 1/1] Add the color-management protocol

Chris Murphy lists at colorremedies.com
Fri Mar 1 04:05:13 UTC 2019


On Thu, Feb 28, 2019 at 4:37 AM Pekka Paalanen <ppaalanen at gmail.com> wrote:
>
> another thought about a compositor implementation detail I would like
> to ask you all is about the blending space.
>
> If the compositor blending space was CIE XYZ with direct (linear)
> encoding to IEEE754 32-bit float values in pixels, with the units of Y
> chosen to match an absolute physical luminance value (or something that
> corresponds with the HDR specifications), would that be sufficient for
> all imaginable and realistic color reproduction purposes, HDR included?

CIE XYZ doesn't really have limits, per se. It's always possible to
just add more photons, even if things start catching fire.

You can pick sRGB/Rec.709 primaries and define points inside or
outside those primaries, with 32-bit FP precision. That was the
rationale behind the scRGB color space.
https://en.wikipedia.org/wiki/ScRGB
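
As a sketch of the idea (illustrative C, nothing here is from a real
API): in a linear float encoding relative to Rec.709 white, values
outside [0, 1] stay meaningful, which is what lets scRGB cover a
wider gamut and dynamic range while keeping Rec.709 primaries.

    /* scRGB-style pixels: Rec.709/sRGB primaries and white point,
     * linear light, float components, 1.0 = reference white. */
    #include <stdio.h>

    struct scrgb_pixel {
        float r, g, b;
    };

    int main(void)
    {
        /* In gamut, in range: an ordinary mid-gray. */
        struct scrgb_pixel gray = { 0.5f, 0.5f, 0.5f };

        /* Out of gamut: a green more saturated than the Rec.709
         * primary shows up as a negative red component. */
        struct scrgb_pixel wide_green = { -0.2f, 1.0f, 0.1f };

        /* Over range: an HDR highlight at 4x reference white. */
        struct scrgb_pixel highlight = { 4.0f, 4.0f, 4.0f };

        printf("%f %f %f\n", gray.r, wide_green.r, highlight.r);
        return 0;
    }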

OpenEXR assumes Rec.709 primaries if none are specified, but offers
quite a bit more dynamic range than scRGB.
http://www.openexr.com/documentation/TechnicalIntroduction.pdf
http://www.openexr.com/documentation/OpenEXRColorManagement.pdf

An advantage of starting out with constraints is that you can much
more easily implement lower precision levels, like 16-bit-per-channel
float or even integer.
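
For instance, a constrained [0, 1] range quantizes to integer
trivially, at the cost of clipping exactly the values an
unconstrained space is meant to keep (a minimal sketch, not from any
real codebase):

    #include <stdint.h>
    #include <stdio.h>

    /* Quantize a linear [0, 1] float channel to a 16-bit integer. */
    static uint16_t linear_to_u16(float v)
    {
        if (v < 0.0f) v = 0.0f;  /* out-of-gamut values are clipped */
        if (v > 1.0f) v = 1.0f;  /* over-range (HDR) values are clipped */
        return (uint16_t)(v * 65535.0f + 0.5f);
    }

    int main(void)
    {
        printf("%d %d %d\n", linear_to_u16(-0.2f),
               linear_to_u16(0.5f), linear_to_u16(4.0f));
        return 0;
    }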

> Or do I have false assumptions about HDR specifications and they do
> not define brightness in physical absolute units but somehow in
> relative units? I think I saw "nit" as the unit somewhere which is an
> absolute physical unit.

It depends on which part of the specifications you're looking at. The
reference environment and reference medium are definitely defined in
absolute terms. The term "nit" is the same thing as the candela per
square meter (cd/m^2), and that's the unit for luminance. Display
black luminance and white luminance use this unit. The environment
will use the SI unit lux. The nit is used for projected light, and
lux is used for light incident on or emitted from a surface (ceiling,
walls, floor, etc.).
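
For a rough feel of how the two units relate: an ideal (100%
reflective, Lambertian) surface under E lux has a luminance of
E / pi cd/m^2. For example (illustrative C, numbers made up):

    #include <math.h>
    #include <stdio.h>

    /* Luminance of an ideal diffuse reflector under a given
     * illuminance, scaled by the surface's reflectance. */
    static double diffuse_luminance(double lux, double reflectance)
    {
        return lux * reflectance / M_PI;
    }

    int main(void)
    {
        /* A 500 lux office wall at 50% reflectance: ~80 nits. */
        printf("%.1f cd/m^2\n", diffuse_luminance(500.0, 0.5));
        return 0;
    }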

In the SDR world, including an ICCv4 world, the display class profile
uses relative values: lightness, not luminance. Even when encoding
XYZ, the values are all relative to that display's white, where Y =
1.0. So yeah, for HDR that information is useless, which is one of
the gotchas with ICC display class profiles. There are optional tags,
defined in the spec for many years now, to include measured display
black and white luminance. For HDR applications it would seem that
information would have to be required. Another gotcha that has been
mostly sorted out, I think, is whether the measurements are so-called
"contact" or "no contact" measurements: a contact measurement won't
account for veiling glare, which is the effect of ambient light
reflecting off the surface of the display, thereby increasing the
display's effective black luminance. A no-contact measurement will
account for it. You might think the no-contact measurement is better.
Well, yeah, maybe in a production environment where everything is
measured and stabilized.
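
To illustrate why those tags matter, here is a minimal sketch of
rescaling media-relative Y to absolute luminance, assuming measured
white and black levels are available from the profile (no real ICC
library API is used, and the linear fold-in of black is an
assumption):

    #include <stdio.h>

    int main(void)
    {
        double relative_Y = 0.18;   /* e.g. 18% gray, white Y = 1.0 */
        double white_nits = 100.0;  /* from a measured-white tag */
        double black_nits = 0.2;    /* from a measured-black tag */

        /* Recover absolute luminance from the relative value. */
        double absolute = black_nits +
                          relative_Y * (white_nits - black_nits);
        printf("%.2f cd/m^2\n", absolute);
        return 0;
    }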

But in a home, you might actually want to estimate veiling glare and
apply it to a no-contact display black luminance measurement. Maybe
you have a setting in a player with simple ambient descriptors such
as "dark", "moderate", and "bright". The choices made for handling
HDR content in those cases are rather substantially different. And if
this could be done by polling an inexpensive sensor in the
environment, for example a camera on the display, so much the better.
Maybe.
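
A back-of-the-envelope sketch of that estimate, treating the screen's
front surface as a weak diffuse reflector; the descriptor-to-lux
mapping here is invented purely for illustration:

    #include <math.h>
    #include <stdio.h>

    /* Veiling glare: ambient light reflected off the screen. */
    static double glare_nits(double ambient_lux, double reflectance)
    {
        return ambient_lux * reflectance / M_PI;
    }

    int main(void)
    {
        double panel_black = 0.05;  /* nits, glare-free measurement */
        double reflectance = 0.02;  /* 2% front-surface reflectance */
        double ambient_lux[] = { 5.0, 100.0, 500.0 };
        const char *name[] = { "dark", "moderate", "bright" };

        for (int i = 0; i < 3; i++)
            printf("%-8s effective black: %.3f cd/m^2\n", name[i],
                   panel_black + glare_nits(ambient_lux[i],
                                            reflectance));
        return 0;
    }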

> It might be heavy to use, both storage wise and computationally, but I
> think Weston should start with a gold standard approach that we can
> verify to be correct, encode the behaviour into the test suite, and
> then look at possible optimizations by looking at e.g. other blending
> spaces or opportunistically skipping the blending space.
>
> Would that color space work universally from the colorimetry and
> precision perspective, with any kind of gamut one might want/have, and
> so on?

The compositor is doing what kind of blending, for what purpose? I'd
expect any professional video rendering software to do this in its
own defined color space, encoding, and precision - and it all happens
internally. It might make for a nice API, though, so that
applications don't have to keep reinventing that particular wheel
internally.
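
For concreteness, this is the shape of "blend in a linear space" for
the sRGB case - decode, blend, re-encode. Just a sketch of the
operation under discussion, not Weston code:

    #include <math.h>
    #include <stdio.h>

    static float srgb_decode(float c)
    {
        return c <= 0.04045f ? c / 12.92f
                             : powf((c + 0.055f) / 1.055f, 2.4f);
    }

    static float srgb_encode(float l)
    {
        return l <= 0.0031308f ? l * 12.92f
                               : 1.055f * powf(l, 1.0f / 2.4f) - 0.055f;
    }

    /* src over dst, one channel, alpha in [0, 1]. */
    static float blend_over(float src, float dst, float alpha)
    {
        float s = srgb_decode(src);
        float d = srgb_decode(dst);
        return srgb_encode(s * alpha + d * (1.0f - alpha));
    }

    int main(void)
    {
        /* 50% white over black comes out ~0.735 encoded, not 0.5,
         * because the blend happens in linear light. */
        printf("%.3f\n", blend_over(1.0f, 0.0f, 0.5f));
        return 0;
    }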

In the near term, do you really expect to need blending beyond
Rec.2020/Rec.2100? Rec.2020/Rec.2100 is not so big that transforms to
Rec.709 will require special gamut mapping consideration. But I'm open
to other ideas.
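
For reference, the linear-light primary conversion itself is just a
3x3 matrix; the coefficients below are the ones commonly quoted from
ITU-R BT.2087. Components landing outside [0, 1] are where the gamut
mapping question would come up:

    #include <stdio.h>

    /* Linear-light Rec.2020 RGB to Rec.709 RGB. */
    static void rec2020_to_rec709(const float in[3], float out[3])
    {
        static const float m[3][3] = {
            {  1.6605f, -0.5876f, -0.0728f },
            { -0.1246f,  1.1329f, -0.0083f },
            { -0.0182f, -0.1006f,  1.1187f },
        };

        for (int i = 0; i < 3; i++)
            out[i] = m[i][0] * in[0] + m[i][1] * in[1] +
                     m[i][2] * in[2];
    }

    int main(void)
    {
        /* The Rec.2020 green primary lands outside Rec.709:
         * negative R and B components. */
        float green[3] = { 0.0f, 1.0f, 0.0f }, out[3];
        rec2020_to_rec709(green, out);
        printf("%.4f %.4f %.4f\n", out[0], out[1], out[2]);
        return 0;
    }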

Blender, DaVinci, Lightworks, GIMP or GEGL, and Darktable folks might
have some input here.

-- 
Chris Murphy

