[RFC wayland-protocols v2 0/1] Color Management Protocol

Erwin Burema e.burema at gmail.com
Thu Mar 14 12:27:45 UTC 2019


On Thu, 14 Mar 2019 at 12:29, Pekka Paalanen <ppaalanen at gmail.com> wrote:
>
> On Wed, 13 Mar 2019 15:50:40 +0100
> Erwin Burema <e.burema at gmail.com> wrote:
>
> > Hi,
> > On Wed, 13 Mar 2019 at 10:06, Pekka Paalanen <ppaalanen at gmail.com> wrote:
> > >
> > > On Tue, 12 Mar 2019 17:02:53 +0100
> > > Erwin Burema <e.burema at gmail.com> wrote:
> > >
> > > > Hi,
> > > >
> > > > Comments inline
> > > > On Tue, 12 Mar 2019 at 14:01, Pekka Paalanen <ppaalanen at gmail.com> wrote:
> > > > >
> > > > > On Thu, 7 Mar 2019 12:37:52 +0100
> > > > > Erwin Burema <e.burema at gmail.com> wrote:
> > > > >
> > > > > > On Wed, Mar 6 17:09:27 UTC 2019, Sebastian Wick wrote:
> > > > > > >...
> > > > > >
> > > > > > > 2. The whole pipeline should look something like
> > > > > > >
> > > > > > >   [surface cs] -cs conversion-> [output cs] -tone mapping-> [output cs]
> > > > > > >    -degamma-> [output linear cs] -blending-> [output linear cs] -gamma->
> > > > > > >    [output cs].
> > > > > > >
> > > > > > >    Where some parts can be skipped if e.g. surface cs == output cs or
> > > > > > >    surface and output are SDR.
> > > > > >
> > > > > > For expert/pro applications this is probably indeed the pipeline
> > > > > > needed. The only thing that troubles me here is that there is no
> > > > > > guarantee that an output color space is well behaved, so it might
> > > > > > still contain some non-linearities or other issues after the degamma
> > > > > > process[1]. So for non-expert/non-pro applications that still care
> > > > > > about color to some extent, a better pipeline might be
> > > > > >
> > > > > >   [surface cs] -cs conversion-> [blending scene referred cs] -blending->
> > > > > >    [blending scene referred cs] -tone mapping-> [blending output referred cs]
> > > > > >    -cs conversion-> [output cs]
> > > > > >
> > > > > > Of course, since this requires two cs conversion steps this is not
> > > > > > ideal for anything pro, but the only blending operation that is
> > > > > > guaranteed to be artifact free in the output cs is bit blitting,
> > > > > > which is not something we want to limit compositors to.
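
To make the difference concrete, the two pipelines could be written out
roughly like this (per-pixel pseudo-C; every function name here is made up
purely for illustration, none of this is proposed API):

  /* Pipeline as proposed above: blend in the output space after degamma */
  px = convert_cs(px, surface_cs, output_cs);
  px = tone_map(px, output_cs);
  px = degamma(px, output_cs);        /* output cs -> (hopefully) linear light */
  px = blend(px, dst, alpha);
  px = gamma(px, output_cs);

  /* Alternate pipeline: blend in a dedicated scene referred space,
   * at the cost of a second cs conversion */
  px = convert_cs(px, surface_cs, blend_cs);
  px = blend(px, dst, alpha);
  px = tone_map(px, blend_cs);        /* scene referred -> output referred */
  px = convert_cs(px, blend_cs, output_cs);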
> > > > >
> > > > > Hi Erwin,
> > > > >
> > > > > did you consider that the blending we talk about here is only alpha
> > > > > blending with alpha in [0, 1] and 1-alpha coefficients and nothing
> > > > > else? I forget who recently noted that alpha blending shouldn't cause
> > > > > any problems since mathematically the result cannot exceed the source
> > > > > space.
> > > > >
> > > > > Could you elaborate on the potential problems with the proposed
> > > > > pipeline more?
> > > > >
> > > >
> > > > Yes, alpha blending can't exceed the color space, but it can lead to
> > > > color artifacts (if you ever edited/created a picture in sRGB and saw
> > > > some dark/black borders, that is one possible effect).
> > > >
> > > > Some of the problems of blending colors in non-linear spaces are
> > > > described here:
> > > > https://ninedegreesbelow.com/photography/linear-gamma-blur-normal-blend.html
> > > > (not my site, has lots of pictures which hopefully makes the
> > > > explanation a bit better)
> > > >
> > > > Theoretically this should be solved by degamma, but there is no
> > > > guarantee that after that operation a profile is well behaved (see
> > > > https://ninedegreesbelow.com/photography/well-behaved-profile.html);
> > > > it might be neither white balanced nor normalized, in which case you
> > > > will get similar artifacts as when working in a non-linear space.
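
To make the kind of artifact I keep referring to a bit more tangible, here
is a minimal, standalone sketch using only the standard sRGB transfer
function (nothing compositor specific, and the 50/50 blend of a single
channel is just an example):

  #include <math.h>
  #include <stdio.h>

  /* standard sRGB EOTF and its inverse (encoded <-> linear light) */
  static double srgb_to_linear(double v)
  {
          return v <= 0.04045 ? v / 12.92 : pow((v + 0.055) / 1.055, 2.4);
  }

  static double linear_to_srgb(double v)
  {
          return v <= 0.0031308 ? v * 12.92 : 1.055 * pow(v, 1.0 / 2.4) - 0.055;
  }

  int main(void)
  {
          /* 50/50 blend of a channel that is 1.0 in one surface and 0.0
           * in the other */
          double on_encoded = 0.5 * 1.0 + 0.5 * 0.0;
          double on_linear  = linear_to_srgb(0.5 * srgb_to_linear(1.0) +
                                             0.5 * srgb_to_linear(0.0));

          /* prints roughly 0.500 vs 0.735: the encoded-space blend comes
           * out visibly darker, which is where the dark fringes come from */
          printf("blend on encoded values: %.3f, on linear light: %.3f\n",
                 on_encoded, on_linear);
          return 0;
  }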
> > > >
> > > > > Do you mean that the degamma mapping computed from the output color
> > > > > profile might be so far off, that alpha blending would produce a
> > > > > visibly wrong color on the monitor?
> > > > >
> > > >
> > > > Yes, see above. I am also not sure if a degamma is always possible
> > > > (especially with LUT based profiles); Graeme Gill can probably give a
> > > > more complete answer here. Although this will be a bigger issue on
> > > > cheap consumer monitors than on expensive pro ones.
> > > >
> > > >
> > > > > I would lean on taking that risk though, to have support for "pro
> > > > > applications". People would definitely be upset if "pro apps" were not
> > > > > properly supported, while alpha blending is usually just for visual
> > > > > special effects like rounded corners or fade-in/out animations and
> > > > > other less important window system gimmicks. I also do not see the
> > > > > latter important enough to warrant implementing both ways in a
> > > > > compositor.
> > > > >
> > > > If you can be sure alpha blending is only used for effects and nothing
> > > > else, this is an acceptable trade-off. However, blending HDR and non-HDR
> > > > content should probably be done in a scene referred color space, which
> > > > the output one isn't (even after degamma), which means that in some
> > > > cases you might need/want the second way anyway. That said,
> > > > pro applications that deal with both HDR and non-HDR content should
> > > > probably take care of the critical parts of the blending themselves and
> > > > output in HDR (so only the user interface would then be non-HDR).
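
Roughly what I picture for lifting SDR content into such a scene referred
blend space is the sketch below (only an illustration; it reuses the sRGB
EOTF helper from the sketch further up, and anchoring SDR white at
203 cd/m2 is nothing more than an example value, not something I am
proposing to mandate):

  #include <math.h>

  struct rgb { double r, g, b; };

  /* sRGB EOTF, same helper as in the earlier sketch */
  static double srgb_to_linear(double v)
  {
          return v <= 0.04045 ? v / 12.92 : pow((v + 0.055) / 1.055, 2.4);
  }

  /* Lift an SDR, display referred pixel into a linear, scene referred
   * blend space: degamma first, then anchor SDR white at some absolute
   * luminance (e.g. 203 cd/m2, purely as an example). */
  static struct rgb sdr_to_scene(struct rgb enc, double sdr_white_nits)
  {
          struct rgb out = {
                  srgb_to_linear(enc.r) * sdr_white_nits,
                  srgb_to_linear(enc.g) * sdr_white_nits,
                  srgb_to_linear(enc.b) * sdr_white_nits,
          };
          return out;
  }

HDR surfaces would be brought into the same space through their own EOTF,
blending happens there, and tone mapping down to the output comes last.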
> > >
> > > Hi Erwin,
> > >
> > > the problem with the alternate color pipeline you proposed is that it
> > > has two color space transformations. It is the pipeline I originally
> > > assumed would be needed, but then Graeme and Chris convinced me
> > > otherwise, and I was happy because it simplified things.
> > >
> >
> > That is why we need both pipelines: one for the non-critical path,
> > where we can alpha blend, and one for the critical path, where we can
> > only blit.
>
> Hi Erwin,
>
> IOW, for the color-critical path you would forbid the use of pixel
> formats with an alpha channel?
>
Yes, indeed.
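
On the compositor side I imagine something as simple as the sketch below
(not existing Weston code; the flag and the list of allowed formats are
made up for illustration):

  #include <stdbool.h>
  #include <stdint.h>
  #include <drm_fourcc.h>

  /* Only let alpha-less pixel formats through on surfaces that asked
   * for the color-critical (blit only) path. */
  static bool
  commit_allowed(bool color_critical, uint32_t drm_format)
  {
          if (!color_critical)
                  return true;    /* non-critical path: blending allowed */

          switch (drm_format) {
          case DRM_FORMAT_XRGB8888:
          case DRM_FORMAT_XRGB2101010:
                  return true;    /* no alpha channel, safe to blit */
          default:
                  return false;   /* ARGB and friends get refused */
          }
  }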

> > I would like to note that from a color
> > management point of view, gamma/degamma is also a color transform (a
> > linear space with the sRGB primaries is not the same thing as sRGB),
> > although one that is in many cases so easily reversible that we often
> > forget that this is the case.
>
> Yes, I know that sRGB is not sRGB-but-with-linear-light pixel values.
> However, I lack the knowledge and good terminology to refer to an
> "otherwise all the same except for the light level to pixel value
> mapping" color space. And even then I'd consider the pixel value to bit
> pattern (e.g. half-float, uint16, ...) mapping to be yet another
> orthogonal aspect (encoding?).
>

The most commonly used terms are linear vs non-linear. And yes, the exact
bit representation (uint8, uint16, half-float and float are the most
common) is related to the encoding and not so much to the colorspace; for
a colorspace you usually treat everything as a float in either the
[0.0, 1.0] (display referred) or [0.0, +inf) (scene referred) range.
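
For example, going from the stored bits to that nominal float value is
purely an encoding step and says nothing about the colorspace yet (trivial
sketch):

  #include <stdint.h>

  /* encoding -> nominal float value; primaries, white point and transfer
   * characteristic are a separate, orthogonal question */
  static double from_uint8(uint8_t v)   { return v / 255.0; }
  static double from_uint16(uint16_t v) { return v / 65535.0; }
  /* half-float and float pixels already carry the nominal value, and may
   * exceed 1.0 for scene referred content */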

> > > If we have the pipeline with a really separate blending space and need
> > > two different color space transformations, at least the question of how
> > > to pick the right intents for each comes up if I understood right.
> > >
> >
> > Yeah that is indeed a problem
>
> I'd very much prefer if we didn't have to solve that problem.
>

I suspect that in practice the blend-space-to-output conversion will have
a constant intent that might be user selectable, and the only intent a
program can set will be the one for the conversion from its own space to
the blend space.
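
As a sketch of what I mean, with littlecms and treating the blend space as
just another profile (the file names and the particular intents are only
examples, not a proposal):

  #include <lcms2.h>

  static void make_transforms(void)
  {
          /* client surface -> blend space: intent chosen by the client */
          cmsHPROFILE surface_p = cmsOpenProfileFromFile("surface.icc", "r");
          cmsHPROFILE blend_p   = cmsOpenProfileFromFile("blend-space.icc", "r");
          cmsHTRANSFORM to_blend =
                  cmsCreateTransform(surface_p, TYPE_RGBA_FLT,
                                     blend_p, TYPE_RGBA_FLT,
                                     INTENT_PERCEPTUAL, 0);

          /* blend space -> output: one compositor-wide, possibly user
           * selectable, intent */
          cmsHPROFILE output_p = cmsOpenProfileFromFile("monitor.icc", "r");
          cmsHTRANSFORM to_output =
                  cmsCreateTransform(blend_p, TYPE_RGBA_FLT,
                                     output_p, TYPE_RGBA_FLT,
                                     INTENT_RELATIVE_COLORIMETRIC, 0);

          (void)to_blend;
          (void)to_output;
  }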

> > > Graeme, Chris, what do you think?
> > >
> > > Excellent web links by the way!
> > >
> >
> > Thanks, Elle Stone did a lot of work regarding color management for
> > GIMP which she noted down on her site, there is a lot more there that
> > might be of interest!
> >
> > Adding to all this, I would like to note that although I think that for
> > the best possible output we need both a critical and a non-critical
> > color path, I personally can live with some wonky alpha transparencies,
> > considering that most of the time those should be limited to effects.
> > I would hardly call that every frame perfect, though ;)
>
> Did the non-linearities from degamma not result purely because the
> output color profile was not good enough? That is, the ICC file is
> missing some optional parts or the data in them is not actually correct.
>

No, it comes from shitty cheap monitors. Alas, I am afraid that
explaining all this to the people who use those kinds of monitors will
not be easy.

> We could certainly put some requirements on output color profiles that
> a compositor will accept, though I think those should be compositor
> policy, not protocol specification.
>
Indeed, but as I said above, it mostly comes down to the fact that not
everybody can afford to spend at least 500 USD on a monitor.

> This may seem like finger-pointing, but if a compositor is given a less
> than perfect ICC profile for an output, there is no way it could ever
> make any frame perfect. Attempting to second-guess the profile will be
> futile in any case. I would not include the completely different
> blending space path, because it is trying to work around flawed data
> (the output color profile) in a way that will be wrong if the data was
> perfect.
>
> Weston has a policy to not work around driver bugs. I feel I can lump
> an inaccurate ICC file in the same heap.
>

Not inaccurate, just cheap monitors (the profile is accurate for the
cheap monitor; it is just that cheap monitors have a tendency to be
rather wonky color-wise), although you might consider that a HW bug.

> How hard is it to create an ICC profile that will achieve a truly
> light-linear space after degamma, up to humanly observable accuracy for
> alpha-blending purposes? Can you do that with almost any commercial
> measurement device including ColorHug, or would it need some expensive
> special equipment?
>

A more expensive monitor and any commercially available
profiler/calibrator should do the trick.

>
> Thanks,
> pq

Regards,

Erwin

