Steam Deck integrated display: saturation boosting or reduction?

Pekka Paalanen ppaalanen at gmail.com
Tue Nov 14 10:09:18 UTC 2023


On Tue, 7 Nov 2023 14:28:13 +0000
Joshua Ashton <joshua at froggi.es> wrote:

> On 11/6/23 10:21, Pekka Paalanen wrote:
> > On Sat, 4 Nov 2023 13:11:02 +0000
> > Joshua Ashton <joshua at froggi.es> wrote:
> >   
> >> Hello,  
> > 
> > Hi Joshua,
> > 
> > thanks for replying. The reason I started this discussion is that I
> > would like us to come to the same page on terminology and what the math
> > actually means. I cannot and do not want to claim anything wrong in
> > your math or implementation, but I would like to describe it from a
> > certain point of view.  
> 
> Sure, I can see your perspective -- it really depends which way you are 
> looking at the color pipeline.
> 
> Speaking strictly from a math perspective we do a CTM of Hypothetical 
> Display -> Native Primaries near the grey axis and smoothly decrease the 
> influence of that operation near gamut edges to get the image we put on 
> screen.
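
(For concreteness, a minimal sketch of how a CTM's influence could be
faded out away from the grey axis; all names and the linear ramp here
are made up for illustration, the real gamescope code is linked further
below.)

#include <algorithm>

struct vec3 { float r, g, b; };

// 0.0 on the grey axis, 1.0 for a fully saturated input.
static float saturation(const vec3 &c)
{
        float lo = std::min({ c.r, c.g, c.b });
        float hi = std::max({ c.r, c.g, c.b });
        return hi > 0.f ? (hi - lo) / hi : 0.f;
}

// Full CTM influence near grey, smoothly fading to none at the gamut edge.
static vec3 blend_toward_ctm(const vec3 &orig, const vec3 &ctm_mapped,
                             float sat_min, float sat_max)
{
        float t = std::clamp((saturation(orig) - sat_min) /
                             (sat_max - sat_min), 0.f, 1.f);
        float w = 1.f - t;
        return vec3{ orig.r + (ctm_mapped.r - orig.r) * w,
                     orig.g + (ctm_mapped.g - orig.g) * w,
                     orig.b + (ctm_mapped.b - orig.b) * w };
}
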
> 
> >   
> >> The existing behaviour before any of our colour work was that the native
> >> display's primaries were being used for SDR content. (Ie. just scanning
> >> out game's buffer directly)  
> > 
> > Ok. That means it was using an implicit conversion from whatever the
> > games were emitting into the display's native primaries. Using an
> > (implicit) identity CTM unavoidably changes colorimetry when the source
> > and destination color spaces differ.  
> 
> Yeah.
> 
> If you're looking at the bigger picture with that implicit conversion on 
> the display side, you can say it's actually saturation reduction from 
> our hypothetical display to fit onto our modest gamut internal display.
> 
> But...
> 
> If you look at the pixels we are sending to the display compared to what 
> we were before -- we're increasing the saturation non-linearly, and 
> that's also the result that the users will see.

That kind of reference never even occurred to me. I didn't realize
there was a "before" for your WCG/HDR support.

You're not describing your color pipeline (from input to output),
you're comparing the results from before to now.

Alright.

> The thing we are changing here with this work is what that "hypothetical 
> display" is.
> Previously it was the same as the internal display, but we changed that 
> in order to achieve a saturation boost to compensate for the modest 
> gamut of the display.

I'm confused by this statement, because what the source and destination
displays are is orthogonal to the color adjustments you want to do.
Both can be parametrised and then controlled by the same variables, of
course.

Quoting from https://color.org/ICC_white_paper_9_workflow.pdf :

"The first, coordinate transformation, relates device color code values
to colorimetric code values in the PCS. The second, color rendering or
color re-rendering, changes the colorimetry of an original to be better
suited for some particular reproduction medium."

In other words, we have color coordinate transformations that are
directly determined by the source and destination color space
definitions, and then we have the re-rendering that makes things look
better for the purpose. Both could be squashed into a single
mathematical operation, but conceptually we always need to consider
them separately to avoid confusion.
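
A rough sketch of that separation, with made-up names (not any real
API, just the two steps written out explicitly):

#include <array>

using vec3 = std::array<float, 3>;
using mat3 = std::array<vec3, 3>;

static vec3 mul(const mat3 &M, const vec3 &c)
{
        return { M[0][0]*c[0] + M[0][1]*c[1] + M[0][2]*c[2],
                 M[1][0]*c[0] + M[1][1]*c[1] + M[1][2]*c[2],
                 M[2][0]*c[0] + M[2][1]*c[1] + M[2][2]*c[2] };
}

// Re-rendering: a toy perceptual tweak that scales distance from grey.
static vec3 boost_saturation(const vec3 &c, float amount)
{
        float grey = (c[0] + c[1] + c[2]) / 3.f;
        return { grey + (c[0] - grey) * amount,
                 grey + (c[1] - grey) * amount,
                 grey + (c[2] - grey) * amount };
}

static vec3 pipeline(const mat3 &src_to_dst, const vec3 &in, float vibrance)
{
        // Coordinate transformation: determined purely by the source
        // and destination color space definitions.
        vec3 c = mul(src_to_dst, in);
        // Re-rendering: determined by the rendering goal, not by the
        // color spaces themselves.
        return boost_saturation(c, vibrance);
}

Both steps can of course be baked into a single matrix or 3D LUT in the
end; the point is only that they answer different questions.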

>  From the other end, if there was a wide gamut display attached to a 
> Deck, and we wanted to display SDR content as 709 on it, I would not 
> call that "saturation boosting" or a "saturation increase" myself.
> Would you?

I don't know what rendering goal you are thinking of.

If there was a wide gamut display connected, then in this day and age
it would likely use BT.2020 signalling if not BT.2100. If I had BT.709
content to show:

- If I wanted to use ICC-absolute rendering intent, I would do the
  colorimetric coordinate transformation from BT.709 to BT.2020 only.

- If I wanted to use media-relative rendering intent, I would do the
  colorimetric coordinate transformation + chromatic adaptation (moot
  here, since white point is the same).

- If I wanted to use perceptual rendering intent, I would probably end
  up doing a slight or moderate saturation boosting on top of the
  colorimetric coordinate transformation + chromatic adaptation,
  because people tend to like more colorfulness I guess. Or give a
  slider like you did.

Moderate boosting would still not reach the limits of BT.2020
at all, so clipping cannot happen unless the monitor itself clips to
something smaller than BT.2020 without our knowledge.

If we are driving a traditional WCG monitor whose signalling uses the
display native primaries, I would need to know the native primaries and
use them instead of BT.2020 in the above.
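
To make the difference between those intents concrete, a tiny sketch
with made-up names (not any real API), for BT.709 content going to a
BT.2020-signalled display:

enum class intent { icc_absolute, media_relative, perceptual };

struct pipeline_t {
        bool coordinate_transform;  // BT.709 -> BT.2020 primaries
        bool chromatic_adaptation;  // e.g. Bradford; moot if both are D65
        bool saturation_boost;      // the perceptual re-rendering part
};

static pipeline_t build_pipeline(intent i)
{
        switch (i) {
        case intent::icc_absolute:   return { true, false, false };
        case intent::media_relative: return { true, true,  false };
        case intent::perceptual:     return { true, true,  true  };
        }
        return {};
}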

If you want to look at the numbers, then the BT.709->BT.2020 conversion
pulls all RGB tuples closer to the gray axis. Even though the numbers
change, the colorimetry is exactly the same, so saturation (in absolute
terms) does not change in this step.
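
As a quick numeric illustration, using the standard linear-light
BT.709 -> BT.2020 conversion matrix from ITU-R BT.2087 (values rounded;
this little program is only for this email):

#include <cstdio>

int main()
{
        // BT.709 -> BT.2020 primary conversion, linear RGB (ITU-R BT.2087).
        const float M[3][3] = {
                { 0.6274f, 0.3293f, 0.0433f },
                { 0.0691f, 0.9195f, 0.0114f },
                { 0.0164f, 0.0880f, 0.8956f },
        };
        const float in[3] = { 1.f, 0.f, 0.f };  // fully saturated BT.709 red
        float out[3];

        for (int i = 0; i < 3; i++)
                out[i] = M[i][0] * in[0] + M[i][1] * in[1] + M[i][2] * in[2];

        // Prints roughly 0.627 0.069 0.016: the very same color, but the
        // BT.2020-encoded tuple sits much closer to the grey axis.
        printf("%.3f %.3f %.3f\n", out[0], out[1], out[2]);
        return 0;
}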


> We also use the same SDR Gamut Wideness slider for HDR and Wide Gamut 
> displays too fwiw.
> 
> I also don't believe either perspective is wrong -- it just depends what 
> way you look at it.

Sure. There are ways most people look at things, and then there are
ways only very few people look at things. When someone says something
from an unexpected point of view, or uses words to mean something
different than others do, discussions tend to heat up even when
everyone actually agrees. People misinterpret each other, and it takes
a lot of effort to work out what was actually said.

Imagine doing linear algebra with a notation where every matrix and
vector is the transpose of what we are used to thinking of. Everything
is still correct as long as you write the equations in reverse order
(or not even reversed, if operations like matrix multiplication have
been redefined to work the other way around). I think I've seen an API
like that.

Thanks for the explanations below. I think our fundamental
misunderstanding was just the before/after code changes vs. from/to
color spaces.


Thanks,
pq

> > 
> > If games were graded to produce BT.2020 encoding, this would amount to
> > extreme saturation reduction. If games were graded to produce sRGB
> > (perhaps implicitly, like almost all SDR apps do), this would result in
> > some saturation reduction.
> > 
> > When I say "graded", I mean something that can also happen implicitly
> > by a developer thinking: "it looks good on my monitor, and I'd like
> > everyone to see it like I do". The developer may not even think about
> > it. We have no knowledge of what the content was graded for, but it
> > still happened.  
> 
> Yeah, with our "hypothetical display" we are just picking something we 
> think the true intent probably was.
> 
> We actually have no knowledge of it.
> 
> >   
> >> Games are not submitting us any primaries for the buffers they are sending.
> >> I mean they are saying they are sRGB so "technically 709", but
> >> colorimetry for SDR content (outside of mastering) is very wishy-washy.  
> > 
> > Right.
> >   
> >> Deck Display Info:
> >> static constexpr displaycolorimetry_t displaycolorimetry_steamdeck_spec
> >> {
> >> 	.primaries = { { 0.602f, 0.355f }, { 0.340f, 0.574f }, { 0.164f, 0.121f } },
> >> 	.white = { 0.3070f, 0.3220f },  // not D65
> >> };
> >>
> >> static constexpr displaycolorimetry_t displaycolorimetry_steamdeck_measured
> >> {
> >> 	.primaries = { { 0.603f, 0.349f }, { 0.335f, 0.571f }, { 0.163f, 0.115f } },
> >> 	.white = { 0.296f, 0.307f }, // not D65
> >> };
> >>
> >> https://github.com/ValveSoftware/gamescope/blob/master/src/color_helpers.h#L451
> >>
> >> For the rest of this, consider displaycolorimetry_steamdeck_measured to
> >> be what we use for the internal display.
> >>
> >> To improve the rendering of content on the Deck's internal display with
> >> the modest gamut, we go from the display's native primaries (sub 709) to
> >> somewhere between the native primaries (0.0) and a hypothetical wider
> >> gamut display (1.0) that we made up.  
> > 
> > What do you mean by "we go from ... to ..."? What does "go" stand for
> > here?
> > 
> > Do I understand right, that you choose the target primaries from
> > somewhere between the display's native primaries and the below
> > hypothetical display's wider gamut primaries, based on end user
> > vibrance setting?
> > 
> > It's really curious if it indeed is *target* primaries for your color
> > transformation, given that you are still driving the internal display.  
> 
> Bleh, speaking of color stuff in words is always so confusing.
> 
> When I said "go", I meant in terms of the effect on the output, not the
> actual transformation. The actual mathematical transformation on the 
> pixels would be the inverse of what I said for "go".
> 
> Ie. CTM of "Hypothetical display" To XYZ * XYZ To Native Primaries.
> 
> >   
> >> The hypothetical display's primaries were decided based by making
> >> content look appealing:
> >> static constexpr displaycolorimetry_t displaycolorimetry_widegamutgeneric
> >> {
> >> 	.primaries = { { 0.6825f, 0.3165f }, { 0.241f, 0.719f }, { 0.138f, 0.050f } },
> >> 	.white = { 0.3127f, 0.3290f },  // D65
> >> };
> >>
> >> We have a single knob for this in the UI, in code it's "SDR Gamut
> >> Wideness", but known in the UI as "Color Vibrance". It's the knob that
> >> picks the target color gamut that gets mapped to the native display.
> >>
> >> This is how that single value interacts to pick the target primaries:
> >>
> >> https://github.com/ValveSoftware/gamescope/blob/master/src/color_helpers.cpp#L798
> >>
> >> We then use the result there to do a simple saturation fit based on the
> >> knob and some additional parameters that control how we interpolate.
> >> (blendEnableMinSat, blendEnableMaxSat, blendAmountMin, blendAmountMax)
> >>
> >> Those parameters also change with the SDR Gamut Wideness value, based on
> >> things that "look nice". :P
> >>
> >> https://github.com/ValveSoftware/gamescope/blob/master/src/color_helpers.cpp#L769  
> > 
> > Alright, that's another part of the color transformation you do, the
> > one with the most "perceptual intent" in it, I'd say.  
> 
> Yes :P
> 
> >   
> >> We also do some other things like Bradford chromatic adaptation to fix
> >> the slightly-off whitepoint too.
> >>
> >> We use all this to generate a 3D LUT with that saturation fit, chromatic
> >> adaptation and use Shaper + 3D LUT at scanout time to apply it.
> >> (We also have a shader based fallback path)
> >>
> >> The goal of all of this work is less 'color accuracy' and more 'making
> >> the display more in line with consumer expectations'.
> > 
> > Yes, that is called "perceptual rendering intent" and it is 'color
> > accurate' in that context, assuming your goal is to give the user an
> > experience as close as possible to a highly capable display or to the
> > game developer's (mastering) display. You might not know the facts
> > of what it should look like, but you do your best anyway.
> >   
> >> We wanted to try and make the display appear much more 'vivid' and
> >> colourful without introducing horrible clipping.
> >>
> >> We also use this same logic for wider gamut displays (where 0.0 = sRGB
> >> and 1.0 = native) and for SDR content on HDR.
> >>
> >> Hope this helps!  
> > 
> > Very much, but I still have a question: what primaries do you
> > associate with game content pixels?  
> 
> The hypothetical display's.
> 
> > 
> > How is the matrix dest_from_source in
> > https://github.com/ValveSoftware/gamescope/blob/master/src/color_helpers.cpp#L701
> > formed, what are its inputs?  
> 
> Source is what we want for the buffer, ie. the hypothetical display or 
> 2020 for PQ/scRGB content.
> 
> Dest is the native display's primaries (or for HDR10, the container's so 
> 2020).
> We call it "outputEncodingColorimetry".
> 
> - Joshie 🐸✨
> 
> > 
> > This whole email started from you claiming to do saturation boosting,
> > while you are obviously doing saturation reduction in total. Turns out
> > the saturation boosting (yes, you do that) is not the whole color
> > transformation you do. So we're simply talking about different portions
> > of the color transformation and both claims are right, I believe.
> > 
> > 
> > Thanks,
> > pq
> >   
> >> On 11/3/23 13:00, Pekka Paalanen wrote:  
> >>> This is a continuation of
> >>> https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/14#note_2152254
> >>> because this is off-topic in that thread.
> >>>      
> >>>> No, we did widening. The Deck's internal display has a modest gamut
> >>>> that is < 71% sRGB.  
> >>>
> >>> If games do wide (well, full sRGB or wider) gamut, then why would you
> >>> need to make that gamut even wider to fit nicely into a significantly
> >>> smaller gamut display?
> >>>
> >>> Here's what I think happened.
> >>>
> >>> You have a game that produces saturation up to P3, let's say. When you
> >>> did the colorimetrically correct matrix conversion (CTM) from BT.2020
> >>> to the "modest gamut", you found out that it is horribly clipping
> >>> colors, right?
> >>>
> >>> If you then removed that CTM, it means that you are
> >>> re-interpreting BT.2020 RGB encoding *as if* it was "modest gamut" RGB
> >>> encoding. This happens if you simply apply the input image EOTF and
> >>> then apply the display inverse-EOTF and do nothing to the color gamut
> >>> in between. Adjusting dynamic range does not count here. This is an
> >>> extreme case of saturation reduction.
> >>>
> >>> (Note: Doing nothing to the numbers amounts to applying a major semantic
> >>> operation. Like telling someone something in cm and they take that
> >>> number in mm instead. Or metric vs. imperial units. Color space
> >>> primaries and white point define the units for RGB values, and if you
> >>> have other RGB values, they are not comparable without the proper CTM
> >>> conversion.)
> >>>
> >>> That does not look good either, so after that re-interpretation you
> >>> added saturation boosting that nicely makes use of the capabilities of
> >>> the integrated display's "modest gamut" so that the image looks more
> >>> "vibrant" and less de-saturated. However, the total effect is still
> >>> saturation reduction, because the re-interpretation of the game content
> >>> RGB values is such a massive saturation reduction that your boosting
> >>> does not overcome it.
> >>>
> >>> I could make up an analogy: Someone says they are making all sticks
> >>> 50% longer than what you ask. You ask them to make a stick 100 long.
> >>> They give you a stick that you measure to be 15 long, and they still
> >>> claim it is 50% longer than what you asked. How is this possible? The
> >>> length spaces are different: you were thinking and measuring in cm,
> >>> they did mm. They did give you a stick of 150 mm, which is 50% longer
> >>> than 100 mm. But from your perspective, the stick is 85% smaller than
> >>> you asked. If one had started by converting to a mutual length space
> >>> first (referring to the correct CTM), there would be an initial
> >>> agreement of how long 1 is.
> >>>
> >>>
> >>> Thanks,
> >>> pq  
> >>  
> >   
