[RFC PATCH v3 1/6] drm/doc: Color Management and HDR10 RFC
Pekka Paalanen
ppaalanen at gmail.com
Thu Sep 23 08:01:30 UTC 2021
On Wed, 22 Sep 2021 11:06:53 -0400
Harry Wentland <harry.wentland at amd.com> wrote:
> On 2021-09-20 20:14, Harry Wentland wrote:
> > On 2021-09-15 10:01, Pekka Paalanen wrote:
> >> On Fri, 30 Jul 2021 16:41:29 -0400
> >> Harry Wentland <harry.wentland at amd.com> wrote:
> >>
>
> <snip>
>
> >>> +If a display's maximum HDR white level is correctly reported, it is trivial
> >>> +to convert between all of the above representations of SDR white level. If
> >>> +it is not, defining SDR luminance as a nits value, or as a ratio relative to
> >>> +a fixed nits value, is preferred, assuming we are blending in linear space.
> >>> +
> >>> +It is our experience that many HDR displays do not report maximum white
> >>> +level correctly
> >>
> >> Which value do you refer to as "maximum white", and how did you measure
> >> it?
> >>
> > Good question. I haven't played with those displays myself but I'll try to
> > find out a bit more background behind this statement.
> >
>
>
> Some TVs report the EOTF but not the luminance values.
> For example, here is an edid-decode capture of my eDP HDR panel:
>
>   HDR Static Metadata Data Block:
>     Electro optical transfer functions:
>       Traditional gamma - SDR luminance range
>       SMPTE ST2084
>     Supported static metadata descriptors:
>       Static metadata type 1
>     Desired content max luminance: 115 (603.666 cd/m^2)
>     Desired content max frame-average luminance: 109 (530.095 cd/m^2)
>     Desired content min luminance: 7 (0.005 cd/m^2)
>
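For reference, the cd/m^2 figures in parentheses follow the usual
CTA-861-G coding of the desired content luminance bytes (if I recall
the formulas right). A minimal sketch of that decoding, with helper
names of my own invention rather than anything from edid-decode:

#include <math.h>
#include <stdio.h>

/* CTA-861-G desired content luminance coding: max and max
 * frame-average code value CV maps to 50 * 2^(CV/32) cd/m^2,
 * min is derived from the decoded max luminance. */
static double cta_max_luminance(unsigned char cv)
{
	return 50.0 * pow(2.0, cv / 32.0);
}

static double cta_min_luminance(unsigned char cv, double max_lum)
{
	double f = cv / 255.0;

	return max_lum * f * f / 100.0;
}

int main(void)
{
	double max = cta_max_luminance(115);	/* ~603.666 cd/m^2 */
	double avg = cta_max_luminance(109);	/* ~530.095 cd/m^2 */
	double min = cta_min_luminance(7, max);	/* ~0.005 cd/m^2 */

	printf("max %.3f avg %.3f min %.5f cd/m^2\n", max, avg, min);
	return 0;
}
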
I forget where I heard (you, Vitaly, someone?) that integrated panels
may not have the magic gamut and tone mapping hardware, which means
that software (or display engine) must do the full correct thing.
That's another reason to not rely on magic display functionality, which
suits my plans perfectly.
> I suspect on those TVs it looks like this:
>
>   HDR Static Metadata Data Block:
>     Electro optical transfer functions:
>       Traditional gamma - SDR luminance range
>       SMPTE ST2084
>     Supported static metadata descriptors:
>       Static metadata type 1
>
> Windows has some defaults in this case and our Windows driver also has
> some defaults.
Oh, missing information. Yay.
> Using defaults in the 1000-2000 nits range would yield much better
> tone-mapping results than assuming the monitor can support a full
> 10k nits.
Obviously.
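Just to illustrate what such a default could look like on the parsing
side; the threshold and the names here are made up for illustration,
not what Windows or the AMD driver actually use, and this reuses
cta_max_luminance() from the sketch above:

#define FALLBACK_PEAK_NITS	1000.0	/* arbitrary pick from the 1000-2000 range */

/* If the HDR Static Metadata Data Block advertises ST 2084 but omits
 * the desired content max luminance byte, assume a conservative peak
 * instead of the full 10000 cd/m^2 that PQ can encode. */
static double effective_max_nits(int has_max_cv, unsigned char max_cv)
{
	if (has_max_cv)
		return cta_max_luminance(max_cv);

	return FALLBACK_PEAK_NITS;
}
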
> As an aside, recently we've come across displays where the max
> average luminance is higher than the max peak luminance. This is
> not a mistake but due to how the display's dimming zones work.
IOW, the actual max peak luminance in absolute units depends on the
current image average luminance. Wonderful, but what am I (the content
producer, the display server) supposed to do with that information...
> Not sure what impact this might have on tone-mapping, other than
> to keep in mind that we can assume that max_avg < max_peak.
*cannot
Seems like it would lead to a very different tone mapping algorithm
which needs to compute the image average luminance before it can
account for max peak luminance (which I wouldn't know how to infer). So
either a two-pass algorithm, or taking the average from the previous
frame.
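To make the "average from the previous frame" idea concrete, a very
rough sketch; the effective_peak() model of how dimming zones trade
peak against average and the compress() curve are both made up for
illustration, not taken from any spec, driver or measurement:

#include <stddef.h>

struct tonemap_state {
	double prev_avg_nits;	/* frame-average luminance of the previous frame */
};

/* Made-up model: the reachable peak slides from max_peak towards
 * max_avg as the average picture level rises. */
static double effective_peak(double avg_nits, double max_avg, double max_peak)
{
	double t = avg_nits / max_avg;

	if (t > 1.0)
		t = 1.0;

	return max_peak - t * (max_peak - max_avg);
}

/* Toy compression curve: near-identity for small values, asymptotically
 * approaching 'peak'; not any standard tone-mapping operator. */
static double compress(double nits, double peak)
{
	return peak * nits / (peak + nits);
}

/* Map one frame with the peak derived from last frame's average, and
 * accumulate this frame's average for the next one. */
static void tonemap_frame(struct tonemap_state *st,
			  const double *in_nits, double *out_nits,
			  size_t count, double max_avg, double max_peak)
{
	double peak = effective_peak(st->prev_avg_nits, max_avg, max_peak);
	double sum = 0.0;
	size_t i;

	for (i = 0; i < count; i++) {
		out_nits[i] = compress(in_nits[i], peak);
		sum += in_nits[i];
	}

	if (count > 0)
		st->prev_avg_nits = sum / count;
}

That keeps the cost single-pass per frame, at the price of the
effective peak lagging one frame behind the content.
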
I imagine that is going to be fun considering one needs to composite
different types of input images together, and the final tone mapping
might need to differ for each. Strictly speaking, that might lead to an
iterative optimisation algorithm, which would be quite intractable in
practice to complete within the time budget of a single frame.
Thanks,
pq