[RFC PATCH 0/3] A drm_plane API to support HDR planes

Shashank Sharma shashank.sharma at amd.com
Wed Apr 28 07:54:27 UTC 2021


Hello Harry,

Many of us on this thread have discussed before what the right way is to blend and tone map an SDR and an HDR buffer from the same or different color spaces, and what kind of DRM plane properties will be needed for it.

As you can see from the previous comments, the majority of the decision making will happen in the compositor, as it is the only software unit that has the complete picture.

Reference: (https://lists.freedesktop.org/archives/wayland-devel/2019-January/039808.html )

A systematic approach to forming such a blending policy would look like this:


- Compositor needs to understand the following values for each buffer:

    - Color space or Gamut: BT2020/SRGB/DCI-P3/BT709/BT601 etc

    - Color format (RGB/YCBCR) and subsampling (444/422/420)

    - Tone (SDR/HDR_A/HDR_B)


- Then the compositor needs to understand the capabilities of the output display, as these act as clamping values:

    - Output Gamut support (BT2020/SRGB/DCIP3)

    - Output max luminance of the monitor in nits (even in the case of HDR content on an HDR display)

  

Based on all of this information, the compositor needs to set a blending target, which contains the following:

    - Output Colorspace of the blended output: say BT2020

    - Output Luminance of the blended output: Match content, if monitor can support it

    - Output color format of the blended output: say YCbCr 4:2:0


Let's assume the compositor prepares a blending policy with the following output:

    - Output Luminance: HDR 500 Nits

    - Output color space: BT2020

    - Output color format: RGB888

    - Output curve: ST2084

  

Given these details, a compositor will look for DRM color properties like these:

1. Degamma plane property: to make buffers linear for gamut mapping

2. Gamut mapping plane property: to gamut map an SRGB buffer to the BT2020 colorspace

3. Color space conversion plane property: to convert from YCbCr to RGB

4. Tone mapping plane property: to tone map SDR buffers (S2H) and HDR buffers (H2H)

5. Gamma plane/CRTC property: to re-apply the output ST2084 curve
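
For illustration only, the per-plane state a compositor would end up programming with such properties might look like the sketch below. All names here are hypothetical placeholders, not actual DRM property or enum names:

```c
#include <assert.h>

/* Hypothetical per-plane color state; illustrative names only,
 * not actual DRM properties. */
enum colorspace  { CS_BT709, CS_BT2020, CS_DCIP3 };
enum transfer_fn { TF_GAMMA22, TF_SRGB, TF_ST2084, TF_LINEAR };

struct plane_color_state {
	enum transfer_fn degamma;       /* 1. linearize the input buffer  */
	enum colorspace  gamut_src;     /* 2. gamut map: source -> target */
	enum colorspace  gamut_dst;
	int              ycbcr_to_rgb;  /* 3. color model conversion      */
	int              tone_src_nits; /* 4. tone map: source -> target  */
	int              tone_dst_nits;
	enum transfer_fn gamma;         /* 5. re-apply the output curve   */
};

/* Whether the plane needs an S2H/H2H tone mapping step at all. */
static int needs_tone_map(const struct plane_color_state *s)
{
	return s->tone_src_nits != s->tone_dst_nits;
}
```

For the 200-nit BT709 SDR plane in the example policy above, this state would be degamma 2.2, gamut map 709 to 2020, tone map 200 to 500 nits, and re-gamma ST2084.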


We will also need connector/CRTC properties to set AVI info-frames accordingly.

A high level block diagram for blending on a generic HW should look like this:

/*
 *  SDR 200Nits┌────────────────┐ SDR 200 Nits  ┌────────────────┐ SDR 200 ┌──────────────────┐HDR 500┌────────────────┐ HDR 500
 *   BT709     │                │ BT709         │                │ BT2020  │                  │BT2020 │                │ BT2020
 *   ────────► │   Degamma      ├─────────────► │ Gamut Mapping  ├────────►│  Tone mapping    ├──────►│  Gamma         │
 *  RGB888     │     2.2        │ RGB888        │  709->2020     │ RGB888  │    S2H           │RGB888 │  ST2084        │ RGB888
 *  Non Linear │                │ Linear        │                │ Linear  │   200->500       │Linear │                │ ST2084
 *             └────────────────┘               └────────────────┘         └──────────────────┘       └────────────────┘
 *
 *
 *
 *
 *
 *
 *
 *
 *             ┌─────────────────┐             ┌─────────────────┐           ┌─────────────────┐       ┌────────────────┐
 * HDR 600 Nits│                 │HDR 600 Nits │                 │HDR600     │                 │HDR500 │                │ HDR500
 *   ────────► │  Degamma        ├────────────►│  Color space    ├──────────►│  Tone mapping   ├──────►│  Gamma         │
 * BT2020      │  EOTF ST2084    │ BT2020      │  conversion     │BT2020     │   H2H           │BT2020 │  ST2084        │ BT2020
 * YCBCR420    │                 │ YCBCR420    │ YCbCr->RGB      │RGB888     │   600->500      │RGB888 │                │ RGB888
 * Non Linear  └─────────────────┘ Linear      └─────────────────┘Linear     └─────────────────┘Linear └────────────────┘ ST2084
 */


Hope this helps to refine the series.


Regards

Shashank

On 27/04/21 20:20, Pekka Paalanen wrote:
> On Mon, 26 Apr 2021 13:38:49 -0400
> Harry Wentland <harry.wentland at amd.com> wrote:
>
>> ## Introduction
>>
>> We are looking to enable HDR support for a couple of single-plane and
>> multi-plane scenarios. To do this effectively we recommend new
>> interfaces to drm_plane. Below I'll give a bit of background on HDR
>> and why we propose these interfaces.
>>
>>
>> ## Defining a pixel's luminance
>>
>> Currently the luminance space of pixels in a framebuffer/plane
>> presented to the display is not well defined. It's usually assumed to
>> be in a 2.2 or 2.4 gamma space and has no mapping to an absolute
>> luminance value but is interpreted in relative terms.
>>
>> Luminance can be measured and described in absolute terms as candela
>> per meter squared (cd/m2), also called nits. Even though a pixel value
>> can be mapped to luminance in a linear fashion, doing so without losing
>> a lot of detail requires 16-bpc color depth. The reason for this is
>> that human perception can distinguish luminance deltas of roughly
>> 0.5-1%. A linear representation is suboptimal, wasting
>> precision in the highlights and losing precision in the shadows.
>>
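
A quick aside from my side: this 16-bpc claim is easy to sanity check. Comparing the relative step between adjacent codes of a linear encoding against the ~0.5-1% perception threshold mentioned above shows that 8-bit linear falls apart in the shadows while 16-bit holds up:

```c
#include <assert.h>

/* Relative step size between adjacent codes of a linear n-bit
 * encoding, evaluated at a given fraction of full scale. */
static double linear_rel_step(int bits, double frac_of_fullscale)
{
	double codes = (double)((1u << bits) - 1u);

	return (1.0 / codes) / frac_of_fullscale;
}
```

At 1% of full scale, 8-bit linear steps by about 39% per code (gross banding), while 16-bit steps by about 0.15%, below the perception threshold.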
>> A gamma curve is a decent approximation to a human's perception of
>> luminance, but the PQ (perceptual quantizer) function [1] improves on
>> it. It also defines the luminance values in absolute terms, with the
>> highest value being 10,000 nits and the lowest 0.0005 nits.
>>
>> Using content that's defined in PQ space we can approximate the
>> real world in a much better way.
>>
>> Here are some examples of real-life objects and their approximate
>> luminance values:
>>
>> | Object            | Luminance in nits |
>> | ----------------- | ----------------- |
>> | Sun               | 1.6 million       |
>> | Fluorescent light | 10,000            |
>> | Highlights        | 1,000 - sunlight  |
>> | White Objects     | 250 - 1,000       |
>> | Typical objects   | 1 - 250           |
>> | Shadows           | 0.01 - 1          |
>> | Ultra Blacks      | 0 - 0.0005        |
>>
>>
>> ## Describing the luminance space
>>
>> **We propose a new drm_plane property to describe the Electro-Optical
>> Transfer Function (EOTF) with which its framebuffer was composed.**
>> Examples of EOTF are:
>>
>> | EOTF      | Description                                                               |
>> | --------- |:------------------------------------------------------------------------- |
>> | Gamma 2.2 | a simple 2.2 gamma                                                        |
>> | sRGB      | 2.4 gamma with small initial linear section                               |
>> | PQ 2084   | SMPTE ST 2084; used for HDR video and allows for up to 10,000 nit support |
>> | Linear    | Linear relationship between pixel value and luminance value               |
>>
> The definitions agree with what I have learnt so far. However, with
> these EOTF definitions, only PQ defines absolute luminance values
> while the others do not. So this is not enough information to blend
> planes together if they do not all use the same EOTF with the same
> dynamic range. More below.
>
>
>> ## Mastering Luminances
>>
>> Now we are able to use the PQ 2084 EOTF to define the luminance of
>> pixels in absolute terms. Unfortunately we're again presented with
>> physical limitations of the display technologies on the market today.
>> Here are a few examples of luminance ranges of displays.
>>
>> | Display                  | Luminance range in nits |
>> | ------------------------ | ----------------------- |
>> | Typical PC display       | 0.3 - 200               |
>> | Excellent LCD HDTV       | 0.3 - 400               |
>> | HDR LCD w/ local dimming | 0.05 - 1,500            |
>>
>> Since no display can currently show the full 0.0005 to 10,000 nits
>> luminance range, the display will need to tone map the HDR content,
>> i.e. fit the content within the display's capabilities. To assist with
>> tone mapping, HDR content is usually accompanied by metadata that
>> describes (among other things) the minimum and maximum mastering
>> luminance, i.e. the maximum and minimum luminance of the display that
>> was used to master the HDR content.
>>
>> The HDR metadata is currently defined on the drm_connector via the
>> hdr_output_metadata blob property.
>>
>> It might be useful to define per-plane hdr metadata, as different
>> planes might have been mastered differently.
> I don't think this would directly help with the dynamic range blending
> problem. You still need to establish the mapping between the optical
> values from two different EOTFs and dynamic ranges. Or can you know
> which optical values match the mastering display maximum and minimum
> luminances for not-PQ?
>
>
>> ## SDR Luminance
>>
>> Since SDR covers a smaller luminance range than HDR, an SDR plane
>> might look dark when blended with HDR content. Since the max HDR
>> luminance can be quite variable (200-1,500 nits on actual displays)
>> it is best to make the SDR maximum luminance value configurable.
>>
>> **We propose a drm_plane property to specify the desired maximum
>> luminance of the SDR plane in nits.** This allows us to map the SDR
>> content predictably into HDR's absolute luminance space.
> What would be the mapping? Simple linear scaling? A more complicated
> tone mapping?
>
> Rather than "SDR luminance", do you perhaps intend this to configure
> the dynamic range of the non-absolute-luminance EOTFs?
> In that case maybe you'd need a black luminance level too?
>
>
>> ## Let There Be Color
>>
>> So far we've only talked about luminance, ignoring colors altogether.
>> Just like in the luminance space, traditionally the color space of
>> display outputs has not been well defined. Similar to how an EOTF
>> defines a mapping of pixel data to an absolute luminance value, the
>> color space maps color information for each pixel onto the CIE 1931
>> chromaticity space. This can be thought of as a mapping to an
>> absolute, real-life, color value.
>>
>> A color space is defined by its primaries and white point. The
>> primaries and white point are expressed as coordinates in the CIE
>> 1931 color space. Think of the red primary as the reddest red that
>> can be displayed within the color space. Same for green and blue.
>>
>> Examples of color spaces are:
>>
>> | Color Space | Description                                |
>> | ----------- | ------------------------------------------ |
>> | BT 601      | similar to BT 709                          |
>> | BT 709      | used by sRGB content; ~53% of BT 2020      |
>> | DCI-P3      | used by most HDR displays; ~72% of BT 2020 |
>> | BT 2020     | standard for most HDR content              |
>>
>> The color space is defined in DRM for YCbCr planes via the
>> color_encoding property of the drm_plane. 
> I don't think that is quite right.
>
> As far I understand, COLOR_ENCODING property controls which matrix is
> used to convert from YCbCr to RGB, but that is all it does. It is not
> used for the actual color space. So while these BT standards do
> specify the chromaticities, they also specify the YCbCr encoding which
> is the part used in this property.
>
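
Concretely, for full-range BT.709 the matrix selected by COLOR_ENCODING amounts to no more than this (coefficients derived from Kr = 0.2126, Kb = 0.0722; it changes the color model, not the chromaticities):

```c
/* Full-range BT.709 YCbCr -> RGB, derived from Kr=0.2126, Kb=0.0722.
 * This is a color model change only; the primaries are untouched. */
static void ycbcr709_to_rgb(const double ycbcr[3], double rgb[3])
{
	double y  = ycbcr[0];
	double cb = ycbcr[1] - 0.5;
	double cr = ycbcr[2] - 0.5;

	rgb[0] = y + 1.5748 * cr;
	rgb[1] = y - 0.1873 * cb - 0.4681 * cr;
	rgb[2] = y + 1.8556 * cb;
}
```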
> YCbCr and RGB are color models. They are not color spaces. RGB is an
> additive color model while YCbCr is not, AFAIU. Blending requires an
> additive color model and linear luminance encoding.
>
> You need two color space definitions to create one color space
> transformation: source color space and destination color space. You
> also need an idea *how* the two color spaces should be mapped, which is
> called "rendering intent". You can't do anything with just one color
> space definition, except to pass it on along with the pixels.
>
> To be able to blend planes together, all planes need to be converted to
> the same color space first: the blending color space, whatever you
> choose it to be. I do not see where KMS would do this color space
> conversion, or where it would get the definition of the blending color
> space.
>
>> **We propose to add definitions for the RGB variants of the BT color
>> spaces.**
> Therefore I'm not sure this makes sense.
>
>
>> ## Color Primaries and White Point
>>
>> Just like displays can currently not represent the entire 0.0005 -
>> 10,000 nits HDR range of the PQ 2084 EOTF, they are currently not
>> capable of representing the entire BT.2020 color gamut. For this
>> reason video content will often specify the color primaries and white
>> point used to master the video, in order to allow displays to be able
>> to map the image as best as possible onto the display's gamut.
>>
>>
>> ## Displays and Tonemapping
>>
>> External displays are able to do their own tone and color mapping,
>> based on the mastering luminance, color primaries, and white point
>> defined in the HDR metadata.
>>
>> Internal panels (which are currently few and far between) usually
>> don't include the complex HW to do tone and color mapping on their
>> own and will require the display driver to perform appropriate
>> mapping.
> FWIW, when designing Weston's color management, we are aiming for
> the latter "simple" panels foremost, because that gives us full control
> of all color conversions and tone mappings.
>
> OTOH, if Weston has to present to a display which only accepts e.g.
> BT.2020/PQ signal, the display might always mangle the image in
> unexpected ways. Therefore I expect that by default Weston will do
> everything it can to try to make the display not apply any magical
> image enhancement: trust that the EDID description of the display gamut
> dynamic range are correct, and use HDR metadata to tell the display
> that those values are exactly what we are using. And we use them.
>
> IMO, a display doing its tone mapping magically is only useful when you
> want to be able to use "simple" playback devices that cannot adapt to
> the display they are driving. Magic tone mapping is also a way for
> hardware vendors to differentiate, which from the color management
> perspective is harmful as it makes it more difficult or impossible to
> predict the display behaviour or to keep it consistent.
>
> So there are two opposing goals:
>
> - Traditional color management wants absolute control of the display,
>   leaving nothing unpredictable and preferably also nothing undefined.
>   Undefined behaviour can always be measured (profiled) which makes it
>   predictable and known. The viewing environment is controlled and
>   constant.
>
> - Entertainment wants the most visually impressive image quality by
>   dynamically adapting to both displayed content and to the viewing
>   environment conditions.
>
>> ## Pixel Formats
>>
>> The pixel formats, such as ARGB8888, ARGB2101010, P010, or FP16 are
>> unrelated to color space and EOTF definitions. HDR pixels can be
>> formatted in different ways but in order to not lose precision HDR
>> content requires at least 10 bpc precision. For this reason
>> ARGB2101010, P010, and FP16 are the obvious candidates for HDR.
>> ARGB2101010 and P010 have the advantage of requiring only half the
>> bandwidth as FP16, while FP16 has the advantage of enough precision
>> to operate in a linear space, i.e. without EOTF.
> Right.
>
>> ## Proposed use-cases
>>
>> Although the userspace side of this work is still in the early stages
>> it is clear that we will want to support the following two use-cases:
>>
>> **One XRGB2101010 HDR Plane:** A single, composited plane of HDR
>> content. The use-case is a video player on a desktop with the
>> compositor owning the composition of SDR and HDR content. The content
>> shall be PQ BT.2020 formatted. The drm_connector's
>> hdr_output_metadata shall be set.
> This use case is already possible, right?
>
>> **One ARGB8888 SDR Plane + One P010 HDR Plane:** A normal 8bpc
>> desktop plane, with a P010 HDR video plane underlayed. The HDR plane
>> shall be PQ BT.2020 formatted. The desktop plane shall specify an SDR
>> boost value. The drm_connector's hdr_output_metadata shall be set.
> This use case requires blending in KMS, so is the primary goal I
> suppose.
>
>> **One XRGB8888 SDR Plane - HDR output:** In order to support a smooth
>> transition we recommend an OS that supports HDR output to provide the
>> hdr_output_metadata on the drm_connector to configure the output for
>> HDR, even when the content is only SDR. This will allow for a smooth
>> transition between SDR-only and HDR content. In this use-case the SDR
>> max luminance value should be provided on the drm_plane.
> I think this might be already possible by crafting a CRTC GAMMA LUT? Not
> sure about precision.
>
>> In DCN we will de-PQ or de-Gamma all input in order to blend in
>> linear space. For SDR content we will also apply any desired boost
>> before blending. After blending we will then re-apply the PQ EOTF and
>> do RGB to YCbCr conversion if needed.
> This assumes the same color space over everything.
>
>> ## Summary of proposed interface changes
>>
>> per drm_plane:
>> - new RGB color space definitions, mirroring the existing YUV color
>> space definitions
>> - new transfer function property
>> - new SDR maximum white level property
> How will these new KMS properties interact with per-plane DEGAMMA, CTM
> and/or GAMMA properties?
>
> Why go with your proposal instead of per-plane CTM and LUT?
>
> I think the ideal KMS pipeline for me, assuming I cannot have 3D LUTs
> both per-plane and on CRTC, would be:
>
> plane:
> 	FB -> M1 -> LUT1 -> M2 -> blending input
>
> CRTC:
> 	blending output -> LUT2 -> M3 -> connector
>
> FB: framebuffer
> M1: matrix transform, capable of converting e.g. YCbCr to RGB
> LUT1: 1D LUT for content EOTF, to produce light-linear RGB
> M2: matrix transform for color space transformation
>
> LUT2: 1D LUT for applying monitor EOTF^-1
> M3: matrix transform, e.g. if you need to push YCbCr on the connector
>
> We also need to know where and how clipping happens.
>
> I think this scheme would allow implementing everything you want, and
> it would not be tied to rigid enumerations, and it won't have any
> magical conversions done under the hood as you would need to do to
> convert from one enum space to another. It leaves the render intent to
> be defined by the userspace compositor, rather than building a fixed
> policy in the kernel.
>
> Userspace would be setting transformation operators, not color spaces,
> to the kernel, allowing the blending space to be chosen by userspace.
> In Weston we aim to choose the blending color space to be the same as
> the output color space, except in optical (linear) encoding. The output
> color space can be non-standard, e.g. measured with a display profiler
> equipment.
>
> I would expect gamut mapping, dynamic range mapping and tone mapping to
> be places where most experimentation and innovation happens, so
> implementing them in the kernel with just few or no controllable
> parameters at this time seems like it could become useless fast.
>
>
> Thanks,
> pq
>
>> ## References
>>
>> [1]
>> https://en.wikipedia.org/wiki/High-dynamic-range_video#Perceptual_Quantizer
>>
>>
>> ## Further Reading
>>
>> https://gitlab.freedesktop.org/swick/wayland-protocols/-/blob/color/unstable/color-management/color.rst
>> http://downloads.bbc.co.uk/rd/pubs/whp/whp-pdf-files/WHP309.pdf
>> https://app.spectracal.com/Documents/White%20Papers/HDR_Demystified.pdf
>>
>>
>> Bhawanpreet Lakha (3):
>>   drm/color: Add RGB Color encodings
>>   drm/color: Add Color transfer functions for HDR/SDR
>>   drm/color: Add sdr boost property
>>
>>  .../gpu/drm/amd/display/amdgpu_dm/amdgpu_dm.c |  4 +-
>>  .../gpu/drm/arm/display/komeda/komeda_plane.c |  4 +-
>>  drivers/gpu/drm/arm/malidp_planes.c           |  4 +-
>>  drivers/gpu/drm/armada/armada_overlay.c       |  4 +-
>>  drivers/gpu/drm/drm_atomic_uapi.c             |  8 ++
>>  drivers/gpu/drm/drm_color_mgmt.c              | 84 +++++++++++++++++--
>>  drivers/gpu/drm/i915/display/intel_sprite.c   |  4 +-
>>  .../drm/i915/display/skl_universal_plane.c    |  4 +-
>>  drivers/gpu/drm/nouveau/dispnv04/overlay.c    |  4 +-
>>  drivers/gpu/drm/omapdrm/omap_plane.c          |  4 +-
>>  drivers/gpu/drm/sun4i/sun8i_vi_layer.c        |  4 +-
>>  drivers/gpu/drm/tidss/tidss_plane.c           |  6 +-
>>  include/drm/drm_color_mgmt.h                  | 25 +++++-
>>  include/drm/drm_plane.h                       | 30 +++++++
>>  14 files changed, 173 insertions(+), 16 deletions(-)
>>

