[RFC PATCH 0/3] A drm_plane API to support HDR planes

Harry Wentland harry.wentland at amd.com
Fri May 14 21:01:33 UTC 2021



On 2021-04-30 6:39 a.m., Shashank Sharma wrote:
> Hello Pekka,
> 
> On 30/04/21 15:13, Pekka Paalanen wrote:
>> On Wed, 28 Apr 2021 13:24:27 +0530
>> Shashank Sharma <shashank.sharma at amd.com> wrote:
>>
>>> Assuming these details, A compositor will look for DRM color properties like these:
>>>
>>> 1. Degamma plane property: to make buffers linear for gamut mapping
>>>
>>> 2. Gamut mapping plane property: to gamut-map an sRGB buffer to the BT2020 color space
>>>
>>> 3. Color space conversion plane property: to convert from YCBCR to RGB
>>>
>>> 4. Tone mapping plane property: to tone-map an SDR buffer (S2H) or an HDR buffer (H2H)
>>>
>>> 5. Gamma plane/CRTC property: to re-apply the output ST2084 curve
>>>
>>>
>> ...
>>
>>>  *
>>>  *
>>>  *
>>>  *             ┌─────────────────┐             ┌─────────────────┐           ┌─────────────────┐       ┌────────────────┐
>>>  * HDR 600 Nits│                 │HDR 600 Nits │                 │HDR600     │                 │HDR500 │                │ HDR500
>>>  *   ────────► │  Degamma        ├────────────►│  Color space    ├──────────►│  Tone mapping   ├──────►│  Gamma         │
>>>  * BT2020      │  OETF ST2084    │ BT2020      │  conversion     │BT2020     │   H2H           │BT2020 │  ST2084        │ BT2020
>>>  * YCBCR420    │                 │ YCBCR420    │ YCBCR->RGB      │RGB888     │   600->500      │RGB888 │                │ RGB888
>>>  * Non Linear  └─────────────────┘ Linear      └─────────────────┘Linear     └─────────────────┘Linear └────────────────┘ ST2084
>>>  */
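
For illustration, here is a rough sketch (not merged UAPI) of how a compositor
could program per-plane properties along these lines through the libdrm atomic
API. The property names ("PLANE_DEGAMMA_LUT", "PLANE_CTM", "PLANE_TONE_MAP",
"PLANE_GAMMA_LUT") and the helpers are hypothetical placeholders for whatever
names the proposal ends up with; only the lookup-by-name plus atomic-commit
pattern is the point:

#include <stdint.h>
#include <string.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* Find a plane property id by name; returns 0 if the driver doesn't expose it. */
static uint32_t plane_prop_id(int fd, uint32_t plane_id, const char *name)
{
	drmModeObjectProperties *props =
		drmModeObjectGetProperties(fd, plane_id, DRM_MODE_OBJECT_PLANE);
	uint32_t id = 0;

	for (uint32_t i = 0; props && i < props->count_props; i++) {
		drmModePropertyRes *prop = drmModeGetProperty(fd, props->props[i]);

		if (prop && !strcmp(prop->name, name))
			id = prop->prop_id;
		drmModeFreeProperty(prop);
	}
	drmModeFreeObjectProperties(props);
	return id;
}

/* Stage the hypothetical per-plane color properties and test-commit them.
 * Blob ids would come from drmModeCreatePropertyBlob() of the LUT/matrix
 * data; error handling and zero property ids are ignored for brevity.
 */
static int set_plane_hdr_props(int fd, uint32_t plane_id,
			       uint64_t degamma_blob, uint64_t ctm_blob,
			       uint64_t tonemap_blob, uint64_t gamma_blob)
{
	drmModeAtomicReq *req = drmModeAtomicAlloc();
	int ret;

	drmModeAtomicAddProperty(req, plane_id,
				 plane_prop_id(fd, plane_id, "PLANE_DEGAMMA_LUT"),
				 degamma_blob);
	drmModeAtomicAddProperty(req, plane_id,
				 plane_prop_id(fd, plane_id, "PLANE_CTM"),
				 ctm_blob);
	drmModeAtomicAddProperty(req, plane_id,
				 plane_prop_id(fd, plane_id, "PLANE_TONE_MAP"),
				 tonemap_blob);
	drmModeAtomicAddProperty(req, plane_id,
				 plane_prop_id(fd, plane_id, "PLANE_GAMMA_LUT"),
				 gamma_blob);

	ret = drmModeAtomicCommit(fd, req, DRM_MODE_ATOMIC_TEST_ONLY, NULL);
	drmModeAtomicFree(req);
	return ret;
}
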
>> Hi Shashank,
>>
>> I think you might have degamma and color model conversion reversed, or
>> is that a new thing in the HDR specs?
>>
>> Usually the YCbCr/RGB conversion matrix applies to non-linear values
>> AFAIU.
> Ah, that was due to the gamut mapping block. You are right, color format conversion can happen on non-linear data (which doesn't mean it can't happen on linear data), but the sequence above also contains gamut mapping (color space conversion), which needs to be done in linear space. I was a bit too lazy to create separate blocks, so I just replaced the block titles :D.
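
To make that ordering concrete: the Y'CbCr -> R'G'B' matrix is defined on the
non-linear (primed) signal, and only after degamma do you have linear RGB that
a gamut-mapping matrix can act on. A minimal sketch with the standard BT.2020
non-constant-luminance coefficients (Kr = 0.2627, Kb = 0.0593); the function
name is just for illustration:

/* Full-range, normalized Y'CbCr -> R'G'B'. Note the primes: this matrix
 * operates on the non-linear signal, before any degamma/linearization.
 */
static void bt2020_ycbcr_to_rgb_nonlinear(float y, float cb, float cr,
					  float *r, float *g, float *b)
{
	const float kr = 0.2627f, kb = 0.0593f, kg = 1.0f - kr - kb;

	*r = y + 2.0f * (1.0f - kr) * cr;	/* Y' + 1.4746 * Cr */
	*b = y + 2.0f * (1.0f - kb) * cb;	/* Y' + 1.8814 * Cb */
	*g = (y - kr * *r - kb * *b) / kg;	/* from Y' = Kr R' + Kg G' + Kb B' */
}
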
>> There is also confusion with OETF vs. EOTF. I got that initially wrong
>> too. OETF is not just a name for inverse-EOTF but it is used in a
>> different context. Though here it seems to be just a typo.
>>
>> OETF is inherent to a camera when it converts light into
>> electrical signals. EOTF is inherent to a monitor when it converts
>> electrical signals to light. Depending on what the electrical signals
>> have been defined to be in each step of a broadcasting chain, you might
>> need OETF or EOTF or their inverse or a different OETF or EOTF or their
>> inverse.
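
And for reference, since the pipeline above is built around ST 2084: the PQ
EOTF (electrical signal to light) can be written as below. The degamma step
that linearizes a PQ-encoded buffer applies this curve, and the final gamma
step re-applies its inverse. A sketch only, not a proposed UAPI:

#include <math.h>

/* SMPTE ST 2084 (PQ) EOTF: non-linear signal e in [0, 1] -> luminance in cd/m^2. */
static double pq_eotf(double e)
{
	const double m1 = 2610.0 / 16384.0;		/* 0.1593017578125 */
	const double m2 = 2523.0 / 4096.0 * 128.0;	/* 78.84375 */
	const double c1 = 3424.0 / 4096.0;		/* 0.8359375 */
	const double c2 = 2413.0 / 4096.0 * 32.0;	/* 18.8515625 */
	const double c3 = 2392.0 / 4096.0 * 32.0;	/* 18.6875 */
	double p = pow(e, 1.0 / m2);

	return 10000.0 * pow(fmax(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1);
}
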
> 
> Yes, that was a typo. The intention was to call it the inverse curve for HDR-encoded buffers. It's been almost 4 years (and 2 companies) since I last did HDR, so I am a bit rusty on the topic ;).
> 
> - Shashank
> 

Thanks, Ville and Shashank. This is indeed helpful. I apologize for the late
response, but I needed to take some time to do more reading and to internalize
some of the HDR and CM concepts. I will send out a v2 of my patchset, but I
realize it is only a small step toward the right KMS interface for HDR and CM.

Harry

>>
>> As we are talking about displays and likely assuming display-referred
>> content (not scene-referred content), we probably have no use for OETF,
>> but we could have several different EOTFs.
>>
>>
>> Thanks,
>> pq


