HDR support in Wayland/Weston
Arnaud Vrac
rawoul at gmail.com
Thu Jan 17 11:23:51 UTC 2019
On Thu, Jan 17, 2019 at 4:26 AM Sharma, Shashank
<shashank.sharma at intel.com> wrote:
>
> Hello Arnaud
>
> Thanks for your comments, mine inline.
>
> Regards
> Shashank
> On 1/17/2019 6:38 AM, Arnaud Vrac wrote:
> > On Thu, Jan 10, 2019 at 4:02 PM Sharma, Shashank
> > <shashank.sharma at intel.com> wrote:
> >> Hello All,
> >>
> >> This mail is to propose a design for enabling HDR support in the Wayland/Weston stack, using display engine capabilities, and to get more feedback and input from the community.
> >> Here are a few points (you might already know these) about HDR framebuffers, videos and displays:
> >> - HDR content/buffers are composed in the REC 2020 colorspace, with a bit depth of 10/12/16 BPC. Some of the popular formats are P010, P012 and P016.
> >> - HDR content comes with its own metadata, which must be applied to get the right luminance at the display device.
> >> - The metadata can be of two types: 1. static 2. dynamic. For simplicity, this solution focuses on static HDR only (the HDR10 standard).
> >> - HDR content also provides its EOTF (electro-optical transfer function) information, which is a curve (like the sRGB gamma curve). One popular EOTF is PQ (ST 2084).
> >> - HDR-capable displays advertise their supported EOTFs and HDR metadata support in EDID CEA-861-G blocks.
> >> - Normal sRGB buffers are composed in the sRGB color space, following the REC 709 specification.
> >> - For accurate blending in display engines, we need to ensure the following:
> >> - All the buffers are in the same colorspace (REC 709 or REC 2020)
> >> - All the buffers are linear (gamma/EOTF removed)
> >> - All the buffers are tone mapped to the same range (HDR or SDR)
> >>
> >> Please refer to the block diagram below, which presents a simple case of HDR P010 movie playback, with HDR buffers as video buffers and SDR buffers as subtitles. The subsystem looks and works like this:
> >> - A client decodes the content (using FFmpeg, for example) and gets two buffers, one with video (HDR) and one with subtitles (SDR)
> >> - The client passes the following information to the compositor:
> >> - The actual buffers
> >> - Their colorspace information: BT 2020 for the HDR buffer, REC 709 for the SDR buffer (we plan to add a new protocol extension for this)
> >> - The HDR metadata of the content (we plan to add a new protocol for this)
> >>
> >> - Compositor's actions:
> >> - Reads the end display's HDR capabilities from the display EDID. Assume it's an HDR HDMI monitor.
> >> - The compositor tone-maps every view's framebuffer to match the tone of the end display, applying a libVA filter. In this example:
> >> - The SDR subtitle frame goes through SDR-to-HDR tone mapping (called S2H)
> >> - The HDR video frame goes through HDR-to-HDR tone mapping (called H2H) if the HDR capabilities of the monitor and the content differ.
> >> - Now both the buffers and the monitor are in the same tone-mapped range.
> >> - As the end display is HDR capable, and one of the content frames is HDR, the compositor prepares all other planes for color space conversion (CSC) from REC 709 to REC 2020 using the plane CSC property.
> >> - As the CSC and blending should be done in linear space, the compositor also uses plane-level degamma to make the buffers linear.
> >> - These actions make sure that, during blending:
> >> - All the buffers are in the same colorspace (REC 2020)
> >> - All the buffers are linear
> >> - All the buffers are tone mapped (HDR)
> >> - The plane-level color properties patch for DRM can be found here: https://patchwork.freedesktop.org/series/30875/
> >> - Now, in order to re-apply the HDR curve, the compositor applies CRTC-level gamma, so that the output buffer is non-linear again.
> >> - To pass the output HDR information to the kernel, so that it can create and send AVI info-frames to HDMI, the compositor sets the connector HDR metadata property.
> >> - Code for the same can be found here: https://patchwork.freedesktop.org/series/25091/
> >> - And they all live happily ever after :).
> >>
> >> Please provide input, feedback and suggestions for this design and plan, so that we can improve our half-cooked solution and start sending the patches.
> >>
> >> +------------------+ +-------------------+
> >> | SDR Buffer       | | HDR Buffer        |
> >> | subtitles        | | video             |
> >> | (REC 709 colorsp)| | (REC 2020 colorsp)|
> >> +-------+----------+ +-------+-----------+
> >> | |
> >> | |
> >> | |
> >> +------v---------------------------v------------+ +--------------+
> >> | Compositor: v | | LibVA |
> >> | - assigns views to overlays +---------> Tone mapping |
> >> | - prepare plane/CRTC color properties <---------+ SDR to HDR |
> >> | for linear blending in display | | HDR to SDR |
> >> +------+-----------------------------+----------+ +--------------+
> >> | |
> >> | Tone mapped | Tone mapped
> >> | non-linear-Rec709 | non-linear Rec2020
> >> +------v------+ +-------v--------+
> >> |SRGB Degamma | |EOTF as degamma |
> >> |(Plane) | |(Plane) |
> >> | | | |
> >> +------+------+ +-------+--------+
> >> Tone mapped linear Rec 709 | |
> >> +------v------+ | Tone mapped
> >> | CSC/CTM | | non-linear Rec2020
> >> | REC709->2020| |
> >> | | |
> >> +------+------+ |
> >> | Tone mapped linear |
> >> | Rec 2020 |
> >> +------v-----------------------------v---------+
> >> | Blender |
> >> | |
> >> +--------------------+-------------------------+
> >> | Tone mapped linear Rec2020
> >> +--------------------v-------------------------+ Tone mapped
> >> | OETF(CRTC Gamma, post blending) | non-linear Rec2020 +------------------+
> >> | +----------------> | HDMI monitor |
> >> +----------------------------------------------+ +------------------+
> >>
> > Hi Shashank,
> >
> > Just in case you have missed it, Ville Syrjälä already provided proof
> > of concept support for HDR in weston a while ago [1].
Yes, we are well aware of the solution Ville proposed, as we work in
the same group at Intel :-). We extended Ville's solution with the
addition of tone mapping, and introduced changes like using the
display engine for blending, etc.
Nice! FYI, on the product I'm working on [0], we do have HDR support
using weston. Unfortunately we're not using the DRM backend, so seeing
this kind of effort will certainly make switching to the upstream DRM
backend much easier. There are a lot of constraints, though, to be able
to tone-map multiple 4K buffers in real time, which would be hard to
translate into a generic solution.
For example, when playing HDR content we make the assumption that
video surfaces are always on the bottom-most layer, and if they are
not, we can also assume the surfaces are opaque, which allows us to
pierce a hole in the scenegraph and still put them on the bottom-most
layer. The rest of the buffers are composited using gl-renderer, and
the resulting framebuffer is tone-mapped to HDR. Unfortunately, the
framebuffer is still in ARGB32 format, as the display hardware only
supports RGBA1010102 for 10-bit formats, and two bits of alpha are
not enough. So we resort to dithering in the end, and the result is
acceptable for UI content. We cannot always render everything on the
GPU, since some video content is secure.
Anyway, I think first implementing a generic solution is preferable,
and hardware-specific limitations can be handled later on.
[0] https://www.iliad.fr/presse/2018/DP_Freebox_Delta_041218_Eng.pdf
> > The proposal is missing many important bits like negotiation of the
> > supported output features with the client, double buffering the new
> > colorspace related surface properties, using more of the hardware
> > capabilities, performance issues, etc...
>
> > Also, the added protocols are
> > probably too simple as far as color management is concerned.
> Agree, there are two reasons for that:
> - This proposal is a very high-level design focusing only on the changes
> required to drive HDR video playback; in the real implementation you
> would see many of those mentioned. I think it's too early to talk
> about performance, as we are still in the design stage.
> - As we have been discussing in parallel threads, HDR is too big a
> feature, and we don't want to add too much code in a single shot and
> create unwanted regressions and maintenance nightmares; rather, the aim
> is to create a small, modular, scalable, easy-to-review-and-test
> feature set, which might target a very specific area, and gradually
> complete the feature.
>
> But I would like to hear more about double buffering of the new
> colorspace-related surface properties. Could you please elaborate?
The colorspace-related properties should be applied atomically when
committing the wl_surface. This is not done in Ville's patches, so
there might be some rendering glitches when changing the colorspace
while the surface is displayed.
> > Still,
> > this proposal allows better supporting video playback and some patches
> > make a lot of sense and could probably be used as inspiration.
> Sure, this might become a bit clearer when we start publishing the patches.
> - Shashank
> > Regards,
> > -Arnaud
> >
> > [1] https://lists.freedesktop.org/archives/wayland-devel/2017-December/036403.html