[ANNOUNCE] Wayland/Weston/Mesa HDR support (proof of concept)
ville.syrjala at linux.intel.com
Thu Dec 21 14:21:35 UTC 2017
Here's a quick proof of concept implementation of HDR support
I'm not posting this as patches right now because I'm not sure
that would do much good given how rough this is. But here are
all the repos/branches:
The kernel HDR bits were partially done by Uma Shankar, the rest
I hacked together myself.
As far as Wayland protocol goes I'm adding three new
extensions (should probably just have one with several requests?):
- zwp_colorspace_v1 - Specify the primaries/whitepoint chromaticities
and transfer function for a surface
- zwp_ycbcr_encoding_v1 - Specify the encoding for YCbCr surfaces
- zwp_hdr_metadata_v1 - Allow the client to pass HDR metadata to the compositor
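For reference, the payload such an extension needs to carry is basically
the CTA-861.3 "Dynamic Range and Mastering" infoframe contents, i.e. the
SMPTE ST 2086 mastering display info plus the content light levels. A
rough C sketch of those fields, purely illustrative; the actual request
arguments and units are whatever the protocol XML in the branch says:

#include <stdint.h>

/* Illustrative only: roughly the static HDR metadata (CTA-861.3) that
 * zwp_hdr_metadata_v1 would carry from client to compositor. */
struct hdr_static_metadata {
        /* mastering display primaries + white point,
         * CIE 1931 xy in 0.00002 units */
        uint16_t display_primaries_x[3];
        uint16_t display_primaries_y[3];
        uint16_t white_point_x;
        uint16_t white_point_y;
        /* mastering luminance: max in 1 cd/m2 units,
         * min in 0.0001 cd/m2 units */
        uint16_t max_display_mastering_luminance;
        uint16_t min_display_mastering_luminance;
        /* content light levels (MaxCLL/MaxFALL) in 1 cd/m2 units */
        uint16_t max_cll;
        uint16_t max_fall;
};
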
Note that I've not given any thought to how the compositor might
advertise its capabilities.
I also hacked in a bunch of 10bpc+ YCbCr support to the protocol and
Weston so that I can actually get some HDR video onto the screen.
On the Mesa side I've crudely implemented the following egl/vk extensions
(sidenote: these egl extensions don't seem to match CTA-861.3 nicely
when it comes to the min/max luminance stuff)
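To make the egl side a bit more concrete, here's roughly what a client
would do, assuming the usual Khronos extensions are what's wired up here
(EGL_EXT_gl_colorspace_bt2020_pq for the PQ colorspace, plus
EGL_EXT_surface_SMPTE2086_metadata and EGL_EXT_surface_CTA861_3_metadata
for the metadata; the exact list in the branch may differ). EGL wants all
the metadata values scaled by EGL_METADATA_SCALING_EXT (50000), whereas
CTA-861.3 carries max luminance in 1 cd/m2 units and min luminance in
0.0001 cd/m2 units, which is presumably the mismatch mentioned above.
Values below are just placeholders:

#include <EGL/egl.h>
#include <EGL/eglext.h>

static EGLSurface
create_hdr_surface(EGLDisplay dpy, EGLConfig cfg, EGLNativeWindowType win)
{
        /* ask for a BT.2020 + ST2084 PQ encoded surface */
        static const EGLint attribs[] = {
                EGL_GL_COLORSPACE_KHR, EGL_GL_COLORSPACE_BT2020_PQ_EXT,
                EGL_NONE
        };
        EGLSurface surf = eglCreateWindowSurface(dpy, cfg, win, attribs);

        /* SMPTE ST 2086 mastering display metadata, all values scaled
         * by EGL_METADATA_SCALING_EXT (primaries omitted for brevity) */
        eglSurfaceAttrib(dpy, surf, EGL_SMPTE2086_WHITE_POINT_X_EXT,
                         (EGLint)(0.3127f * EGL_METADATA_SCALING_EXT));
        eglSurfaceAttrib(dpy, surf, EGL_SMPTE2086_WHITE_POINT_Y_EXT,
                         (EGLint)(0.3290f * EGL_METADATA_SCALING_EXT));
        eglSurfaceAttrib(dpy, surf, EGL_SMPTE2086_MAX_LUMINANCE_EXT,
                         1000 * EGL_METADATA_SCALING_EXT);
        eglSurfaceAttrib(dpy, surf, EGL_SMPTE2086_MIN_LUMINANCE_EXT,
                         (EGLint)(0.005f * EGL_METADATA_SCALING_EXT));

        /* CTA-861.3 content light levels, likewise scaled */
        eglSurfaceAttrib(dpy, surf, EGL_CTA861_3_MAX_CONTENT_LIGHT_LEVEL_EXT,
                         1000 * EGL_METADATA_SCALING_EXT);
        eglSurfaceAttrib(dpy, surf, EGL_CTA861_3_MAX_FRAME_AVERAGE_LEVEL_EXT,
                         400 * EGL_METADATA_SCALING_EXT);

        return surf;
}
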
VK_EXT_hdr_metadata I plugged in for anv only, but the implementation
is in the common wayland wsi code. Note that I haven't actually tested
the vulkan stuff at all because I don't talk Vulkan (at least not yet).
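For the record, the client-facing side of VK_EXT_hdr_metadata is just a
struct of the same CTA-861.3-style fields plus one entry point. Untested
sketch with placeholder values:

#include <vulkan/vulkan.h>

static void
set_hdr_metadata(VkDevice dev, VkSwapchainKHR swapchain)
{
        const VkHdrMetadataEXT metadata = {
                .sType = VK_STRUCTURE_TYPE_HDR_METADATA_EXT,
                .displayPrimaryRed   = { 0.708f, 0.292f },   /* BT.2020 */
                .displayPrimaryGreen = { 0.170f, 0.797f },
                .displayPrimaryBlue  = { 0.131f, 0.046f },
                .whitePoint          = { 0.3127f, 0.3290f }, /* D65 */
                .maxLuminance = 1000.0f,                     /* cd/m2 */
                .minLuminance = 0.005f,                      /* cd/m2 */
                .maxContentLightLevel = 1000.0f,
                .maxFrameAverageLightLevel = 400.0f,
        };

        /* extension entry point, resolved via vkGetDeviceProcAddr() */
        PFN_vkSetHdrMetadataEXT set_hdr = (PFN_vkSetHdrMetadataEXT)
                vkGetDeviceProcAddr(dev, "vkSetHdrMetadataEXT");

        set_hdr(dev, 1, &swapchain, &metadata);
}
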
Also note that I've not connected up the HDR metadata pipeline
properly. The client can provide the metadata, but the compositor
doesn't actually pass it on to the display. For the time being the
HDR metadata that gets passed to the display is partially specified
in weston.ini and partially just hardcoded (see
The Weston implementation involves a bunch of shaders and matrices to
do the ycbcr->rgb conversion, "degamma", and csc for each surface, blend
it all as linear RGB into an fp16 fbo, and finally blit that out to the
final framebuffer while applying the ST2084 PQ transfer function in the
shader.
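For reference, the ST2084 PQ encoding (inverse EOTF) applied in that final
blit is just the standard formula, with linear light normalized so that
1.0 equals 10000 nits; the shader does the same math in GLSL:

#include <math.h>

/* SMPTE ST 2084 (PQ) inverse EOTF:
 * linear [0,1] (1.0 == 10000 cd/m2) -> PQ-encoded signal [0,1] */
static float
pq_inv_eotf(float linear)
{
        const float m1 = 2610.0f / 16384.0f;
        const float m2 = 2523.0f / 4096.0f * 128.0f;
        const float c1 = 3424.0f / 4096.0f;
        const float c2 = 2413.0f / 4096.0f * 32.0f;
        const float c3 = 2392.0f / 4096.0f * 32.0f;
        float l = powf(fmaxf(linear, 0.0f), m1);

        return powf((c1 + c2 * l) / (1.0f + c3 * l), m2);
}
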
The reason for the fp16 fbo is that we store the full 10000 nits of
linear RGB. That needs plenty of precision in the low end, so your
regular 10bpc fb doesn't seem to cut it. And also the display gamma LUT
doesn't have enough input precision for it either. Sadly there's no
fixed function hardware in the GPU to do the ST2084 PQ when blending.
When the output is not HDR I do skip the fp16 fbo step and use the
gamma LUT in the display engine instead.
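In KMS terms the display engine gamma LUT is the CRTC GAMMA_LUT property
(or the legacy gamma ramp). Just to illustrate the mechanism, and not
necessarily how the branch drives it, a rough sketch of loading an sRGB
encode curve into GAMMA_LUT:

#include <stdint.h>
#include <math.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* Fill the CRTC GAMMA_LUT with a linear -> sRGB encode curve. lut_size
 * should come from the GAMMA_LUT_SIZE property; error handling omitted,
 * and a real compositor would do this as part of an atomic commit. */
static int
load_srgb_gamma_lut(int fd, uint32_t crtc_id, uint32_t gamma_lut_prop,
                    unsigned int lut_size)
{
        struct drm_color_lut lut[lut_size];
        uint32_t blob_id;
        unsigned int i;

        for (i = 0; i < lut_size; i++) {
                float in = (float)i / (lut_size - 1);
                float out = in <= 0.0031308f ? in * 12.92f :
                        1.055f * powf(in, 1.0f / 2.4f) - 0.055f;
                uint16_t v = (uint16_t)(out * 0xffff + 0.5f);

                lut[i].red = lut[i].green = lut[i].blue = v;
                lut[i].reserved = 0;
        }

        drmModeCreatePropertyBlob(fd, lut, sizeof(lut), &blob_id);
        return drmModeObjectSetProperty(fd, crtc_id, DRM_MODE_OBJECT_CRTC,
                                        gamma_lut_prop, blob_id);
}
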
Another approach to the precision problem might be to not store the
entire 10000 nits of linear, and just cut off the super bright stuff
(your display can't show it anyway). But I've not really bothered to
figure out how low in nits we'd have to go here, probably too low.
Maybe blending as sRGB and then doing sRGB->PQ with the gamma LUT might
help a little bit?
Ideally we would bypass this all for a single fullscreen HDR surface
and just pass the PQ encoded data directly through. But I've not
implemented that. In fact I even disable the buffer_age damage stuff
when using the fp16 fbo, so we'll recompose the entire screen every
time. Yeah, I'm lazy.
Another thought that occurred to me was that it shouldn't be too hard
to make Weston do some tone mapping when there's an HDR client and no
HDR screen. To that end I included the ACES colorspaces in my
colorspace list, but I didn't actually look into plugging the ACES tone
mapping curve into the shaders. Might be a fun exercise, even though
the practical applications might be close to nil. Probably better to
not advertise HDR/wide gamuts when we can't actually output the stuff,
and instead let the client do its own tone mapping.
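If someone does want to play with that, a cheap starting point would be
one of the analytic approximations of the ACES filmic curve, e.g. the
well-known fit by Krzysztof Narkowicz. This is not what's in the branch,
just the kind of curve that would get plugged into the shader:

/* Narkowicz's analytic approximation of the ACES filmic tone mapping
 * curve: maps exposure-adjusted linear scene light to [0,1]. In the
 * simplest case it's applied per RGB channel. */
static float
aces_approx(float x)
{
        const float a = 2.51f, b = 0.03f, c = 2.43f, d = 0.59f, e = 0.14f;
        float y = (x * (a * x + b)) / (x * (c * x + d) + e);

        return y < 0.0f ? 0.0f : y > 1.0f ? 1.0f : y;
}
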
OK, so what can you do with this? I've included a few test clients:
- Just a copy of simple-egl, but it uses the egl extension to specify
  the colorspace, and produces ST2084 PQ encoded data when asked
- Uses ffmpeg to decode video into shm buffers, and sets the
  colorspace/ycbcr encoding etc. appropriately. I.e. this one can
  actually output HDR video
Here's a weston.ini snippet that gets you outputting HDR:
Hardware-wise you'll need an HDR-capable display obviously, and
you'll need an Intel Geminilake GPU. Older Intel platforms don't
support the HDR infoframe, so the display wouldn't know what to do
with the data you're feeding it.
As for the future, right now I don't really have any solid plans on
continuing to develop this. I might dabble with it a bit more out of
curiosity, but I'm more hoping we can find other people to move this
forward.