HDR support in Wayland/Weston

Pekka Paalanen ppaalanen at gmail.com
Fri Feb 22 16:00:08 UTC 2019


On Mon, 18 Feb 2019 10:44:15 -0700
Chris Murphy <lists at colorremedies.com> wrote:

> On Fri, Feb 1, 2019 at 3:43 AM Pekka Paalanen <ppaalanen at gmail.com> wrote:
> >
> > On Thu, 31 Jan 2019 12:03:25 -0700
> > Chris Murphy <lists at colorremedies.com> wrote:
> >  
> > > I'm pretty sure most every desktop environment and distribution have
> > > settled on colord as the general purpose service.
> > > https://github.com/hughsie/colord
> > > https://www.freedesktop.org/software/colord/  
> >
> > FWIW, Weston already has a small plugin to use colord. The only thing
> > it does to apply anything is to set the simplest form of the gamma
> > ramps.  
> 
> Short version:
> Having just briefly looked at that code, my best guess is that colord
> is probably reading a vcgt tag in the ICC profile for the display, and
> applying it to the video card LUT (or one of them, anyway).
> 
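For concreteness, my rough understanding of that mechanism in code is
below: read the vcgt tag with Little CMS and load it into the legacy
per-CRTC gamma ramp via libdrm. This is only a sketch, not the actual
colord/Weston plugin code; icc_path, drm_fd, crtc_id and lut_size are
placeholders the caller would have to provide, and error handling is
mostly omitted.

    /* Sketch only: read the vcgt tag from an ICC profile with Little
     * CMS and load it into the legacy per-CRTC gamma ramp. */
    #include <stdint.h>
    #include <lcms2.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    static int
    apply_vcgt(const char *icc_path, int drm_fd, uint32_t crtc_id,
               uint32_t lut_size)
    {
            cmsHPROFILE prof = cmsOpenProfileFromFile(icc_path, "r");
            cmsToneCurve **vcgt;
            uint16_t r[lut_size], g[lut_size], b[lut_size];
            uint32_t i;

            if (!prof)
                    return -1;

            /* vcgt is stored as three tone curves: R, G, B */
            vcgt = cmsReadTag(prof, cmsSigVcgtTag);

            for (i = 0; i < lut_size; i++) {
                    uint16_t in = (uint16_t)(i * 0xffff / (lut_size - 1));

                    if (vcgt) {
                            r[i] = cmsEvalToneCurve16(vcgt[0], in);
                            g[i] = cmsEvalToneCurve16(vcgt[1], in);
                            b[i] = cmsEvalToneCurve16(vcgt[2], in);
                    } else {
                            /* no vcgt tag: program an identity ramp */
                            r[i] = g[i] = b[i] = in;
                    }
            }
            cmsCloseProfile(prof);

            return drmModeCrtcSetGamma(drm_fd, crtc_id, lut_size, r, g, b);
    }
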
> Super extra long version:
> In ancient times (two decades+) there was a clear separation between
> display calibration (change the device) and characterization (record
> its behavior). Calibration was a combination of resetting and fiddling
> with display controls like brightness and contrast, and then also
> leveraging the (at best) 8-bit-per-channel LUT in the video card to
> achieve the desired white point and tone curve per channel.
> Characterization, which results in an ICC profile, happens on top of
> that. The profile is valid only when the calibration is applied, both
> the knob fiddling part and the applicable LUT in the video card. The
> LUT information used to be kept in a separate file, and then circa 15
> years ago Apple started to embed this information into the ICC profile
> as the vcgt tag; the operating system's display manager reads that
> tag and applies it to the video card LUT before login. This has
> become fairly widespread, even though I'm not finding vcgt in the
> published ICC v4.3 spec. But they do offer this document:
> www.color.org/groups/medical/displays/controllingVCGT.pdf
> 
> There are some test profiles that contain various vcgt tags here:
> http://www.brucelindbloom.com/index.html?Vcgt.html
> 
> You really must have a reliable central service everyone agrees on to
> apply such a LUT, one that also bans anything else from setting a
> conflicting LUT. Again, in ancient times we had all sorts of problems
> with applications messing around with the LUT, and instead of reading
> it first and restoring it the same way, they just reset it to some
> default, thereby making the ICC profile invalid.
> 
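That "read it first and restore it the same way" discipline is simple
enough to show. Below is a sketch using the legacy libdrm gamma-ramp
calls; drm_fd and crtc_id are placeholders and error handling is
omitted.

    #include <stdint.h>
    #include <stdlib.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    struct saved_gamma {
            uint32_t size;
            uint16_t *r, *g, *b;
    };

    /* Remember whatever ramp is currently programmed on the CRTC... */
    static int
    gamma_save(int drm_fd, uint32_t crtc_id, struct saved_gamma *s)
    {
            drmModeCrtcPtr crtc = drmModeGetCrtc(drm_fd, crtc_id);

            if (!crtc)
                    return -1;

            s->size = crtc->gamma_size;
            s->r = calloc(s->size, sizeof(uint16_t));
            s->g = calloc(s->size, sizeof(uint16_t));
            s->b = calloc(s->size, sizeof(uint16_t));
            drmModeFreeCrtc(crtc);

            return drmModeCrtcGetGamma(drm_fd, crtc_id, s->size,
                                       s->r, s->g, s->b);
    }

    /* ...and later put exactly that back, instead of resetting the
     * ramp to some default and invalidating the ICC profile. */
    static int
    gamma_restore(int drm_fd, uint32_t crtc_id, struct saved_gamma *s)
    {
            return drmModeCrtcSetGamma(drm_fd, crtc_id, s->size,
                                       s->r, s->g, s->b);
    }
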
> The primary reason, again historically, for setting the white point
> outside of software (ideally set correctly in the display itself; less
> ideally, by using a video card LUT) is that mismatching white points
> are really distracting: they prevent proper adaptation, and therefore
> everything looks wrong. Ironically, the color-managed content is
> decently likely to look more wrong than the non-color-managed content.
> Why would there be mismatching white points? Fully color-managed
> content with the correct white point in one application window, but
> not in any other application or in the surrounding UI of the desktop
> environment.
> 
> Ergo, some kind of "calibration" of white point independent of the
> color management system. Sometimes this is just a preset in the
> display's on-screen menu. Getting the display white point into the
> ballpark of the target white point means a less aggressive LUT in the
> video card, or ideally even a linear LUT.
> 
> Alternatively, you decide you're going to have some master of all
> pixels. That's the concept of full display compensation, where every
> pixel is subject to color management transforms regardless of its
> source application, all normalized to a single intermediate color
> space. In theory, if you throw enough bits at this intermediate space,
> you could forgo the LUT-based calibration in the video card.
> 
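In, say, Little CMS terms, that architecture could be sketched as
below. The helper names, the choice of intermediate profile and the
pixel formats are just assumptions for illustration, not anything
Weston does today.

    #include <lcms2.h>

    /* Sketch of "full display compensation": every window's pixels go
     * content space -> one shared intermediate space -> monitor space.
     * The intermediate profile is a placeholder; a real compositor
     * would pick something wide-gamut and high precision. */

    /* content -> intermediate, done per application window */
    static cmsHTRANSFORM
    make_window_transform(cmsHPROFILE content, cmsHPROFILE intermediate)
    {
            return cmsCreateTransform(content, TYPE_RGBA_8,
                                      intermediate, TYPE_RGBA_FLT,
                                      INTENT_RELATIVE_COLORIMETRIC, 0);
    }

    /* intermediate -> monitor, done once for the whole output; with
     * enough precision here, the video card LUT can stay linear. */
    static cmsHTRANSFORM
    make_output_transform(cmsHPROFILE intermediate, cmsHPROFILE monitor)
    {
            return cmsCreateTransform(intermediate, TYPE_RGBA_FLT,
                                      monitor, TYPE_RGBA_8,
                                      INTENT_RELATIVE_COLORIMETRIC, 0);
    }
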
> The next workflow gotcha is multiple displays. In designing a color
> management system for an OS, you have to decide whether applications
> will have the option to display across multiple displays, each of
> which could have its own display profile.
> 
> I agree with Graeme that having different pipelines for calibration or
> characterization is asking for big trouble. The thing I worry about
> is whether it's possible for each application to effectively have a
> unique pipeline, because they're all using different rendering
> libraries. The idea that we'd have application-specific characterization to
> account for each application pipeline just spells doom. The return of
> conflicting video card LUTs would be a nightmare.

Hi Chris,

that is some interesting background, but I feel like I didn't quite
catch the point.

If the CRTC color management pipeline (LUT-CTM-LUT + maybe more) is
programmed according to the monitor's color profile, where would those
"conflicting video card LUTs" arise from?

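Just so we are talking about the same thing, by "programmed" I mean
roughly the sketch below: fill DEGAMMA_LUT, CTM and GAMMA_LUT on the
CRTC from the monitor profile and submit them in one atomic commit.
The property IDs are assumed to have been looked up beforehand with
drmModeObjectGetProperties(), both LUTs are assumed to have the same
size for simplicity, and the identity LUT and matrix contents are
placeholders only.

    #include <stdint.h>
    #include <string.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    static int
    program_crtc_pipeline(int fd, uint32_t crtc_id, uint32_t lut_size,
                          uint32_t prop_degamma, uint32_t prop_ctm,
                          uint32_t prop_gamma)
    {
            struct drm_color_lut lut[lut_size];
            struct drm_color_ctm ctm;
            uint32_t degamma_blob, ctm_blob, gamma_blob;
            drmModeAtomicReqPtr req;
            uint32_t i;
            int ret;

            /* Identity LUT as a placeholder; a real compositor would
             * fill this from the monitor profile / calibration data. */
            for (i = 0; i < lut_size; i++) {
                    uint16_t v = (uint16_t)(i * 0xffff / (lut_size - 1));

                    lut[i].red = lut[i].green = lut[i].blue = v;
                    lut[i].reserved = 0;
            }

            /* Identity 3x3 matrix in the S31.32 fixed point the CTM
             * property expects. */
            memset(&ctm, 0, sizeof(ctm));
            ctm.matrix[0] = ctm.matrix[4] = ctm.matrix[8] =
                    (uint64_t)1 << 32;

            drmModeCreatePropertyBlob(fd, lut, sizeof(lut), &degamma_blob);
            drmModeCreatePropertyBlob(fd, &ctm, sizeof(ctm), &ctm_blob);
            drmModeCreatePropertyBlob(fd, lut, sizeof(lut), &gamma_blob);

            req = drmModeAtomicAlloc();
            drmModeAtomicAddProperty(req, crtc_id, prop_degamma,
                                     degamma_blob);
            drmModeAtomicAddProperty(req, crtc_id, prop_ctm, ctm_blob);
            drmModeAtomicAddProperty(req, crtc_id, prop_gamma, gamma_blob);
            ret = drmModeAtomicCommit(fd, req,
                                      DRM_MODE_ATOMIC_ALLOW_MODESET, NULL);
            drmModeAtomicFree(req);

            return ret;
    }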

Thanks,
pq