A solution for gamma-adjustment support in Wayland

Graeme Gill graeme2 at argyllcms.com
Thu Jan 5 11:58:56 UTC 2017


Chris Murphy wrote:

> On Mon, Dec 26, 2016 at 10:25 PM, Graeme Gill <graeme2 at argyllcms.com> wrote:

>     I'm not sure what you are thinking of here - the existing per channel
>     VideoLUT API's are universally simple, and the only temptation in
>     making changes to this might be in the direction of better coordinating
>     competing use of this shared resource.
>
> I don't see why Wayland or the compositor or display server would need to be aware of it.
> It should be independent of X vs Wayland vs Mir - even if eventually X and Mir are going
> away.

Hi Chris,

The reasons are standardization (i.e. the reliability of a resource that an application
writer can program to), and coordination of HW identification.

If it relies on some other API that is unconnected with Wayland, then
one cannot confidently write a "Wayland" application that performs these operations.
For instance, say that Wayland becomes the display server of choice on
Linux and BSD, but on Linux one has to use DRM/KMS to access the VideoLUT,
while on BSD that isn't available and (say) some ioctl on some /dev/crtcXX is
needed instead, etc. - i.e. a platform fragmentation nightmare. Contrast this
with it being part of an optional Wayland protocol: the Linux based compositors
would provide the Wayland API via DRM/KMS, while the BSD based compositors would
provide the Wayland API via their ioctl. The application writer has one cross
platform target - Wayland.

The second reason is HW identification. Different interfaces may well
identify the same element by different identifiers or in different
orders. If so, how does one know which corresponds with which?
(i.e. which VideoLUT corresponds with which Output and/or CRTC, etc.)
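
To make the identification problem concrete: on the DRM/KMS side the
connector -> CRTC association is explicit, but nothing in that API ties those
numeric IDs to whatever name or object the display server uses for the same
output, and that correlation is exactly the unstandardised part. A minimal
sketch (libdrm again, purely illustrative):

    /* Walk the DRM resources and print which CRTC drives each connected
     * connector. The crtc_id is what the gamma/VideoLUT calls operate on;
     * which compositor or toolkit "output" it corresponds to is not
     * conveyed anywhere in this API. */
    #include <stdio.h>
    #include <xf86drmMode.h>

    static void list_outputs(int drm_fd)
    {
        drmModeRes *res = drmModeGetResources(drm_fd);
        if (res == NULL)
            return;

        for (int i = 0; i < res->count_connectors; i++) {
            drmModeConnector *conn = drmModeGetConnector(drm_fd, res->connectors[i]);
            if (conn == NULL)
                continue;

            if (conn->connection == DRM_MODE_CONNECTED && conn->encoder_id != 0) {
                drmModeEncoder *enc = drmModeGetEncoder(drm_fd, conn->encoder_id);
                if (enc != NULL) {
                    printf("connector %u -> crtc %u\n",
                           (unsigned)conn->connector_id, (unsigned)enc->crtc_id);
                    drmModeFreeEncoder(enc);
                }
            }
            drmModeFreeConnector(conn);
        }
        drmModeFreeResources(res);
    }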

> A possible advantage of leveraging more sophisticated capability than just a LUT defined
> TRC in the video card, is the ability to do what high end displays are doing now, where
> they expect (in fact insist) on linear video card LUT, and do their own 3D LUT transform
> in the display itself. I don't know if this hardware is using an ASIC, but I can give its
> user space software a CMYK ICC profile, and the software then creates a kind of devicelink
> from CMYK to display RGB and pushes that 3D LUT to the display. I can play movies and
> video in any program, and they are all subject to this lookup at playback speed. The more
> common usage though is to make a wide gamut display have a smaller gamut, e.g. Rec 709/sRGB.

Hmm. I have my doubts that any current or near-future graphics cards offer
3D LUTs in hardware (although it is very doable using the GPU, of course).

But the same logical problems remain as with a matrix being available:
it is of limited use for display from a general purpose computer platform.
I can imagine it being great for having the display emulate a different
colorspace in hardware, but a color managed desktop doesn't want that -
it wants to make the full display capability available to every
application, and each application can then decide what colorspace it
needs to emulate for its particular inputs.

I can imagine using a hardware 3D LUT for some sort of super
advanced display calibration, where it is used to linearise
internally (although even this may be limited, since it
can't change the values at the gamut surface if it is
not to diminish the gamut or introduce "flat spots"), but
I'm not sure whether the effort to exploit this capability would
be worthwhile, even if it became a widely available standard.

As for software to create some sort of virtual device link -
yes, I've been thinking through some of those sorts of
scenarios in connection with the idea that a Wayland
color management extension might support ICC device links, but there
is a drawback to splitting the overall conversion into
two cLUT lookups, and that is conversion quality. It's
noticeably higher quality to do a single lookup from
source to native display gamut. So the best usage of such
hardware is likely to be using the 1D LUTs to linearise,
and setting the matrices and 3D LUTs to do nothing.
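
Concretely, "1D LUTs linearise, matrix and 3D LUT do nothing" would mean
something like the following (purely illustrative data structures - any real
upload path into such display or video card hardware is vendor specific, and
cal() stands in for whatever per-channel calibration curves the calibration
tool has produced):

    /* Sketch of the suggested configuration: the per-channel 1D curves carry
     * the calibration (linearisation), while the matrix and the 3D cLUT are
     * left as identities, so the full native gamut stays available and a
     * single source -> display lookup can still be done in software. */
    #define CURVE_LEN 1024
    #define CUBE_EDGE 17                 /* e.g. a 17x17x17 cLUT */

    struct display_pipeline {
        double curve[3][CURVE_LEN];                       /* per-channel 1D LUTs */
        double matrix[3][3];                              /* 3x3 matrix */
        double clut[CUBE_EDGE][CUBE_EDGE][CUBE_EDGE][3];  /* 3D cLUT, RGB out */
    };

    static void setup_linearise_only(struct display_pipeline *p,
                                     double (*cal)(int chan, double v))
    {
        /* 1D LUTs: the measured per-channel calibration curves. */
        for (int c = 0; c < 3; c++)
            for (int i = 0; i < CURVE_LEN; i++)
                p->curve[c][i] = cal(c, (double)i / (CURVE_LEN - 1));

        /* Matrix: identity. */
        for (int r = 0; r < 3; r++)
            for (int c = 0; c < 3; c++)
                p->matrix[r][c] = (r == c) ? 1.0 : 0.0;

        /* 3D cLUT: identity - each node maps to its own grid coordinate. */
        for (int r = 0; r < CUBE_EDGE; r++)
            for (int g = 0; g < CUBE_EDGE; g++)
                for (int b = 0; b < CUBE_EDGE; b++) {
                    p->clut[r][g][b][0] = (double)r / (CUBE_EDGE - 1);
                    p->clut[r][g][b][1] = (double)g / (CUBE_EDGE - 1);
                    p->clut[r][g][b][2] = (double)b / (CUBE_EDGE - 1);
                }
    }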

> The conventional wisdom of using video LUTs for calibration came about with CRTs. Pretty much
> all of that has to be thrown out with LCDs - they have a natural TRC that's nothing like
> a CRT's; what the manufacturers have done is glue in a hard wired transform to make them
> behave as if they have a native sRGB-like TRC. Trying to unwind that with an 8 bit (and in
> rare cases today, 10 bit) LUT to make them better behaved is pretty questionable, and it's a huge
> assumption that we're in fact improving the display performance by "calibrating" them by
> depending on video card LUTs in the first place. That is exactly why the high end displays
> don't use it at all. And for laptop displays my experience overwhelmingly has been that
> changing the videocard LUT produces worse results than not calibrating it, and just making
> an ICC profile and letting high precision display compensation sort it all out.

OK - I'll take that on board, since I haven't played with LCDs to quite the same
extent as CRTs with regard to calibration. (Note that this may depend
to some degree on the calibration software being used. I understand
that most of the commercial software doesn't attempt to create
calibration curves of quite as high a resolution as Argyll does.)
(Using the display's internal 1D LUTs is rather theoretical though, given
the undocumented, unreliable and un-standardised nature of the interfaces to them.)

Cheers,
	Graeme.


