<div dir="ltr"><br><div class="gmail_extra"><br><div class="gmail_quote">On Mon, Dec 26, 2016 at 10:25 PM, Graeme Gill <span dir="ltr"><<a href="mailto:graeme2@argyllcms.com" target="_blank">graeme2@argyllcms.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><br>
Hi,<br>
<span class="gmail-"><br>
Daniel Stone wrote:<br>
> > 'The size and depth' and 'the CLUT' are no longer applicable. Colour
> > management units, incorporating two LUTs and a matrix (coarsely,
> > degamma-transform-regamma) are becoming rather common now. These units
> > can be present per-plane as well as per-output/pipe. Sometimes you
> > have to make tradeoffs.
>
> Just because some hardware is a lot more capable is no reason to ignore
> supporting currently expected functionality.
>
> Simple per-channel hardware lookup tables have been supported by
> graphics card software for a very long time (our X terminals in the
> late '80s certainly had them), so this level of support can be assumed
> to be almost universal, and there are well understood reasons for using
> such hardware to set the state of the display for color management
> purposes at least (which I have articulated elsewhere). The case for
> using further HW capabilities (matrices etc.) is much less clear - no
> application interested in fully utilizing a display's best possible
> gamut has any use for them (since they can only reduce the gamut), and
> there is much more flexibility in dealing with non-color-managed
> applications in the applications and/or compositor, so that they can
> co-exist with fully color-aware applications.
>
> > The point is that I'm extremely wary of copying X11 by way of encoding
> > this into an API; it's a truly dizzying space to even enumerate, let
> > alone abstract.
>
> I'm not sure what you are thinking of here - the existing per-channel
> VideoLUT APIs are universally simple, and the only temptation in making
> changes to this might be in the direction of better coordinating
> competing use of this shared resource.

I don't see why Wayland, the compositor, or the display server would need
to be aware of it. It should be independent of X vs Wayland vs Mir - even
if X and Mir are eventually going away.
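
For reference, at the KMS level that whole legacy interface amounts to a
single call per CRTC, regardless of what sits above it. A rough, untested
sketch using libdrm's drmModeCrtcSetGamma - the 2.2 exponent and
/dev/dri/card0 are just placeholders:

/* Upload a per-channel gamma ramp to the first CRTC via the legacy DRM
 * interface. Build: cc sketch.c $(pkg-config --cflags --libs libdrm) -lm */
#include <math.h>
#include <stdint.h>
#include <stdlib.h>
#include <fcntl.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

static int set_gamma_ramp(int fd, uint32_t crtc_id, int size, double gamma)
{
    uint16_t *r = malloc(size * sizeof(*r));
    uint16_t *g = malloc(size * sizeof(*g));
    uint16_t *b = malloc(size * sizeof(*b));
    int ret;

    for (int i = 0; i < size; i++) {
        /* One curve shared by all three channels, 16-bit entries. */
        double v = pow((double)i / (size - 1), 1.0 / gamma);
        r[i] = g[i] = b[i] = (uint16_t)(v * 65535.0 + 0.5);
    }
    ret = drmModeCrtcSetGamma(fd, crtc_id, size, r, g, b);
    free(r); free(g); free(b);
    return ret;
}

int main(void)
{
    int fd = open("/dev/dri/card0", O_RDWR);
    if (fd < 0)
        return 1;
    drmModeRes *res = drmModeGetResources(fd);
    if (res && res->count_crtcs > 0) {
        drmModeCrtc *crtc = drmModeGetCrtc(fd, res->crtcs[0]);
        if (crtc)
            set_gamma_ramp(fd, crtc->crtc_id, crtc->gamma_size, 2.2);
        drmModeFreeCrtc(crtc);
    }
    drmModeFreeResources(res);
    close(fd);
    return 0;
}

The XRandR equivalent (XRRCrtcGamma) has the same shape: three per-channel
arrays per CRTC.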

> I agree that expanding such APIs to encompass more advanced HW
> capabilities is something of a project, and might not even be a good
> direction to proceed if other alternatives are a more enticing use of
> time and effort (installable shaders in the rendering pipeline??).

A possible advantage of leveraging more sophisticated capability than just
a LUT-defined TRC in the video card is the ability to do what high-end
displays are doing now, where they expect (in fact, insist on) a linear
video card LUT, and do their own 3D LUT transform in the display itself. I
don't know if this hardware is using an ASIC, but I can give its
user-space software a CMYK ICC profile, and the software then creates a
kind of devicelink from CMYK to display RGB and pushes that 3D LUT to the
display. I can play movies and video in any program, and they are all
subject to this lookup at playback speed. The more common usage, though,
is to make a wide-gamut display have a smaller gamut, e.g. Rec 709/sRGB.
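
To make that concrete, what the display (or its user-space software) is
doing per pixel is essentially a lookup into an N x N x N RGB table with
interpolation between the surrounding grid points. A rough sketch of the
idea - the 17-per-axis grid and the flat array layout are purely
illustrative, not any particular device's format:

#define LUT_N 17  /* grid points per axis (illustrative) */

struct rgb { float r, g, b; };

/* lut has LUT_N*LUT_N*LUT_N entries, indexed [r][g][b], values in 0..1 */
static struct rgb lut3d_lookup(const struct rgb *lut, struct rgb in)
{
    float pos[3] = { in.r * (LUT_N - 1), in.g * (LUT_N - 1), in.b * (LUT_N - 1) };
    int i0[3], i1[3];
    float f[3];

    for (int k = 0; k < 3; k++) {
        if (pos[k] < 0.0f) pos[k] = 0.0f;
        if (pos[k] > LUT_N - 1) pos[k] = LUT_N - 1;
        i0[k] = (int)pos[k];
        i1[k] = i0[k] < LUT_N - 1 ? i0[k] + 1 : i0[k];
        f[k] = pos[k] - i0[k];
    }

    /* Trilinear blend of the 8 surrounding grid points. */
    struct rgb out = { 0.0f, 0.0f, 0.0f };
    for (int c = 0; c < 8; c++) {
        int ri = (c & 4) ? i1[0] : i0[0];
        int gi = (c & 2) ? i1[1] : i0[1];
        int bi = (c & 1) ? i1[2] : i0[2];
        float w = ((c & 4) ? f[0] : 1.0f - f[0]) *
                  ((c & 2) ? f[1] : 1.0f - f[1]) *
                  ((c & 1) ? f[2] : 1.0f - f[2]);
        const struct rgb *p = &lut[(ri * LUT_N + gi) * LUT_N + bi];
        out.r += w * p->r;
        out.g += w * p->g;
        out.b += w * p->b;
    }
    return out;
}

That it keeps up at playback speed is not surprising - it's a handful of
multiply-adds per pixel over a small table.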
The conventional wisdom about using video LUTs for calibration came about
with CRTs. Pretty much all of that has to be thrown out with LCDs - they
have a natural TRC that's nothing like a CRT's; what the manufacturers
have done is glue in a hard-wired transform to make them behave as if
they have a native sRGB-like TRC. Trying to unwind that with an 8-bit
(or, in rare cases today, 10-bit) video card LUT to make them better
behaved is pretty questionable, and it's a huge assumption that we're in
fact improving display performance by "calibrating" them via video card
LUTs in the first place. That is exactly why the high-end displays don't
use it at all. And for laptop displays, my experience has overwhelmingly
been that changing the video card LUT produces worse results than not
calibrating it at all, and instead just making an ICC profile and letting
high-precision display compensation sort it all out.
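
A quick way to see the cost: push 256 input levels through any
non-identity curve quantized back to 8 bits and count how many distinct
output levels survive. A toy sketch - the 2.2-to-1.8 "calibration" curve
is invented purely for illustration:

#include <math.h>
#include <stdio.h>

int main(void)
{
    int used[256] = { 0 };
    int unique = 0;

    for (int i = 0; i < 256; i++) {
        /* Pretend calibration curve: bend a ~2.2 native response toward 1.8. */
        double v = pow(i / 255.0, 2.2 / 1.8);
        int out = (int)(v * 255.0 + 0.5);
        if (!used[out]) {
            used[out] = 1;
            unique++;
        }
    }
    /* Fewer than 256 distinct levels survive; the merged codes show up
     * as banding. */
    printf("%d of 256 levels remain distinct\n", unique);
    return 0;
}

Every merged level is a step you avoid by leaving the video card LUT alone
and letting high-precision display compensation do the work instead.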

-- 
Chris Murphy