[RFC 0/1] Color manager calibration protocol v1

Pekka Paalanen ppaalanen at gmail.com
Tue Apr 16 10:45:33 UTC 2019

On Sun, 14 Apr 2019 12:57:47 +0200
Erwin Burema <e.burema at gmail.com> wrote:

> Without a way to calibrate/profile screens, a color management
> protocol loses a lot of its value, so to add this missing feature I
> wrote the following protocol.
> The idea is that the calibration/profiling software only sets the
> RGB triplet, and the compositor is then responsible for drawing a
> rectangular region on the selected output screen. Since not all
> calibration tools will be at the center of the screen, the user
> should be able to modify the placement of this rectangular region.
> Unless specified otherwise, the monitor profile (if any) should not
> be applied, but the GPU curve should. Currently, to set a new curve,
> the calibration tool should generate a new ICC profile with the
> wanted curve in the VCGT tag (I am not sure if this is the best
> option, but it would be the most universal one). At the end of
> profiling, the last uploaded ICC profile could then be saved
> (although a compositor is not required to honor the request; in that
> case it should send the "not saved" error). If the compositor does
> not save, or the connection with this protocol is broken, the
> compositor should restore the previous settings.
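To make the quoted flow concrete, here is a rough sketch of what such an interface could look like in the usual Wayland protocol XML. Every interface, request, and argument name below is invented for illustration only; nothing here is taken from the actual proposal.

```xml
<!-- Hypothetical sketch; all names are invented for illustration. -->
<interface name="zwp_calibration_v1" version="1">
  <request name="set_color">
    <description summary="set the RGB triplet to display">
      The compositor fills the calibration rectangle on the selected
      output with this color, bypassing the monitor profile but still
      applying the GPU curve.
    </description>
    <arg name="red" type="uint"/>
    <arg name="green" type="uint"/>
    <arg name="blue" type="uint"/>
  </request>
  <request name="save_profile">
    <description summary="ask the compositor to keep the last ICC profile">
      The compositor may refuse, in which case it sends the
      not_saved error and restores the previous settings.
    </description>
  </request>
  <enum name="error">
    <entry name="not_saved" value="0"/>
  </enum>
</interface>
```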


I only took a very quick glance, but I do like where this design is
going. I'll refrain from commenting on wl_surface vs. not, for now.

Forgive my ignorance, but why does the "GPU curve" need to be a
custom curve provided by the client?

My naive thinking would assume that you want to address the pixel
values on the display wire as directly as possible, which means a
framebuffer format of at least 12 or 14 bits per channel and an
identity "GPU curve".

Is the reason to use the "GPU curve" that you assume there is an
8-bit-per-channel framebuffer, and you need to use the hardware LUT
to choose which 8-bit-wide range of the possibly 14-bit channel you
want to address? (Currently a client cannot know whether the
framebuffer is 8 bits, less, or more.)

Your protocol proposal uses the pixel encoding red/green/blue as uint
(32-bit) per channel. Would it be possible to have the compositor do
the LUT manipulation if it needs to, avoiding the intermediate
rounding caused by an 8-bit-per-channel framebuffer or color pipeline
up to the final LUT?

If such "GPU curve" manipulation is necessary, it essentially means
nothing else can be shown on the output. Or could another reason to
have the client control the "GPU curve" be that the client can then
still show information on that output, since it can adjust its pixel
contents to remain legible even while the manipulation is applied?
Is that used or desirable?

Btw. how would a compositor know the bit depth of a monitor and the
transport (wire)? I presume there should be some KMS properties for
that in addition to connector types.

