The state of Quantization Range handling
Pekka Paalanen
ppaalanen at gmail.com
Fri Nov 18 10:15:30 UTC 2022
On Thu, 17 Nov 2022 22:13:26 +0100
Sebastian Wick <sebastian.wick at redhat.com> wrote:
> Hi Dave,
>
> I noticed that I didn't get the Broadcast RGB property thanks to you
> (more below)
>
> On Tue, Nov 15, 2022 at 2:16 PM Dave Stevenson
> <dave.stevenson at raspberrypi.com> wrote:
> >
> > Hi Sebastian
> >
> > Thanks for starting the conversation - it's stalled a number of times
> > previously.
> >
> > On Mon, 14 Nov 2022 at 23:12, Sebastian Wick <sebastian.wick at redhat.com> wrote:
> > >
> > > There are still regular bug reports about monitors (sinks) and sources
> > > disagreeing about the quantization range of the pixel data. In
> > > particular sources sending full range data when the sink expects
> > > limited range. From a user space perspective, this is all hidden in
> > > the kernel. We send full range data to the kernel and then hope it
> > > does the right thing but as the bug reports show: some combinations of
> > > displays and drivers result in problems.
> >
> > I'll agree that we as Raspberry Pi also get a number of bug reports
> > where sinks don't always look at the infoframes and misinterpret the
> > data.
> >
> > > In general the whole handling of the quantization range on linux is
> > > not defined or documented at all. User space sends full range data
> > > because that's what seems to work most of the time but technically
> > > this is all undefined and user space can not fix those issues. Some
> > > compositors have resorted to giving users the option to choose the
> > > quantization range but this really should only be necessary for
> > > straight up broken hardware.
> >
> > Wowsers! Making userspace worry about limited range data would be a
> > very weird decision in my view, so compositors should always deal in
> > full range data.
>
> Making this a user space problem is IMO the ideal way to deal with it
> but that's a bit harder to do (I'll answer that in the reply to
> Pekka). So let's just assume we all agree that user space only deals
> with full range data.
Limited range was invented for some reason, so it must have some use
somewhere, at least in the past. Maybe it was needed to calibrate mixed
digital/analog video processing chains with test images that needed to
contain sub-blacks and super-whites, to make sure that sub-blacks come
out as the nominal black, and so on. Even though desktop computers do
not seem to have any need for limited range, I personally wouldn't be
so arrogant as to say it's never useful. Maybe there are professional
video/broadcasting needs that currently can only be realized with
proprietary OS/hardware, because Linux just can't do it today?
Why would TVs support limited range, if it was never useful? Why would
video sources produce limited range if it was always strictly inferior
to full range?
Even digital image processing algorithms might make use of
out-of-unit-range values, not just analog circuitry for overshoot.
But no, I can't give a real example, just speculation. Hence it's fine
by me to discard limited range processing for now. Still, what I
explain below would allow limited range processing without any extra
complexity by making the KMS color pipeline better defined and less
limiting for userspace.
> > How would composition of multiple DRM planes work if some are limited
> > range and some are full but you want limited range output? Your
> > hardware needs to have CSC matrices to convert full range down to
> > limited range, and know that you want to use them to effectively
> > compose to limited range.
> > In fact you can't currently tell DRM that an RGB plane is limited
> > range - the values in enum drm_color_range are
> > DRM_COLOR_YCBCR_LIMITED_RANGE and DRM_COLOR_YCBCR_FULL_RANGE [1].
Yeah, that's because range conversion has been conflated with
YUV-to-RGB conversion, and the result is always full-range RGB in
practice, AFAIU. There is no way to feed limited range color into the
rest of the color pipeline in KMS, but that's actually a good thing. (*)
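For reference, the range expansion that gets folded into such a
combined conversion is plain per-channel scale-and-offset math. For
8-bit narrow range data it looks roughly like this (illustration only,
the standard BT.601/BT.709 constants, not any KMS interface):

#include <stdint.h>

/* 8-bit narrow (limited) range to normalized full range. */
static float expand_luma(uint8_t y)
{
        return (y - 16.0f) / 219.0f;    /* 16..235 -> 0.0..1.0 */
}

static float expand_chroma(uint8_t c)
{
        return (c - 128.0f) / 224.0f;   /* 16..240 -> -0.5..+0.5 */
}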
The following is my opinion of the future, as someone who has been
thinking about how to make HDR work on Wayland while allowing the
display quality and hardware optimizations that Wayland was designed
for:
Userspace should not tell KMS about a plane being limited range at all.
The reason is the same as why userspace should not tell KMS what
color space a plane is in.
Instead, userspace wants to program specific mathematical operations
into KMS hardware without any associated or implied semantics. It's
just math. The actual semantics have been worked out by userspace
beforehand. This allows using the KMS hardware to its fullest effect,
even for things the hardware or KMS UAPI designers did not anticipate.
IMO, framebuffers and KMS planes should ultimately be in undefined
quantization range, undefined color space, and undefined dynamic range.
The correct processing of the pixel values is programmed by per-plane
KMS properties like CTM, LUT, and more specialized components like a
quantization range converter or a YUV-to-RGB converter (which is
really just another CTM at a different point), where userspace
explicitly programs the *operation*, and not the input and output
types while hoping that the driver and hardware do something sensible.
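To illustrate what I mean by programming an operation, here is a rough
sketch (untested, the helper names are made up) that fills the
existing struct drm_color_ctm layout with nothing but explicit
coefficients, no input or output semantics implied:

#include <math.h>
#include <stdint.h>
#include <string.h>
#include <drm_mode.h>   /* struct drm_color_ctm, via the libdrm include path */

/* drm_color_ctm coefficients are S31.32 sign-magnitude fixed point. */
static uint64_t s31_32(double v)
{
        uint64_t sign = v < 0.0 ? (uint64_t)1 << 63 : 0;

        return sign | (uint64_t)(fabs(v) * 4294967296.0);
}

/* Example operation: scale R, G and B by 219/255, the multiplicative
 * part of a full-to-limited range conversion. KMS would only apply
 * the matrix; what it means is decided by userspace beforehand. */
static void fill_scale_ctm(struct drm_color_ctm *ctm)
{
        const double scale = 219.0 / 255.0;

        memset(ctm, 0, sizeof(*ctm));
        ctm->matrix[0] = s31_32(scale); /* R */
        ctm->matrix[4] = s31_32(scale); /* G */
        ctm->matrix[8] = s31_32(scale); /* B */
}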
In that design, there is no problem at all in blending multiple planes
of differing quantization ranges together. Userspace first chooses the
blending space, a radiometrically linear RGB limited range BT.709 space
for example, and then programs each plane to produce exactly that. Then
CRTC properties are programmed to produce the desired type of output
signal. Finally, connector properties are programmed to send the
appropriate metadata to the sink. Of course, userspace takes the sink
capabilities into account before deciding all this.
What the KMS UAPI is missing are the per-plane properties.
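To make the gap concrete: AFAIK a matrix like the one sketched above
can today be attached only at the CRTC level, through the existing
"CTM" blob property. A per-plane counterpart of that (and of the LUTs)
is what is missing. A rough libdrm sketch, with the property id
assumed to have been looked up by name beforehand and error handling
mostly omitted:

#include <errno.h>
#include <stdint.h>
#include <xf86drmMode.h>

static int set_crtc_ctm(int fd, uint32_t crtc_id, uint32_t ctm_prop_id,
                        const struct drm_color_ctm *ctm)
{
        drmModeAtomicReq *req;
        uint32_t blob_id = 0;
        int ret;

        ret = drmModeCreatePropertyBlob(fd, ctm, sizeof(*ctm), &blob_id);
        if (ret)
                return ret;

        req = drmModeAtomicAlloc();
        if (!req)
                return -ENOMEM;

        drmModeAtomicAddProperty(req, crtc_id, ctm_prop_id, blob_id);
        ret = drmModeAtomicCommit(fd, req, 0, NULL);
        drmModeAtomicFree(req);

        return ret;
}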
(*) The reason it is a good thing that one cannot have limited range
framebuffers is that it would raise the problem of how to handle pixel
values outside of the nominal range, that is, the sub-black and
super-white channel values. The immediate problem is that LUT stages
need normalized input and cannot extrapolate. So if the nominal
quantization range is normalized to 0.0-1.0 for LUT input, sub-blacks
would be negative values and super-whites would be values greater than
1.0, which a LUT cannot sensibly handle. That makes the whole
concept of limited range problematic in the color pipeline. But, if the
color pipeline elements like LUTs are defined *as if* the data were
always full range, then how the elements work becomes well-defined and
useful, and userspace can produce a programming that is guaranteed to
work.
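To put numbers on that: with 8-bit data, normalizing the nominal
16-235 range to 0.0-1.0 maps code 16 to 0.0 and code 235 to 1.0, but a
sub-black code of 4 maps to (4 - 16) / 219 ≈ -0.055 and a super-white
code of 250 to (250 - 16) / 219 ≈ 1.068, neither of which a LUT can
look up.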
By completely removing the concept of quantization range and its
"automatic" handling from the KMS color pipeline and adding processing
elements to do scaling+offset at suitable stages, we gain the ability
to accept, process, blend, and produce pixels in any quantization
range, color space or dynamic range at will.
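A scaling+offset element of that kind would compute nothing more than
this per channel (again just a sketch; the coefficients below are the
usual 8-bit narrow range constants, but the element itself carries no
range semantics):

/* Per-channel scale+offset stage. With scale = 219.0f / 255.0f and
 * offset = 16.0f / 255.0f this maps nominal full range [0.0, 1.0] to
 * nominal limited range [16/255, 235/255], while out-of-range values
 * remain representable. */
static float scale_offset(float v, float scale, float offset)
{
        return v * scale + offset;
}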
Another thing such an approach solves is how to interpret
floating-point pixel data, which can naturally represent
out-of-unit-range values.
Thanks,
pq
> > Cheers.
> > Dave
> >
> > [1] https://elixir.bootlin.com/linux/latest/source/include/drm/drm_color_mgmt.h#L84
> > [2] https://elixir.bootlin.com/linux/latest/source/drivers/gpu/drm/drm_edid.c#L6756
> > [3] https://elixir.bootlin.com/linux/latest/source/drivers/gpu/drm/drm_edid.c#L5642
> >
>