10bit output via KMS

Volker Vogelhuber v.vogelhuber at digitalendoscopy.de
Wed Aug 2 15:55:24 UTC 2017


Hi,
>>> On 24 July 2017 at 14:56, Volker Vogelhuber
>>> <v.vogelhuber at digitalendoscopy.de> wrote:
>>>> I wonder if it is possible to have drmModeAddFB2 handle
>>>> 'X', 'R', '3', '0' at all. So is this supported in any way?
>>> Secondly, you're correct that you would need (theoretically) to extend
>>> drmModeAddFB2, however it already has DRM_FORMAT_XRGB2101010 and
>>> DRM_FORMAT_XBGR2101010 for i965 and above, which covers everything in
>>> the last 10 years. This is the 'XR30' FourCC you mention, so it should
>>> already be there and working in the kernel.
>> This question probably better fits the drm mailing list, although the
>> discussion started on the mesa list.
>>
>> I got it working now: my 10bit buffers are correctly displayed
>> on a Samsung QM49F via drmModeAddFB2 and the 'X', 'R', '3', '0'
>> fourcc code. Based on the Samsung datasheet it should support 10bit
>> signals, although the EDID does not seem to contain a vendor specific
>> data block that would confirm that assumption.
>>
>> I have connected an Apollo Lake module with a DisplayPort cable
>> to that display, but I'm unsure if it's really 10bit or if there is
>> some conversion logic somewhere in between that may downsample my
>> 10bit buffer to 8bit before sending it via DisplayPort to the display.
>
> Well you could just try to display a 10bit image with a gradient going
> from black to white. The difference between 8bit and 10bit should be
> obvious even to the naked eye.

Thanks Christian. I tried your proposal, but as I already assumed,
there seems to be no difference. I switched the display to an LG 27UD58-B
which supports the vendor specific data block and also states that
it supports 36bpp as well as 30bpp input signals. It even has a deep
color option that can be enabled in its menu. But apparently the Apollo
Lake GPU does not care about that, or at least it makes no
difference regarding my test buffer. What I verified is that the
buffer generated by OpenGL is really 10bit. When I pass the 10bit
buffer to drmModeAddFB2 as an 8bit buffer, it is shown with various
color artifacts because of the wrong bit mapping.
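
For reference, a minimal sketch of such a gradient test (simplified, not
my actual code; fd, map, pitch, handle, width and height stand in for the
usual KMS device/dumb-buffer setup):

#include <stdint.h>
#include <drm_fourcc.h>
#include <xf86drmMode.h>

/* DRM_FORMAT_XRGB2101010: X[31:30] R[29:20] G[19:10] B[9:0], little endian */
static inline uint32_t pack_xr30(uint16_t r, uint16_t g, uint16_t b)
{
    return ((uint32_t)(r & 0x3ff) << 20) |
           ((uint32_t)(g & 0x3ff) << 10) |
            (uint32_t)(b & 0x3ff);
}

/* Horizontal black-to-white ramp: 1024 distinct steps, so the banding
 * should look visibly finer than the 256 steps of an 8bit ramp. */
static void fill_gradient_xr30(uint8_t *map, uint32_t pitch,
                               uint32_t width, uint32_t height)
{
    for (uint32_t y = 0; y < height; y++) {
        uint32_t *row = (uint32_t *)(map + y * pitch);
        for (uint32_t x = 0; x < width; x++) {
            uint16_t v = (uint16_t)((x * 1023) / (width - 1));
            row[x] = pack_xr30(v, v, v);
        }
    }
}

/* Register the buffer with KMS as a 10bpc framebuffer. */
static int add_fb_xr30(int fd, uint32_t width, uint32_t height,
                       uint32_t handle, uint32_t pitch, uint32_t *fb_id)
{
    uint32_t handles[4] = { handle };
    uint32_t pitches[4] = { pitch };
    uint32_t offsets[4] = { 0 };

    return drmModeAddFB2(fd, width, height, DRM_FORMAT_XRGB2101010,
                         handles, pitches, offsets, fb_id, 0);
}

Registering the very same buffer with DRM_FORMAT_XRGB8888 instead makes
the 10-10-10-2 packing get reinterpreted as 8-8-8-8, which matches the
color artifacts I described above.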

I just stumbled upon a slide from Andy Ritger where he mentions
that there are some components still missing for HDR10.
(https://www.x.org/wiki/Events/XDC2016/Program/xdc-2016-hdr.pdf)
As far as I understand it, he proposes that the driver should
accept an scRGB FP16 buffer and that this part is still missing in
the Linux kernel. My question is: is that only true for HDR10
with regard to SMPTE 2086, or is it not even possible to
output a "deep color" DRM_FORMAT_XBGR2101010 buffer as
a 10bpc signal via DisplayPort?
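
As a side note, a crude way to cross-check what the monitor itself
advertises is to look at byte 20 of the base EDID block, which for
digital inputs encodes the supported color bit depth. This is only a
sketch from my reading of the EDID 1.4 spec, and the sysfs connector
path is just an example that depends on the actual connector name:

#include <stdio.h>
#include <stdint.h>

/* EDID 1.4, byte 20 (Video Input Parameters), digital inputs:
 *   bit 7     : 1 = digital input
 *   bits 6..4 : color bit depth (010 = 8bpc, 011 = 10bpc, 100 = 12bpc, ...)
 *   bits 3..0 : interface (0101 = DisplayPort)
 */
int main(void)
{
    uint8_t edid[128];
    FILE *f = fopen("/sys/class/drm/card0-DP-1/edid", "rb");

    if (!f) {
        perror("open edid");
        return 1;
    }
    size_t n = fread(edid, 1, sizeof(edid), f);
    fclose(f);
    if (n != sizeof(edid)) {
        fprintf(stderr, "short EDID read\n");
        return 1;
    }

    if (edid[20] & 0x80) {
        static const char *depth[8] = { "undefined", "6", "8", "10",
                                        "12", "14", "16", "reserved" };
        printf("digital input, %s bpc\n", depth[(edid[20] >> 4) & 0x7]);
    } else {
        printf("analog input\n");
    }
    return 0;
}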

>> So far I have stumbled over some patches regarding color management
>> in the drm part of the kernel, but couldn't figure out how one could
>> configure what is really sent to the display. It seems to mostly
>> be used to manipulate gamma values and so on. As the only indicator
>> of what is provided to the display is the buffer format handled by
>> drmModeAddFB2, I doubt this is enough to configure the signal.
>> Otherwise one could argue that the link speed of the display signal
>> would have to toggle every time I send a different buffer format,
>> which is certainly not the case. So what portion of the kms/drm chain
>> do I currently miss? The drmModeModeInfo pointer has a clock field,
>> but I guess this is meant to be the pixel clock, not the clock of the
>> serialized signal. Probably that's why I couldn't find modelines for
>> 4k@60 specific to 10bit as compared to 8bit.
>>
>> Thanks
>> Volker
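
Regarding my quoted question above about where the signal configuration
would live: one thing that can at least be done from userspace today is
to dump whatever properties the driver exposes on the connector, to see
if anything depth or color related shows up. A quick (untested) sketch,
assuming fd is an open DRM device and connector_id is valid:

#include <stdio.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* Print every property the driver exposes on a connector. */
static void dump_connector_props(int fd, uint32_t connector_id)
{
    drmModeObjectProperties *props =
        drmModeObjectGetProperties(fd, connector_id,
                                   DRM_MODE_OBJECT_CONNECTOR);
    if (!props)
        return;

    for (uint32_t i = 0; i < props->count_props; i++) {
        drmModePropertyRes *prop = drmModeGetProperty(fd, props->props[i]);
        if (!prop)
            continue;
        printf("%-24s = %llu\n", prop->name,
               (unsigned long long)props->prop_values[i]);
        drmModeFreeProperty(prop);
    }
    drmModeFreeObjectProperties(props);
}

If nothing link-depth related shows up there (nor on the CRTC via
DRM_MODE_OBJECT_CRTC, where the gamma/color management properties live),
that would at least confirm the output depth is not yet configurable
from userspace.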