[Intel-gfx] Enabling 30-bit depth with Intel Graphics

Daniel Vetter daniel at ffwll.ch
Tue Mar 29 10:01:44 UTC 2016

On Thu, Mar 24, 2016 at 12:20:07AM -0700, Marvin Pribadi wrote:
> [resending because my previous msg didn't get through]
> Hi devs,
> I would like to get 10-bit color depth working on my system for proper
> playback of HEVC Main10 profile. I'm writing some software to do this, but
> for initial test purposes I'm using readily available applications. So far
> I'm not able to get proper 10-bit color output. My system is an Intel NUC
> NUC5i7rYH (with Iris Graphics 6100) running Ubuntu 15.10 with Intel
> Graphics Installer 1.4.0. The output is HDMI to a Dell U3011 monitor and I
> also have an Astro VA-1838 HDMI Protocol Analyzer.
> After installation of the Intel Graphics Installer 1.4.0, I can confirm
> that the output now shows as HDMI 12-bit RGB @1920x1080p60 (using the Astro
> VA-1838). However, gradient test images and video clips with 10-bit or
> greater color depth are either downsampled to 8-bit depth or dithered when
> using XVideo output (confirmed by reading pixel values with the Astro).
> These tests were done using combinations of Gimp 2.9.2,
> ImageMagick, VLC 2.2.1, ffmpeg 2.7.6, and x265 10-bit 1.9.
> I noticed that the xdpyinfo utility shows that the X server display only
> supports depths 24, 1, 4, 8, 15, 16, and 32 bpp. I read online that a
> custom xorg.conf file can be created to force the default depth to 30 bpp.
> I tried this and it does then make xdpyinfo report support for 30 bpp,
> however it causes several issues, such as the Ubuntu Unity Plugin failing
> to start, VLC reporting that there is no available XVideo output adapter,
> and sluggish VLC playback with X11 video output. Furthermore, VLC playback
> using the X11 video output is still downsampled to 8-bit.
> My questions are:
>    - Is Intel Graphics for Linux supposed to support 10-bit color depth (30
>    bpp) in X?
>    - If so, what do I need to do to enable it?
>    - If not, can you clarify what is supported and provide some advice on
>    how I can get 10-bit color output using Intel Graphics?
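For reference, the xorg.conf override you describe is typically a Screen section forcing the default depth; the section name and device wiring below are illustrative, not taken from your setup:

```
Section "Screen"
    Identifier   "Screen0"
    DefaultDepth 30
    SubSection "Display"
        Depth 30
    EndSubSection
EndSection
```

As you found, forcing depth 30 only changes what the X server advertises; it doesn't by itself guarantee that the rest of the stack (compositor, XVideo, the kernel's gamma path) preserves 10 bits per channel.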

We are still using the legacy gamma ramps, which clip to 8 bits. 4.7 will
have the fancy new color manager stuff using the gamma tables, which don't
do that, and you should be able to get 10-bit content through unharmed on
those kernels. Assuming it all works.
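To make the clipping concrete, here is a minimal sketch (plain Python, not driver code) of what an 8-bit-effective gamma ramp does to a smooth 10-bit gradient: every four adjacent input levels collapse to one output level, which is exactly the banding (or dithering, if the driver dithers) that shows up on the analyzer:

```python
# Illustrative model of the legacy path: a 10-bit value passes through
# a ramp that only preserves 8 bits of precision.

def through_8bit_ramp(value_10bit):
    """Quantize a 10-bit value (0-1023) to 8-bit precision and
    scale it back to the 10-bit range."""
    v8 = value_10bit >> 2    # keep only the top 8 bits (0-255)
    return v8 << 2           # rescale to 0-1020

gradient = list(range(1024))                     # smooth 10-bit ramp
out = [through_8bit_ramp(v) for v in gradient]

distinct_in = len(set(gradient))   # 1024 distinct input levels
distinct_out = len(set(out))       # only 256 distinct output levels
print(distinct_in, distinct_out)   # prints: 1024 256
```

The new per-CRTC gamma tables are wide enough that this quantization step goes away for 10-bit framebuffers.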

To get at that code either run linux-next or drm-intel-nightly.

Cheers, Daniel
Daniel Vetter
Software Engineer, Intel Corporation
