Gamma correct rendering with Wayland and Weston
John Kåre Alsaker
john.kare.alsaker at gmail.com
Fri Sep 28 08:52:17 PDT 2012
Currently most applications are blissfully unaware of the concept of
gamma, which affects pretty much all rendering operations. Resizing,
alpha-blending, blurring, anti-aliasing, etc. are all operations that
should be done on physical, or linear, luminance values. However, if we
stored linear values in 8-bit integers, there would be posterization
in dark colors, as there isn't enough precision to describe them.
This is because physical lighting values don't correspond to
how humans perceive brightness. To get around that, RGB color channels
are usually stored with a gamma curve so the 8-bit range more closely
matches how humans perceive brightness. Instead of converting to and
from the gamma-encoded values, applications simply perform all
rendering operations treating the values as linear. The result of
rendering in gamma space is decent, but not correct. It is
especially noticeable in games, which operate with physical light, and
in anti-aliasing of shapes and fonts. sRGB's gamma function is the most
common way of encoding these linear values.
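For reference, here's a rough C sketch of sRGB's transfer functions (the
standard piecewise curve; srgb_encode/srgb_decode are just the names I'll
use below):

#include <math.h>

/* sRGB transfer functions on normalized [0, 1] channel values. */
static float srgb_encode(float linear)
{
    if (linear <= 0.0031308f)
        return 12.92f * linear;
    return 1.055f * powf(linear, 1.0f / 2.4f) - 0.055f;
}

static float srgb_decode(float encoded)
{
    if (encoded <= 0.04045f)
        return encoded / 12.92f;
    return powf((encoded + 0.055f) / 1.055f, 2.4f);
}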
To fix the rendering operations there are usually two approaches:
- Convert to and from sRGB gamma during rendering.
- Move to a format with a larger bit depth, to avoid using gamma.
OpenGL and DirectX have introduced hardware support for the first
option. That support is also useful for converting the large base of
sRGB content to linear gamma. It would be useful to tap into this for
Weston/Mesa.
Wayland and Weston should support clients which render in both sRGB
and linear gamma, as well as clients converting to and from sRGB gamma.
This means that Wayland clients need a way to tell the compositor
which format they output. That is made more complicated by the fact
that Wayland uses premultiplied alpha. Clients with alpha which
render directly in sRGB gamma have a format incompatible with those
which convert to and from sRGB gamma. Those which render directly in
sRGB gamma have color channels like this: srgb_encode(color) * alpha,
while those which convert to and from use: srgb_encode(color * alpha).
The latter format is much simpler to convert to and from, and is the one
used by pixman, OpenGL and DirectX. So we have 2 different encodings
for RGB formats without alpha: srgb_encode(color) and color. We have 3
encodings for formats with alpha: srgb_encode(color) * alpha,
srgb_encode(color * alpha) and color * alpha.
For brevity I'll name these encodings:
- srgb: srgb_encode(color)
- rgb: color
- srgb_a: srgb_encode(color) * alpha
- srgba: srgb_encode(color * alpha)
- rgba: color * alpha
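To make the difference concrete, here's a small illustrative sketch
(struct and function names are made up, reusing srgb_encode() from above):

struct pixel { float r, g, b, a; };

/* srgb_a: alpha is multiplied into the already gamma-encoded channels. */
static struct pixel pack_srgb_a(float r, float g, float b, float a)
{
    struct pixel p = { srgb_encode(r) * a, srgb_encode(g) * a,
                       srgb_encode(b) * a, a };
    return p;
}

/* srgba: premultiply in linear light, then gamma-encode the result.
 * This is the layout pixman, OpenGL and DirectX work with. */
static struct pixel pack_srgba(float r, float g, float b, float a)
{
    struct pixel p = { srgb_encode(r * a), srgb_encode(g * a),
                       srgb_encode(b * a), a };
    return p;
}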
Currently, for wl_shm or wl_drm, the encodings are not explicitly
specified. The compositor has no way to know which encoding to use.
This is something that has to change. We could add more wl_shm and
wl_drm formats for each encoding, or let the encoding be a property of
Wayland buffers or surfaces.
Weston should also support doing its rendering in sRGB gamma (for
performance reasons) and in linear gamma (for correctness).
For wl_shm buffers we can simply convert the texture data to a
desirable format when uploading it.
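For example, assuming Weston prefers the srgba layout internally (so the
GPU's sRGB texture sampling can be used), an srgb_a wl_shm buffer could be
rewritten per pixel at upload time. A rough sketch, reusing the helpers
from above (a == 0 and rounding handled only minimally):

#include <stdint.h>

static void srgb_a_pixel_to_srgba(uint8_t px[4])
{
    float a = px[3] / 255.0f;

    if (a <= 0.0f)
        return;
    for (int i = 0; i < 3; i++) {
        /* undo the premultiply, giving srgb_encode(color) */
        float enc = (px[i] / 255.0f) / a;
        if (enc > 1.0f)
            enc = 1.0f;
        /* back to linear, premultiply there, re-encode */
        px[i] = (uint8_t)(srgb_encode(srgb_decode(enc) * a) * 255.0f + 0.5f);
    }
}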
For wl_drm things get a bit more complex since it involves the EGL
stack. Here is a list of the conversions that should happen for the
various encodings in EGL and in OpenGL shaders:
- srgb:
    Rendering in linear gamma:
      If the hardware supports sRGB textures:
        EGL: Use sRGB textures and present it as linear gamma to shaders.
      If the hardware doesn't support sRGB textures:
        Shader: Convert from sRGB to linear gamma.
- rgb:
    Rendering in sRGB gamma:
      Shader: Convert from linear to sRGB gamma.
- srgb_a:
    Rendering in linear gamma:
      Shader: Undo the alpha premultiplying, convert from sRGB to linear
      gamma and premultiply the alpha after.
- srgba: This encoding requires hardware support for sRGB textures.
    EGL: Use hardware sRGB textures and present it as linear gamma to shaders.
    Rendering in sRGB gamma:
      Shader: Convert from linear to sRGB gamma.
- rgba:
    Rendering in sRGB gamma:
      Shader: Undo the alpha premultiplying, convert from linear to sRGB
      gamma and premultiply the alpha after.
This requires adding quite a few shader variants to Weston, so a more
flexible way of generating them would be nice.
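One rough idea, purely as a sketch: derive the conversion line from the
buffer encoding and Weston's render gamma instead of hand-writing every
variant. The names and the GLSL helpers it assumes are illustrative:

enum encoding { ENC_SRGB, ENC_RGB, ENC_SRGB_A, ENC_SRGBA, ENC_RGBA };

/* Returns the GLSL line to splice into the fragment shader, following
 * the list above. Assumes the shader defines vec3 srgb_encode() and
 * srgb_decode() helpers and that c is the sampled, premultiplied texel;
 * the a == 0 case is ignored for brevity. */
static const char *
conversion_snippet(enum encoding enc, int render_linear, int hw_srgb_textures)
{
    if (render_linear) {
        if (enc == ENC_SRGB && !hw_srgb_textures)
            return "c.rgb = srgb_decode(c.rgb);\n";
        if (enc == ENC_SRGB_A)
            return "c.rgb = srgb_decode(c.rgb / c.a) * c.a;\n";
    } else {
        if (enc == ENC_RGB || enc == ENC_SRGBA)
            return "c.rgb = srgb_encode(c.rgb);\n";
        if (enc == ENC_RGBA)
            return "c.rgb = srgb_encode(c.rgb / c.a) * c.a;\n";
    }
    return ""; /* no conversion needed for this combination */
}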
For the srgb encoding, what EGL does depends on the rendering gamma of
Weston, so EGL needs to be informed of it. I propose an EGL function
which sets an EGL_PREFERRED_GAMMA_WL attribute on an EGL display for
that.
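Purely to illustrate the idea (the entry point name and the token values
below are placeholders, not part of the proposal itself), it could look
something like:

#include <EGL/egl.h>

/* Placeholder token values; nothing here is an allocated EGL enum. */
#define EGL_PREFERRED_GAMMA_WL  0x7001
#define EGL_GAMMA_LINEAR_WL     0x7002
#define EGL_GAMMA_SRGB_WL       0x7003

/* Hypothetical entry point name for the proposed function. */
EGLBoolean eglDisplayAttribWL(EGLDisplay dpy, EGLint attribute, EGLint value);

static void
inform_egl_of_render_gamma(EGLDisplay dpy, int render_linear)
{
    eglDisplayAttribWL(dpy, EGL_PREFERRED_GAMMA_WL,
                       render_linear ? EGL_GAMMA_LINEAR_WL
                                     : EGL_GAMMA_SRGB_WL);
}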
Weston also needs to know how EGL will present its encoding to the
shaders. For that I propose two new formats for
eglQueryWaylandBufferWL: EGL_TEXTURE_SRGB_WL and
EGL_TEXTURE_SRGB_A_WL. Here's how the encodings would map to
eglQueryWaylandBufferWL texture formats:
- srgb: If the hardware supports sRGB textures, EGL_TEXTURE_RGB;
otherwise EGL_TEXTURE_SRGB_WL.
- rgb: EGL_TEXTURE_RGB.
- srgb_a: EGL_TEXTURE_SRGB_A_WL.
- srgba: EGL_TEXTURE_RGBA.
- rgba: EGL_TEXTURE_RGBA.
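On the Weston side, picking a shader variant from the query result would
look roughly like this. EGL_TEXTURE_SRGB_WL and EGL_TEXTURE_SRGB_A_WL are
the proposed tokens (placeholder values here); eglQueryWaylandBufferWL and
EGL_TEXTURE_FORMAT already exist in EGL_WL_bind_wayland_display, and real
code would fetch the entry point with eglGetProcAddress():

#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <wayland-server.h>

#define EGL_TEXTURE_SRGB_WL    0x7004  /* placeholder value */
#define EGL_TEXTURE_SRGB_A_WL  0x7005  /* placeholder value */

static void
pick_shader_variant(EGLDisplay dpy, struct wl_resource *buffer)
{
    EGLint format = 0;

    eglQueryWaylandBufferWL(dpy, buffer, EGL_TEXTURE_FORMAT, &format);

    switch (format) {
    case EGL_TEXTURE_RGB:
    case EGL_TEXTURE_RGBA:
        /* Either already in the render gamma or the hardware's sRGB
         * texture support hides the decode; at most the plain
         * linear<->sRGB conversion from the list above is needed. */
        break;
    case EGL_TEXTURE_SRGB_WL:
        /* srgb content, no hardware sRGB textures: decode in the shader. */
        break;
    case EGL_TEXTURE_SRGB_A_WL:
        /* srgb_a content: un-premultiply, decode, re-premultiply. */
        break;
    }
}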
EGL clients don't interact with wl_drm directly and can't set the
encoding they would use, so this has to be done another way. For
that I propose wl_egl_window_set_encoding, which would take 3 possible
values:
- encoding_linear: Maps to rgb and rgba.
- encoding_srgb: Maps to srgb and srgba.
- encoding_srgb_linear_alpha: Maps to srgb and srgb_a. This is the
default value.
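As a sketch (nothing here exists yet; the enum just mirrors the values
above):

#include <wayland-egl.h>

/* Hypothetical enum for the three proposed values. */
enum wl_egl_window_encoding {
    encoding_srgb_linear_alpha, /* default: maps to srgb and srgb_a */
    encoding_srgb,              /* maps to srgb and srgba */
    encoding_linear,            /* maps to rgb and rgba */
};

/* Hypothetical prototype for the proposed entry point. */
void wl_egl_window_set_encoding(struct wl_egl_window *window,
                                enum wl_egl_window_encoding encoding);

/* A client that converts to and from sRGB itself, e.g. by rendering into
 * an sRGB framebuffer, outputs srgba data and would announce that: */
static void
announce_srgb_output(struct wl_egl_window *native_window)
{
    wl_egl_window_set_encoding(native_window, encoding_srgb);
}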
The most elegant way to add support for these encodings to wl_shm and
wl_drm without breaking backwards compatibility would be to create a
new wl_gamma_encoding extension which allows you to set the same
values as wl_egl_window_set_encoding on wl_buffers.
For EGL clients to actually render efficiently to the srgba encoding,
support for sRGB framebuffers is required, which my EGL extension
proposal exposes: http://lists.freedesktop.org/archives/mesa-dev/2012-September/027888.html
We may also want to optionally support full OpenGL in Weston, since
there are a number of useful features we'd want for color management
there (64 bpp, sRGB and 3D textures, and blits for opaque regions). I
don't see the big deal with the X dependency as long as it's optional.
We'll probably have xwayland long after libOpenGL comes along.
I'd like some comments on this plan, especially from krh on the EGL changes.
- John Kåre