Gamma correct rendering with Wayland and Weston
John Kåre Alsaker
john.kare.alsaker at gmail.com
Sat Sep 29 01:53:35 PDT 2012
On Fri, Sep 28, 2012 at 5:52 PM, John Kåre Alsaker
<john.kare.alsaker at gmail.com> wrote:
> - srgb_a:
> Rendering in linear gamma:
> Shader: Undo the alpha premultiplying, convert from sRGB to linear
> gamma and premultiply the alpha after.
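For reference, the per-texel math that shader step describes would
look roughly like this C sketch (the helper names are mine, not from
any existing code):

#include <math.h>

/* sRGB electro-optical transfer function for one channel in [0, 1]. */
static float
srgb_to_linear(float c)
{
	return c <= 0.04045f ? c / 12.92f :
	       powf((c + 0.055f) / 1.055f, 2.4f);
}

/* srgb_a texel when blending in linear gamma: undo the premultiply,
 * decode sRGB to linear, then premultiply again. */
static void
srgb_a_texel_to_linear(float rgba[4])
{
	int i;

	if (rgba[3] <= 0.0f)
		return;
	for (i = 0; i < 3; i++)
		rgba[i] = srgb_to_linear(rgba[i] / rgba[3]) * rgba[3];
}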
When rendering srgb_a in linear gamma, we can use sRGB hardware
texture support for opaque regions. This means that the EGL interface
needs some changes and that Weston needs to be able to toggle the sRGB
decoding. I see two options for doing this, and I think I'm favoring
the first one:
- Give Weston two EGL images, one which decodes sRGB, one which
doesn't. A new attribute for eglCreateImageKHR could be added for
this. This would also allow EGL to expose the encodings more directly
to Weston and let Weston pick when to use hardware sRGB decoding; EGL
then wouldn't need to know about Weston's blending space.
- Use the existing GL_EXT_texture_sRGB_decode extension. When the
hardware supports sRGB decoding, EGL would return EGL_TEXTURE_RGB_A_WL
instead of EGL_TEXTURE_SRGB_A_WL; the toggle itself is sketched below.
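A sketch of that decode toggle, assuming the texture EGL hands us has
an sRGB internal format (under the first option, EGL could do the
equivalent internally per image instead):

#include <GLES2/gl2.h>

#ifndef GL_TEXTURE_SRGB_DECODE_EXT
#define GL_TEXTURE_SRGB_DECODE_EXT 0x8A48
#define GL_DECODE_EXT              0x8A49
#define GL_SKIP_DECODE_EXT         0x8A4A
#endif

/* Hardware decode for opaque regions; skip it where the shader has
 * to undo the premultiply and convert manually. */
static void
set_srgb_decode(GLuint tex, int decode)
{
	glBindTexture(GL_TEXTURE_2D, tex);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SRGB_DECODE_EXT,
			decode ? GL_DECODE_EXT : GL_SKIP_DECODE_EXT);
}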
> - srgba: This encoding requires hardware support for sRGB textures.
> EGL: Use hardware sRGB textures and present it as linear gamma to shaders.
> Rendering in sRGB gamma:
> Shader: Convert from linear to sRGB gamma.
The shader conversion here would actually be the same as with the rgba encoding.
> - rgba:
> Rendering in sRGB gamma:
> Shader: Undo the alpha premultiplying, convert from linear to sRGB
> gamma and premultiply the alpha after.
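The matching encode direction, which per the note above also covers
the srgba case, would be roughly (again just a C sketch of the shader
math):

#include <math.h>

/* Inverse of the sRGB transfer function for one channel in [0, 1]. */
static float
linear_to_srgb(float c)
{
	return c <= 0.0031308f ? c * 12.92f :
	       1.055f * powf(c, 1.0f / 2.4f) - 0.055f;
}

/* rgba texel when blending in sRGB gamma: undo the premultiply,
 * encode linear to sRGB, then premultiply again. */
static void
rgba_texel_to_srgb(float rgba[4])
{
	int i;

	if (rgba[3] <= 0.0f)
		return;
	for (i = 0; i < 3; i++)
		rgba[i] = linear_to_srgb(rgba[i] / rgba[3]) * rgba[3];
}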