[Pixman] [PATCH 7/7] utils.c: Increase acceptable deviation to 0.0064 in pixel_checker_t

Bill Spitzak spitzak at gmail.com
Wed Mar 6 20:57:49 PST 2013


On 03/06/2013 08:35 PM, Søren Sandmann wrote:
> Bill Spitzak <spitzak at gmail.com> writes:
>
>> Søren Sandmann wrote:

> No, I'm talking about keeping internal buffer as 32 bit pixels, and then
> dithering the final conversion to 565, as opposed to having 16 bit
> internal buffers and dithering gradients to that before compositing.

Yes, that would be even better; sorry, I misunderstood.
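
To make that concrete, the kind of final-conversion dither I am 
picturing looks roughly like the following. This is just a sketch with 
a made-up helper name and a plain 4x4 Bayer matrix, not actual pixman 
code:

#include <stdint.h>

/* 4x4 ordered-dither (Bayer) matrix, values 0..15. */
static const uint8_t bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

/* Convert one a8r8g8b8 pixel to r5g6b5, adding a positive-only dither
 * offset of (almost) one output quantization step and flooring. */
static uint16_t
dither_8888_to_565 (uint32_t argb, int x, int y)
{
    int d = bayer4[y & 3][x & 3] * 16;   /* ~[0,1) of one output step,
                                          * in 1/255 units */
    int r = (argb >> 16) & 0xff;
    int g = (argb >>  8) & 0xff;
    int b =  argb        & 0xff;

    int r5 = (r * 31 + d) / 255;
    int g6 = (g * 63 + d) / 255;
    int b5 = (b * 31 + d) / 255;

    return (uint16_t) ((r5 << 11) | (g6 << 5) | b5);
}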

>> Dithering involves adding the pattern to the source, such that a pixel
>> of 5.5 is much more likely to end up >= 6 than a pixel of 5.0, and
>> thus on average will be brighter.
>
> It won't be brighter if the dither signal is 0 on average.

What I meant is that if you took all the pixels that resulted from 5.5, 
on average they would come out brighter than the ones that resulted 
from 5.0.
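For example, with a dither signal spanning one quantization step, a 
pixel of 5.5 lands on 6 half the time and on 5 half the time, so it 
averages 5.5, while a pixel of 5.0 lands on 5 essentially every time.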

Most work I have done with dithering uses a positive-only dither signal 
but applies floor rather than round to turn the result into an integer, 
so I tend to think of it as making pixels brighter. The two forms are 
mathematically equivalent, and one can be converted to the other by 
shifting the signal by half a quantization step.
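
In code that conversion is just the half-step shift. A minimal sketch 
(hypothetical helper names, floating point for clarity):

#include <math.h>

/* Positive-only dither d in [0, 1), quantized with floor. */
static int
quantize_floor (double x, double d)
{
    return (int) floor (x + d);
}

/* Zero-mean dither d' in [-0.5, 0.5), quantized with round-half-up.
 * With d' = d - 0.5 this gives exactly the same result, since
 * floor (x + d) == floor ((x + (d - 0.5)) + 0.5). */
static int
quantize_round (double x, double d_zero_mean)
{
    return (int) floor (x + d_zero_mean + 0.5);
}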

> A good blue-noise pattern is very close in quality to error diffusion.

I agree the difference is imperceptible on 8-bit channels, and combined 
with the fact that a GPU shader can apply a dither pattern, that pretty 
much means a pattern will always be used today.


