Getting anti-aliasing right
jonsmirl at gmail.com
Tue Nov 16 21:37:34 PST 2010
On Tue, Nov 16, 2010 at 10:28 PM, Corbin Simpson
<mostawesomedude at gmail.com> wrote:
> I should have been clearer. This is like compiz, where non-transformed
> viewports look perfect and transformed viewports look illegible. Nobody
> appears to have a problem with this for compiz; why should Wayland do
> anything different?
If Wayland ends up as our graphics foundation it may be with us for 20
years. This is a known problem and it can be fixed with some planning.
Isn't the goal of Wayland to have pixel-perfect graphics?

Glyph generation inside the GPU is certainly going to happen in that
time frame. You can already do it today on decent GPUs. We are very
close to being able to run the entire compositor and scene graph
engine inside the GPU. You want to plan for these things in the
architecture so that we don't have to rip everything up again in five
years. Today's high-end GPU will be the low-end GPU of 2015.
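
To make the scene-graph approach from my original mail (quoted below)
a bit more concrete, here is a rough sketch in plain C. Every name in
it is invented for this mail; nothing like this exists in the Wayland
protocol or in any library I know of. The point is only the shape of
the data: the client hands over resolution-independent primitives and
the compositor rasterizes them once it knows the final transform.

#include <stdint.h>

/* Hypothetical structures, invented for this mail -- not Wayland
 * protocol, not an existing library API.  A client describes what it
 * wants drawn; the compositor rasterizes after concatenating the
 * window transform with whatever effect it is running (zoom, rotation,
 * sphere mapping), so anti-aliasing is computed against the real
 * screen geometry instead of being baked into a pixel buffer. */

struct prim_glyph_run {
	uint32_t  font_id;        /* face previously registered with the compositor */
	float     size_px;        /* nominal size before any transform */
	uint32_t  n_glyphs;
	const uint32_t *glyphs;   /* glyph indices */
	const float    *pen_x;    /* positions the app picked, hinting included */
	const float    *pen_y;
};

struct prim_node {
	enum { PRIM_RECT, PRIM_GLYPH_RUN } type;
	float color[4];           /* premultiplied RGBA */
	union {
		struct { float x, y, w, h; } rect;
		struct prim_glyph_run run;
	} u;
};

/* The compositor walks an array of prim_node per output frame.  Because
 * it knows the final transform and the output's subpixel layout, it can
 * pick the AA method (grayscale vs. subpixel, or a different filter for
 * heavy minification) per frame instead of per application. */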
>
> Sending from a mobile, pardon the brevity. ~ C.
>
> On Nov 16, 2010 7:18 PM, "Christopher James Halse Rogers"
> <christopher.halse.rogers at canonical.com> wrote:
>> On Tue, 2010-11-16 at 18:30 -0800, Corbin Simpson wrote:
>>> On Tue, Nov 16, 2010 at 6:14 PM, jonsmirl at gmail.com <jonsmirl at gmail.com>
>>> wrote:
>>> > What is the plan for anti-aliasing? If an app draws a beautiful
>>> > anti-aliased scene to a buffer, and then that buffer is transformed by
>>> > the compositor, all of that beautiful anti-aliasing is going to get
>>> > messed up, especially sub-pixel anti-aliasing if the compositor
>>> > transform changes how the window aligns with the LCD subpixels.
>>> >
>>> > One approach is to tell the app the final window transform and let
>>> > it draw the transformed scene itself instead of a rectangle that the
>>> > compositor transforms. But this means an app needs to be able to
>>> > handle being told to project itself onto a rotating sphere.
>>> >
>>> > Another approach is for apps to build a scene graph out of
>>> > primitives that some other system renders once it knows the final
>>> > transform. This method lets the compositor implement animations
>>> > without involving the app.
>>> >
>>> > This problem is closely tied to glyph generation. Sooner or later
>>> > we're going to be generating glyphs on the GPU in real time. If we're
>>> > doing that, we don't want anti-aliased glyphs being generated inside
>>> > the apps like we do today. Apps will still need to compute glyph
>>> > spacing and pick a spacing as a result of hinting, but the compositor
>>> > may want to regenerate the glyphs if the window is shrunk, zoomed,
>>> > twisted, etc.
>>> >
>>> > I'm sure there are other ways to handle anti-aliasing. If the goal is
>>> > to make pixel-perfect images every time, then the anti-aliasing
>>> > strategy is something that needs to be planned.
>>>
>>> What's the difference between Wayland and X11 here, exactly? We
>>> *already* draw anti-aliased subpixel-specific font glyphs on GPUs and
>>> it looks fine.
>>>
>>
>> - Unless your compositor is transforming the buffer before scanout, at
>> which point your calculated anti-aliasing is no longer optimal. Say
>> you've calculated subpixel AA with an RGB subpixel order, but the
>> compositor flips your buffer. Or if the window buffer is being shrunk,
>> the fonts might be better rendered with a different algorithm
>> altogether.
>>
>> I think the point here is that if you treat a window containing text as
>> just an image, you'll get less than optimal anti-aliasing under window
>> transformations.
>>
>> This seems to be a special case of the more general observation that if
>> applications knew the transformations applied to their surfaces then
>> they could potentially perform higher quality rendering. I don't think
>> the general case is feasible to accommodate, and I'd suspect that the
>> font-rendering case won't be significantly easier.
>
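
The flipped-buffer case quoted above is easy to show in code too. This
is a toy illustration written for this mail, not code from any real
renderer: the channel that each coverage sample lands on is decided at
rasterization time from the panel's subpixel order, so a mirror or a
90-degree rotation applied later by the compositor puts those samples
on the wrong physical subpixels, and resampling the finished image
can't put them back.

#include <stdint.h>

/* Toy example only.  Text is sampled at 3x horizontal resolution and
 * each of the three coverage samples is assigned to one physical
 * subpixel.  Which channel a sample maps to depends on the panel's
 * subpixel order, so the choice is burned into the pixels the moment
 * the glyph is rasterized. */

enum subpixel_order { SUBPIXEL_RGB, SUBPIXEL_BGR };

uint32_t
pack_subpixel(const uint8_t cov[3], enum subpixel_order order)
{
	uint8_t r, g, b;

	if (order == SUBPIXEL_RGB) {
		r = cov[0]; g = cov[1]; b = cov[2];
	} else {
		/* same three samples, opposite physical order */
		r = cov[2]; g = cov[1]; b = cov[0];
	}
	return (uint32_t)r << 16 | (uint32_t)g << 8 | b;  /* 0x00RRGGBB */
}

/* A buffer packed for SUBPIXEL_RGB and then mirrored horizontally by
 * the compositor is displayed as if the panel were BGR: the color
 * fringes land on the wrong subpixels.  Only re-rasterizing the glyphs
 * with the final orientation (and scale) fixes that, which is why the
 * renderer needs to know the transform. */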
--
Jon Smirl
jonsmirl at gmail.com