Getting anti-aliasing right

jonsmirl at gmail.com
Thu Nov 18 04:51:17 PST 2010


On Wed, Nov 17, 2010 at 11:04 PM, Christopher James Halse Rogers
<christopher.halse.rogers at canonical.com> wrote:
> On Wed, 2010-11-17 at 00:37 -0500, jonsmirl at gmail.com wrote:
>> On Tue, Nov 16, 2010 at 10:28 PM, Corbin Simpson
>> <mostawesomedude at gmail.com> wrote:
>> > I should have been clearer. This is like compiz, where non-transformed
>> > viewports look perfect and transformed viewports look illegible. Nobody
>> > appears to have a problem with this for compiz; why should Wayland have
>> > anything different?
>>
>> If Wayland ends up as our graphics foundation it may be with us for 20
>> years.  This is a known problem and it can be fixed with some
>> planning.  Isn't the goal of Wayland to have pixel perfect graphics?
>>
>> Glyph generation inside the GPU is certainly going to happen in that
>> time frame. You can already do it today on decent GPUs. We are very
>> close to be able to run the entire compositor and scene graph engine
>> inside the GPU.  You want to plan for these things in the architecture
>> so that we don't have to rip everything up again in five years.
>> Today's high end GPU will be the low end GPU of 2015.
>
> GPU-based glyph generation is entirely orthogonal to preserving optimal
> anti-aliasing under window transformations.  You can do GPU-based glyph
> rendering now, in X (I seem to recall Zack Rusin blogging about it a
> couple of years ago), and you can do optimal anti-aliasing under
> transformations without GPU-based glyph generation.
>
> I'm not sure that you *can* solve the optimal anti-aliasing problem
> without introducing deep knowledge of application rendering into
> Wayland, something which is an explicit anti-goal.

I can see two strategies; we may need both:
1) Wayland tells the app the window transform and the pixel layout of
the screen, and the app does its anti-aliasing with this information.
This works because Wayland will not transform the output buffer; the
app has already transformed it.

This is complicated because an app now has to be able to draw and
anti-alias onto an arbitrarily transformed surface, and it also has to
understand the pixel layout. Direct GL apps will probably need this
model.
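As a rough illustration of that first strategy, here is a minimal C sketch of a client rasterizing in final screen space once it knows its window transform. The `transform` type and the 4x4 supersampling scheme are invented for illustration; Wayland defines no such interface, and a real renderer would also account for the subpixel layout:

```c
#include <assert.h>

/* Hypothetical 2x3 affine matrix standing in for the window transform
 * the compositor would hand to the client: screen = M * surface. */
typedef struct {
    double m[2][3];
} transform;

/* Map a surface-local point into final screen coordinates. */
static void apply(const transform *t, double x, double y,
                  double *sx, double *sy)
{
    *sx = t->m[0][0] * x + t->m[0][1] * y + t->m[0][2];
    *sy = t->m[1][0] * x + t->m[1][1] * y + t->m[1][2];
}

/* 4x4 supersampled coverage of the half-plane sx >= edge for one
 * surface pixel (px, py).  Because the samples are tested after the
 * transform, the anti-aliased edge stays correct however the window
 * is rotated or scaled. */
static double coverage(const transform *t, int px, int py, double edge)
{
    int hits = 0;
    for (int i = 0; i < 4; i++)
        for (int j = 0; j < 4; j++) {
            double sx, sy;
            apply(t, px + (i + 0.5) / 4.0, py + (j + 0.5) / 4.0,
                  &sx, &sy);
            if (sx >= edge)
                hits++;
        }
    return hits / 16.0;
}
```

With an identity transform a pixel straddling the edge gets partial coverage; under a rotation or scale the same code still samples in the right space, which is the whole point of the strategy.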

2) Apps don't draw; instead they build a scene graph. As Wayland
constructs the screen, it plays this scene graph through a rendering
engine. The rendering engine is aware of the window transform and
pixel layout. This allows desktop animations without app intervention.
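The scene-graph model might look something like the sketch below. Every type and name here is hypothetical, and a trivial uniform scale stands in for the full window transform the engine would really apply at composite time:

```c
/* Hypothetical scene-graph node a client would submit instead of
 * pixels.  The compositor's rendering engine replays the list once it
 * knows the final window transform, so each primitive can be
 * rasterized and anti-aliased directly in screen space. */
typedef enum { NODE_RECT, NODE_GLYPH } node_kind;

typedef struct node {
    node_kind kind;
    double x, y, w, h;     /* geometry in surface coordinates */
    unsigned glyph_id;     /* for NODE_GLYPH: which glyph to draw */
    struct node *next;
} node;

/* Walk the list, mapping each node's origin through the transform
 * (reduced here to a uniform scale).  Returns how many primitives
 * were replayed. */
static int replay(const node *list, double scale,
                  double out_x[], double out_y[], int max)
{
    int n = 0;
    for (const node *p = list; p && n < max; p = p->next, n++) {
        out_x[n] = p->x * scale;
        out_y[n] = p->y * scale;
    }
    return n;
}
```

The design point is that the glyph is still a glyph (`glyph_id`) when the engine sees it, so it can be re-rasterized per transform instead of being a baked bitmap.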

The common thread here is that Wayland needs to inform the rendering
engine of what the final window transform is going to be. The
rendering engine then draws and anti-aliases using that transform.
These transforms can be complex 3D operations, so a depth buffer is
needed.

This means Wayland is out of the transformation business. The
rendering engines will have already transformed the windows; Wayland
will just composite them. All of the clipping will sort itself out
because of the depth buffers.
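A toy model of that depth-resolved compositing, assuming each window arrives already transformed with a color buffer and a depth buffer. A real compositor would use the GPU's depth test; the 4x1 "screen" here is purely illustrative:

```c
#define W 4
#define H 1

/* One already-transformed window (or the screen itself): a color
 * buffer plus a depth buffer, smaller depth meaning closer to the
 * viewer. */
typedef struct {
    unsigned color[H][W];
    float depth[H][W];
} layer;

/* Per-pixel depth test: keep whichever fragment is nearest.  Overlap
 * and clipping between windows resolve themselves with no explicit
 * clip regions, regardless of composite order. */
static void composite(layer *screen, const layer *win)
{
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            if (win->depth[y][x] < screen->depth[y][x]) {
                screen->depth[y][x] = win->depth[y][x];
                screen->color[y][x] = win->color[y][x];
            }
}
```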

--------------------

GPU glyph generation does seem to be orthogonal.

My objection to X is that the protocol forces you to generate glyph
bitmaps in the app and send them to the server. The server then
maintains a list of these glyph images and does a brute-force
comparison against that list every time a new glyph image is received.
The knowledge of which glyph generated the bitmap has been lost.
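The server-side behavior described above can be sketched as a brute-force bitmap cache. The sizes and names are made up for illustration; the point is that the server only ever sees anonymous pixels:

```c
#include <string.h>

/* Illustrative X-style glyph cache: the server deduplicates incoming
 * bitmaps by comparing raw bytes, because the client-side knowledge of
 * which font glyph produced each bitmap never reaches it. */
#define GLYPH_BYTES 8
#define CACHE_MAX 64

static unsigned char cache[CACHE_MAX][GLYPH_BYTES];
static int cache_len;

/* Return the index of a bitmap, inserting it if unseen, or -1 when
 * the cache is full.  Every lookup scans the whole list and compares
 * pixels -- the brute-force compare the text complains about. */
static int intern_bitmap(const unsigned char bits[GLYPH_BYTES])
{
    for (int i = 0; i < cache_len; i++)
        if (memcmp(cache[i], bits, GLYPH_BYTES) == 0)
            return i;
    if (cache_len == CACHE_MAX)
        return -1;
    memcpy(cache[cache_len], bits, GLYPH_BYTES);
    return cache_len++;
}
```

Since the key is the bitmap itself, two different glyphs that happen to rasterize identically collapse into one entry, and nothing here could ever re-rasterize a glyph for a new transform.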

The protocol was done this way to help network transparency. If the
fonts are stored in the server, the first thing the app has to do is
retrieve them all, since it is the app that determines glyph spacing.
X initially had the fonts in the server; performance was bad, so the
fonts were moved into the client. The outcome of that was transmitting
the bitmaps to the server.

I don't want to build assumptions into the protocol that prevent GPU
glyph generation. But this does seem to be an issue for the rendering
engines, not Wayland.

-------------------

I'm trying to come up with proposals for these problems as I write
this. AFAIK none of this has been addressed in a windowing system
before. Proposals will need to be built and tested to determine which
solutions work.

A key observation seems to be that Wayland needs to tell the apps
what the final window transformation is, and the apps need to provide
transformed windows, including depth buffers. Without that it is
impossible to get the anti-aliasing right.


-- 
Jon Smirl
jonsmirl at gmail.com
