[Spice-devel] Pixel format handling with offscreen surfaces

Izik Eidus ieidus at redhat.com
Wed Apr 14 16:12:36 PDT 2010


On Wed, 14 Apr 2010 16:12:21 +0200
Alexander Larsson <alexl at redhat.com> wrote:

> With the new support for offscreen surfaces, an issue that we did not
> have before has come up. It shows up particularly when using 32bit depth
> surfaces as sources to the alpha_blend operation. The problem is whether
> to treat the source bitmap as if it has an alpha channel or not. Both
> formats (called a8r8g8b8 and x8r8g8b8 in pixman) are essentially the
> same format for the colors, but when doing an alpha blend you either
> have to use the alpha value in the high byte, or ignore what is there
> and assume it's 0xff everywhere. Both cases happen in applications,
> and it's not possible to automatically detect which one the app wants. In the
> windows driver we get this information via the AC_SRC_ALPHA flag, and
> for normal image sources this gets propagated to the image format
> (SPICE_BITMAP_FMT_32BIT vs SPICE_BITMAP_FMT_RGBA). However, for surface
> sources we know only the bit depth of the surface, and that's 32 for
> both cases.
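>
> To make this concrete, here is a minimal sketch (a hypothetical helper,
> not actual driver code) of how the AC_SRC_ALPHA information maps onto
> the two bitmap formats:
>
> /* Hypothetical sketch: pick the image format for a 32bit source bitmap
>  * depending on whether the application asked for alpha blending. The
>  * bits on the wire are identical, only the interpretation differs. */
> static uint8_t pick_32bit_bitmap_format(int has_ac_src_alpha)
> {
>     return has_ac_src_alpha ? SPICE_BITMAP_FMT_RGBA   /* use the alpha byte */
>                             : SPICE_BITMAP_FMT_32BIT; /* assume alpha = 0xff */
> }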
> 
> Additionally, we can't specify the format to use when using a surface as
> a source ahead of time. First of all, we're not told that the alpha will
> be used until we actually use the surface as a source. Furthermore, it's quite
> possible to use the same surface as the source both with alpha and
> without alpha.
> 
> So, it's clear to me that we somehow have to give information in each
> alpha_blend operation exactly how to interpret the bits. A single flag
> for "has_alpha" in SpiceSurface seems to be enough to solve this issue.
> 
> However, below the surface there lurks a more complex issue: that of
> pixel formats. Many of the rendering operations that gdi, and therefore
> spice (the protocol is pretty much based on gdi), support are what I call
> "bit specified", in that their definition does not really care much
> about how the bits in the surfaces are interpreted, just how many bits
> there are in each pixel. For instance, a XOR rop, a rectangle fill, a
> bit blit or a nearest neighbor scale do not care about what the
> pixels mean, they are just copied or bitwise combined.
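>
> For example, a per-row XOR rop on a 32bit surface can be written without
> any knowledge of which byte holds which channel (a hypothetical helper,
> purely for illustration):
>
> /* Bit specified: XOR two rows of 32bpp pixels; the channel layout of
>  * the pixels is irrelevant, only the number of bits per pixel matters. */
> static void xor_row_32(uint32_t *dst, const uint32_t *src, int width)
> {
>     for (int i = 0; i < width; i++) {
>         dst[i] ^= src[i];
>     }
> }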
> 
> There is, however, another class of operations, those that are "color
> specified". In spice these are currently limited to bilinear scaling
> and alpha blending. In doing these operations we need to know how each
> bit in the pixel maps to the 4 channels (r,g,b,a). One example of this
> is the rgb vs rgba case above. But there are other cases where the bit
> depth alone does not determine what colors the pixels are. For
> instance, 16bit depth on windows is typically 555 (i.e. 5 bits each
> for rgb) whereas on X (and in windows for some gfx cards) it is 565
> (i.e. 6 bits for green). There are also more "weird" uses of this, for
> instance I know gtk+ uses an Axxx + xRGB pixel format for the same
> 32bit surface to be able to do alpha blending of non-premultiplied
> alpha argb (the ARGB format is otherwise typically premultiplied in
> Xrender, like in spice). This is a kind of weird trick, but it shows that
> the format used for a surface source is not always obvious from the
> surface depth.
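>
> To illustrate why the depth alone is not enough, here is a sketch of
> decoding the same 16bit pixel under the two interpretations (hypothetical
> helpers, returning the raw channel values without scaling):
>
> /* Color specified: the same 16 bits mean different colors in 555 vs 565 */
> static void unpack_16_555(uint16_t p, uint8_t *r, uint8_t *g, uint8_t *b)
> {
>     *r = (p >> 10) & 0x1f;
>     *g = (p >> 5)  & 0x1f;
>     *b = p         & 0x1f;
> }
>
> static void unpack_16_565(uint16_t p, uint8_t *r, uint8_t *g, uint8_t *b)
> {
>     *r = (p >> 11) & 0x1f;
>     *g = (p >> 5)  & 0x3f;  /* 6 bits for green */
>     *b = p         & 0x1f;
> }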
> 
> How to interpret a surface when it's used as a destination is also
> interesting. We have not historically had issues with this in spice
> for two reasons:
> 1) For the screen surface we're never interested in alpha, so the
>    rgb vs rgba 32bit issue is never there
> 2) We only support 555 format for 16bit screen surfaces, so we just
>    always assume that's the kind of format you want when drawing on
>    a 16bit surface.
> 
> But, if we want to support 565 16bit surfaces (including the screen
> surface), which is important for an X driver, we fail 2 above, and we
> already fail 1 with 32bit offscreen surfaces. So, we need to have a
> more generic solution to formats.
> 
> Right now there is a format enum used for bitmap images. It lets us do
> the right thing when using bitmaps as sources, although it has to be
> extended a bit with e.g. a 16bit 565 format. Izik and I were at one
> point discussing moving the format of the bitmap to where it's used,
> and having the bitmap itself only contain the depth, but that's not a
> good idea, because we need to know how to map the pixels to colors in
> the images to be able to compress them well on the network. So, I
> think we need to add a new format enum, something like:
> 
> enum {
>     SPICE_SURFACE_FMT_INVALID,
>     SPICE_SURFACE_FMT_1A,
>     SPICE_SURFACE_FMT_8A,
>     SPICE_SURFACE_FMT_16_555,
>     SPICE_SURFACE_FMT_16_565,
>     SPICE_SURFACE_FMT_32_xRGB,
>     SPICE_SURFACE_FMT_32_ARGB,
>     SPICE_SURFACE_FMT_32_Axxx,
> };

So this thing is attached for bitmaps as well?

> 
> And whatever other formats we need to add. (I don't think we want to
> support palette-based surfaces.)
> 
> Then we add a format member to SpiceSurface, so that we can specify
> how to interpret a surface when used as source. This way the win32
> driver can select SPICE_SURFACE_FMT_32_xRGB or
> SPICE_SURFACE_FMT_32_ARGB depending on the AC_SRC_ALPHA flag. And the
> xserver can do whatever it needs. Since the xserver is super flexible
> wrt how pixels are interpreted (the user API allows you to pick any format
> based on shifts and masks specified per rgba component), this will mean
> some formats will have to be done "non-accelerated", but by supporting
> the actually used ones we'll get good performance anyway.
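>
> As a sketch of what this could look like (hypothetical, the actual layout
> is of course up for discussion), the surface reference used as a source
> would grow a format field:
>
> typedef struct SPICE_ATTR_PACKED SpiceSurface {
>     uint32_t surface_id;
>     uint8_t format; /* how to interpret the pixels when used as a source,
>                        e.g. SPICE_SURFACE_FMT_32_xRGB vs _32_ARGB */
> } SpiceSurface;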
> 
> Then there is the question of destination surface format. For the
> simple case of rgb32 vs argb32 it would be "safe" and not a large
> performance cost to just always use argb32 for offscreen surfaces and
> rgb32 for the primary surface. However, this is not enough when it
> comes to e.g. 16bit surfaces. You really need to know if things are
> 555 or 565 when rendering. So, we need to specify the destination
> format somehow.
> 
> First I thought about adding a new destination format to each
> operation, but it feels like a bad idea to me. First of all, most
> operations don't need it (since they are bit specified), and secondly
> it's not something that in practice differs for each operation. As per
> the above, it'll basically be argb32 for all 32bit destinations always
> (because you don't know when it's safe to use "only" rgb32), and for
> win32 drivers 16bit surfaces will always be 555, and on X I think we
> should always make them 565 (and if some weird app uses 555 we'd
> handle that in software fallbacks in X).
> 
> So, for the destination format I propose that we add a format to each
> surface that is set on construction, and is used when drawing *to* the
> surface. So, something like:
> 
> typedef struct SPICE_ATTR_PACKED SpiceMsgSurfaceCreate {
>     uint32_t surface_id;
>     uint32_t width;
>     uint32_t height;
>     uint8_t depth;
>     uint8_t format; <- new member
>     uint32_t flags;
>     uint32_t type;
> } SpiceMsgSurfaceCreate;
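>
> As a sketch of the intended use (hypothetical driver code, following the
> policy above), the format passed in the create message would be picked
> from the depth:
>
> /* Hypothetical: default destination format for a new offscreen surface */
> static uint8_t default_surface_format(uint8_t depth)
> {
>     switch (depth) {
>     case 16: return SPICE_SURFACE_FMT_16_555;  /* 565 for an X driver */
>     case 32: return SPICE_SURFACE_FMT_32_ARGB; /* safe for rgb32 uses too */
>     default: return SPICE_SURFACE_FMT_INVALID;
>     }
> }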
> 
> BTW, what is the type member here used for? Seems unused.

Well, I think I had in mind to use it as PRIMARY / NOT_PRIMARY, but then
I used the flags... yeah, I think we can just kick it...

All the above sounds reasonable to me. When I only used the
SpiceMsgSurfaceCreate structure, I didn't know that X can use the
source surface with a different format each time...

> 


