[PATCH] dix: Reshuffle ScreenRec to pack holes

Jamey Sharp jamey at minilop.net
Wed May 19 12:23:33 PDT 2010


On Wed, May 19, 2010 at 11:31 AM, Adam Jackson <ajax at redhat.com> wrote:
> This eliminates the unused 'id' field (what the eff), and reorders the
> other data fields to fit packed on both ILP32 and LP64.  ScreenRec
> shrinks by 64 bytes on LP64 (less on ILP32), and devPrivate just
> barely squeaks into the first cacheline on LP64 as a bonus.

Spiffy!

>  typedef struct _Screen {
>     int                        myNum;  /* index of this instance in Screens[] */
> -    ATOM               id;
> +    unsigned int       rgf;    /* array of flags; she's -- HUNGARIAN */

Wow, I just wondered what that was, and the answer is dumb. I'm not
sure it should be moved away from GCperDepth. At least it's safe to
shrink it to int, or even short, as long as MAXFORMATS continues to be
8. In fact, I think you can legally declare it "unsigned rgf :
MAXFORMATS + 1;", and throw it in the unused_flags if you want.
Alternatively, use one of the unused flag bits in each GC to mark
whether it's free for reuse, and drop this rgf field entirely.

How awful would it be to kill the GCperDepth pool entirely, and let
GetScratchGC always just delegate to CreateScratchGC?

> +    unsigned long      whitePixel, blackPixel;
> +    unsigned int       rootVisual;
> +    unsigned int       defColormap;

Seems to me these should be declared Pixel, VisualID, and Colormap,
respectively, which are all CARD32. That should save another 8 bytes
on 64-bit, I think?

> +    unsigned char              rootDepth;
> +    unsigned           backingStoreSupport:1;
> +    unsigned           saveUnderSupport:1;
> +    unsigned           unused_flags:22;

Why is this a 24-bit bitfield? Is GCC packing it together with the
1-byte rootDepth?

Jamey
