[Intel-gfx] [PATCH] BDW swizzling

Ben Widawsky benjamin.widawsky at intel.com
Fri Apr 11 00:50:35 CEST 2014


On Thu, Apr 10, 2014 at 06:51:50PM +0100, Damien Lespiau wrote:
> On Thu, Apr 10, 2014 at 10:32:46AM -0700, Ben Widawsky wrote:
> > On Thu, Apr 10, 2014 at 05:24:07PM +0100, Damien Lespiau wrote:
> > > While cruising through the specs, I noticed a note about swizzling changes on
> > > BDW. My understanding is that we need neither to initialize swizzling on
> > > the GPU side nor to swizzle the address ourselves on the CPU side.
> > > 
> > > That could be totally wrong though, and I unfortunately don't have a machine to
> > > test this theory on.
> > 
> > I fought with this too. My conclusion was that we can either set all the
> > swizzling bits or set none. There is no motivation to do either, and
> > the spec is simply telling us what they do for Windows. That was well
> > over a year ago, though, so it all may be different now.
> 
> My (limited) understanding is that if we don't return
> I915_BIT_6_SWIZZLE_NONE, user space is going to swizzle the address and
> the controller is going to swizzle it again, even from the CPU, so we
> end up at the wrong place.

I thought this type of swizzling was usually determined by our code in
i915_gem_detect_bit_6_swizzle(). I'm also not sure whether userspace
controls it, or whether it should; I never really looked at that aspect
of the code. Maybe Chris can answer this part.
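For context, the CPU-side fixup for the common I915_BIT_6_SWIZZLE_9_10
mode is just an XOR of address bits; roughly like this (a sketch of the
address munging, not the exact driver/libdrm code):

```c
#include <stdint.h>

/* I915_BIT_6_SWIZZLE_9_10: the memory controller XORs bit 6 of a
 * tiled offset with bits 9 and 10, so CPU accesses to a tiled BO
 * must apply the same XOR to hit the byte the GPU touched. */
static inline uint32_t swizzle_bit_6_9_10(uint32_t offset)
{
	uint32_t bit9  = (offset >> 9) & 1;
	uint32_t bit10 = (offset >> 10) & 1;

	return offset ^ ((bit9 ^ bit10) << 6);
}
```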

> 
> > I honestly don't care what we do though, so long as the patches get
> > tested both in simulation and silicon, and there is no measurable perf
> > drop. I suppose my mild preference is to, "don't touch it if it ain't
> > broke."
> > 
> > Sorry, but at the moment, I don't have time to test this for you. Maybe
> > someone else can, or remind me in a couple of weeks.
> 
> Do you know if you have a configuration where we try to swizzle? If yes,
> and tests/gem_tiled_pread is passing, that would give us a nice bit of
> information (which, of course, can be tried by the next person with time
> to do so).
> 

If you get it wrong, it looks really obvious. Swizzling is *supposed* to
be one of those transparent things (I thought). What follows may be
entirely wrong; it's mostly from memory and a brief conversation with
Art.

There are 3 places that care about swizzling:
1. The memory/DRAM controller
2. The display engine (DE) interface to memory
3. The GAM arbiter (the GPU's generic interface to memory)

These may or may not all be talking about the same type of (bit)
swizzling. The important thing, and what I have observed, is that the
GAM and DE must match on how things are swizzled. Otherwise we
render/blit to a surface and it gets [de]swizzled when it's displayed. I
never measured performance for setting both to 0 instead of 1.

The part that has always confused me is why we are supposed to program
it based on #1. The way the DRAM controller decides to lay out the
physical rows/banks etc. shouldn't matter as long as everyone goes
through the same DRAM controller. It should just look like transparent
linear RAM. In other words, the comment about how we need to program the
swizzle based on the DRAM controller never quite made sense to me. It's
also possible that if you enable one, you should (or shouldn't) enable
another, since compounding swizzles may be self-defeating. Dunno - so
maybe your patch helps, maybe it hurts.

Art suggested that the swizzling in GAM and DE predate the DRAM
swizzler.

-- 
Ben Widawsky, Intel Open Source Technology Center
