Depth 30 enablement for ati-ddx + exa. Rev 3

Mario Kleiner mario.kleiner.de at gmail.com
Wed Jan 24 17:35:08 UTC 2018


On 01/22/2018 07:01 PM, Michel Dänzer wrote:
> On 2018-01-22 03:14 AM, Mario Kleiner wrote:
>> Ok, 3rd revision, now with per-x-screen drmmode_crtc_funcs rec
>> and set_gamma = NULL in the depth 30 case. Also back to Fredrik's
>> original exa 10 bit patch, just with his signed-off tacked on.
>>
>> Tested with single and dual x-screen, depth 24, depth 30 and mixed
>> 24 and 30 on separate x-screens. Also tested against current tip
>> of ati-ddx master.
> 
> Thanks for the thorough testing.
> 
> The patches look mostly good now, apart from some cosmetic issues (lines
> shouldn't be unnecessarily shorter than 72 columns, and acronyms should
> be spelled in all upper case, in the commit logs), but I can live with
> those.
> 
> However, I gave the patches a quick spin (without a 30-bit capable
> monitor though) on the Turks card that's currently sitting in my
> development machine, and I noticed that Xorg crashes on shutdown because
> the XvMC extension failed to initialize. Can you take a look?

It's missing a definition to allow init on a depth 30 screen. I've fixed 
that in a patch I just sent out, so now the extension initializes. It 
failed here as well, but never resulted in a crash for me with my 
servers up to 1.19.5. You fixed the server bug causing the crash, but 
for older unfixed servers it may still make sense to prevent trouble by 
initializing the extension properly, or by disabling it on a depth 30 
screen.

With that patch it exposes support, but the two apps I could get to use 
XvMC at all with .mpg movies (mplayer and xine) then fail to display on 
depth 30, for reasons that seem to be unrelated to XvMC.

The tests in Mesa's gallium/state_trackers/xvmc/test sort of work, 
though with some false color. After many hours of poking around, all I 
know is that I don't understand how XvMC is wired up.

Having the extension enabled in the DDX is needed for any test to work. 
But when I set a breakpoint on EVERGREENDisplayTexturedVideo() in
evergreen_textured_videofuncs.c, that breakpoint is never hit, even at 
depth 24 when the XvMC tests work, so I'm not sure whether I'm looking 
at mostly unused code there.

> 
> 
>> When testing against current master in dual-x-screen mode,
>> I found that the ati-ddx commit currently at the top, 1fe8ca75974c52
>> "Keep track of how many SW cursors are visible on each screen",
>> breaks multi x-screen setups by screwing up cursor handling.
> 
> I stumbled over this myself the other day. There's an issue with the
> (un)wrapping of miPointerSpriteFuncRec entries. Not sure yet what
> exactly's wrong with it though. I'll look into it.
> 

Great, thanks.

> 
>> I am also currently looking into broken pageflipping with the ati-ddx
>> under DRI2. This seems to be due to recent changes in the drmAddFB()
>> calls to no longer pass in the depth/bpp of the x-screens root window
>> pixmap, but instead the depth/bpp of the pixmap that should be
>> flipped onto the scanout. On DRI3 this behaves as i'd expect, but
>> on DRI2 the passed in pixmaps don't seem to have depth 24 on a depth
>> 24 screen or depth 30 on a depth 30 screen, like the root windows,
>> but instead the pixmaps come in at depth 32. That leads the page flip
>> ioctl to fail, with the kernel complaining that "Page flip is not
>> allowed to change frame buffer format."
>>
>> If I hack Mesa gallium/st's dri2_drawable_get_buffers() to avoid
>> using the DRI2GetBuffersWithFormat() request and instead use the
>> older DRI2GetBuffers() request, then pageflips work again, as the
>> old request derives the buffer's depth from the GLX drawable's depth,
>> which is 24 or 30 for a depth 24 or depth 30 x-screen, so a better
>> match.
> 
> Hmm. Can you track down where the "format" value in
> radeon_dri2_create_buffer2 comes from? It's supposed to be the depth.

It is the depth. It comes from gallium/state_trackers/dri/dri2.c: 
dri2_drawable_get_buffers(), where a switch-case statement translates 
pipe_format into depth values for the DRI2GetBuffersWithFormat request 
under DRI2 (originally introduced in commit 576161289 to transmit depth 
instead of bpp). That case statement translates both RGBA8888 and 
BGRA1010102 formats into depth 32.

It only happens if a client requests an fbconfig with an alpha channel, 
for destination alpha blending etc., as my application happens to. If 
you only run glxgears or a desktop compositor like KDE 5's, it won't 
happen, as those only use bgrx configs without alpha.
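For illustration, the trigger is just the alpha bits in the requested fbconfig (the attributes below are standard GLX; whether a given config actually hits the depth 32 path depends on the driver):

```c
#include <GL/glx.h>

/* A request like this one (GLX_ALPHA_SIZE > 0) gets an alpha-capable
 * back buffer and thus the depth 32 DRI2 buffers described above;
 * dropping GLX_ALPHA_SIZE to 0 yields a bgrx config that matches the
 * screen depth. Illustrative fragment only. */
static const int fbconfig_attribs_with_alpha[] = {
    GLX_RENDER_TYPE,  GLX_RGBA_BIT,
    GLX_RED_SIZE,     10,
    GLX_GREEN_SIZE,   10,
    GLX_BLUE_SIZE,    10,
    GLX_ALPHA_SIZE,   2,    /* requesting destination alpha */
    GLX_DOUBLEBUFFER, True,
    None
};
```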

-mario


More information about the amd-gfx mailing list