[Mesa-dev] [Bug 111141] [REGRESSION] [BISECTED] [DXVK] 1-bit booleans and Elite Dangerous shader mis-optimization

bugzilla-daemon at freedesktop.org bugzilla-daemon at freedesktop.org
Thu Jul 18 08:52:45 UTC 2019


https://bugs.freedesktop.org/show_bug.cgi?id=111141

--- Comment #8 from Steven Newbury <s_j_newbury at yahoo.co.uk> ---
So presumably it's the optimization for AMD?

I had a good look through the code, but I'm not clear enough on how it all
works to know where the bug might be.

My current understanding, please correct me if I'm wrong:

The game ships with HLSL shaders compiled to DXBC.

DXVK converts those shaders: DXBC -> SPIR-V [D3D int32_t booleans are
converted to the SPIR-V boolean type]

(At this point everything must be okay, since it worked before and still works
with Intel, except that Intel has a different internal representation...)
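
To make that first conversion concrete, here's a tiny C sketch of the
representations as I understand them (the function names are made up for
illustration; this isn't DXVK code):

    #include <stdbool.h>
    #include <stdint.h>

    /* D3D/DXBC convention: booleans are 32-bit integers,
     * 0 = false, 0xFFFFFFFF = true. */
    typedef uint32_t d3d_bool;

    /* DXBC -> SPIR-V: the 32-bit value becomes a real boolean
     * (conceptually a "not equal to zero" comparison). */
    static bool d3d_bool_to_spirv_bool(d3d_bool b)
    {
        return b != 0;
    }

    /* ...and back, when a boolean has to be written out as a D3D
     * value again (conceptually a select between ~0 and 0). */
    static d3d_bool spirv_bool_to_d3d_bool(bool b)
    {
        return b ? 0xFFFFFFFFu : 0u;
    }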

SPIR-V -> NIR [SPIR-V booleans are converted to 1-bit (int1_t) booleans]

NIR -> GPU HW shader [AMD scalar booleans; Intel 1+31-bit booleans]

(What happens if the booleans are part of a struct and the code assumes they're
32-bit during the above passes?  Previously, NIR used D3D-compatible booleans,
so it would just work; Intel's are 32-bit, so maybe it all falls back into
place there?)

Is the above nonsense?
