[Mesa-dev] [MR] glsl: Use default precision on struct/interface members when nothing specified

Neil Roberts nroberts at igalia.com
Thu Apr 25 13:42:13 UTC 2019


Mesa already keeps track of the GLES precision for variables and stores
it in the ir_variable. When no precision is explicitly specified, it
uses the default precision for the corresponding type. However, when
the variable is a struct or an interface block, the precision of each
individual member is attached to the glsl_type instead. The code to
apply the default precision to those members was missing, so this
branch adds it.
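
To make that concrete, here is a minimal GLSL ES sketch (hand-written
for this mail, not taken from the branch) of the case the last patch
addresses: a struct member declared without a precision qualifier
should pick up the default precision in effect for its type, just as a
plain variable would.

    #version 300 es
    precision mediump float;

    struct S {
        float f;      /* no qualifier: should inherit the default
                         precision in effect here, i.e. mediump */
        highp int i;  /* an explicit qualifier is used as-is */
    };

    uniform S u;
    out vec4 color;

    void main()
    {
        color = vec4(u.f) + vec4(u.i);
    }

The last patch just records mediump for f on the glsl_type, the same
way a plain "float f;" declaration already gets the default recorded in
its ir_variable.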

Only the last patch actually makes this change. The rest of the patches
fix regressions in Piglit and the CTS. The underlying problem is that
Mesa was treating types that differ only in precision as distinct when
comparing interstage interfaces (varyings and UBOs), whereas according
to the spec precision should be ignored for that comparison. Presumably
this problem already existed when mismatched precisions were explicitly
specified, but we didn’t have any tests covering that case. Storing the
default precision makes some tests fail because the default precision
for int differs between the vertex and fragment stages (highp
vs. mediump), so it’s easy to write a test case that hits this mismatch
by accident.
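
As an illustration of how the differing defaults make the mismatch easy
to hit (again a hand-written sketch, not one of the regressed tests),
take a varying struct with an int member and no explicit qualifier. In
the vertex shader the member gets the stage default of highp:

    #version 300 es

    struct S {
        int i;        /* no qualifier: the vertex-stage default
                         for int is highp */
    };

    flat out S vs_out;

    void main()
    {
        vs_out.i = 1;
        gl_Position = vec4(0.0);
    }

and in the matching fragment shader it gets mediump:

    #version 300 es
    precision mediump float;

    struct S {
        int i;        /* same declaration, but the fragment-stage
                         default for int is mediump */
    };

    flat in S vs_out;
    out vec4 color;

    void main()
    {
        color = vec4(float(vs_out.i));
    }

Once the defaults are stored on the struct members, the two
declarations of S carry different member precisions, so the interface
matching code has to ignore precision for the link to keep succeeding,
which is what the earlier patches in the series arrange.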

The tests that regressed are:

dEQP-GLES31.functional.shaders.opaque_type_indexing.* (12 tests)
piglit.spec.ext_transform_feedback.structs_gles3 basic-struct run
piglit.spec.glsl-es-3_00.execution.varying-struct-centroid_gles3
piglit.spec.ext_transform_feedback.structs_gles3 basic-struct get

https://gitlab.freedesktop.org/mesa/mesa/merge_requests/736