[Piglit] [PATCH 1/8] layered-rendering/blit: use color other than the default red
Topi Pohjolainen
topi.pohjolainen at intel.com
Sun Jan 26 01:34:03 PST 2014
Passes on nvidia, and on IVB using the blt-engine and mesa fallback
paths, but fails on IVB using blorp (both before and after the
recent refactoring):
Probe color at (32,0)
Expected: 0.500000 0.400000 0.300000
Observed: 0.250980 0.160784 0.090196
Probe color at (96,0)
Expected: 0.500000 0.400000 0.300000
Observed: 0.250980 0.160784 0.090196
Probe color at (32,64)
Expected: 0.500000 0.400000 0.300000
Observed: 0.250980 0.160784 0.090196
Probe color at (96,64)
Expected: 0.500000 0.400000 0.300000
Observed: 0.250980 0.160784 0.090196
Some observations:
1) On the blt-engine and fallback path the test also produces the
following errors:
Mesa: User error: GL_INVALID_ENUM in glEnable(GL_TEXTURE_RECTANGLE)
Mesa: User error: GL_INVALID_ENUM in glEnable(GL_TEXTURE_RECTANGLE)
Mesa: User error: GL_INVALID_ENUM in glEnable(GL_TEXTURE_RECTANGLE)
Mesa: User error: GL_INVALID_ENUM in glEnable(GL_TEXTURE_RECTANGLE)
Unexpected GL error: GL_INVALID_ENUM 0x500
(Error at /home/tpohjola/work/piglit/tests/spec/gl-3.2/layered-rendering/blit.c:137)
Unexpected GL error: GL_INVALID_ENUM 0x500
(Error at /home/tpohjola/work/piglit/tests/spec/gl-3.2/layered-rendering/blit.c:137)
Unexpected GL error: GL_INVALID_ENUM 0x500
(Error at /home/tpohjola/work/piglit/tests/spec/gl-3.2/layered-rendering/blit.c:137)
Unexpected GL error: GL_INVALID_ENUM 0x500
(Error at /home/tpohjola/work/piglit/tests/spec/gl-3.2/layered-rendering/blit.c:137)
These are skipped by the test as the return value of
display_texture() is ignored. I didn't pay much attention to this
as it seemed unrelated to the original problem. I may be mistaken
though.
2) If I modify the test to use RGBA formatted textures, then even
the blorp path passes.
3) The sampling of the XRGB seems to be fine; I tried hardcoding
0.5, 0.4, 0.3, 1.0 into the render target write message payload
instead of copying the data from the sample message writeback, and
I get the exact same failure.
I've been looking at the R0/R1 settings which, according to the
PRM, are re-used for the RT write:
"If the header is not present, behavior is as if the message was
sent with most fields set to the same value that was delivered
in R0 and R1 on the pixel shader thread dispatch. The following
fields, which are not delivered in the pixel shader dispatch,
behave as if they are set to zero:
Render Target Index
Source0 Alpha Present to Render Target"
That still leaves, for example, the color calculator pointer
settings to have an effect.
Anyway, I'll keep trying to understand this better, but I thought
it better to post the findings so far.
CC: Paul Berry <stereotype441 at gmail.com>
Signed-off-by: Topi Pohjolainen <topi.pohjolainen at intel.com>
---
tests/spec/gl-3.2/layered-rendering/blit.c | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/tests/spec/gl-3.2/layered-rendering/blit.c b/tests/spec/gl-3.2/layered-rendering/blit.c
index 162eee8..144be45 100644
--- a/tests/spec/gl-3.2/layered-rendering/blit.c
+++ b/tests/spec/gl-3.2/layered-rendering/blit.c
@@ -70,7 +70,7 @@ const int texelsPerLayer = 32 * 32;
const int floatPerLayer = 32 * 32 * 3;
static const float srcColors[2][3] = {
- {1, 0, 0}, {0, 1, 0}
+ {0.5, 0.4, 0.3}, {0, 1, 0}
};
static const float dstColors[2][3] = {
--
1.8.3.1