[Mesa-dev] gl_NormalMatrix issue on Intel driver
tom fogal
tfogal at sci.utah.edu
Mon Oct 10 15:30:47 PDT 2011
One of our programs, which relies heavily on shaders, is having issues,
and I have tracked it down to unexpected values in gl_NormalMatrix
within the fragment shader.
Using the attached program (patch against mesademos repo) I did
printf-esque debugging to get the values of gl_NormalMatrix. Search
for 'fragShaderText' to see the shaders I'm using. I get different
results on a Mesa 7.10.2 (Ubuntu 11.04) system with an Intel card
versus a system running the nvidia binary blob. The nvidia-based system
also agrees with what I get using any card on a Mac, and with nvidia or
ATI cards on Windows (native drivers, not Mesa); we have no results for
Windows/Intel yet.
        nvidia                        intel
  [ 1.0  0.0  0.0 ]        [ 1.0   0.0  0.0            ]
  [ 0.0 -0.0  1.0 ]        [ 0.0   0.0  >0.0, <0.0001  ]
  [ 0.0 -1.0 -0.0 ]        [ 1.0  15.0  0.0            ]
I used "-0.0" for a couple of slots on the nvidia system; the value was
smaller than 0.0 but larger than -epsilon for some small epsilon, i.e.
the same situation as gl_NormalMatrix[1].z on intel but with the
opposite sign.
I spot-checked gl_NormalMatrix[2].y with LIBGL_ALWAYS_SOFTWARE=1. In
that case, Mesa agrees with the nvidia driver: the value is -1.0. My
application also produces imagery identical to the nvidia system's
(judged by eye; I haven't tried image-differencing the two) when I run
it with LIBGL_ALWAYS_SOFTWARE=1.
On the intel system:
GL_VENDOR: Tungsten Graphics, Inc
GL_RENDERER: Mesa DRI Intel(R) Sandybridge Desktop GEM 20100330 DEVELOPMENT
GL_VERSION: 2.1 Mesa 7.10.2