[Piglit] [Mesa-dev] [PATCH] piglit/general/texunits: fix to check the correct extension and be less pedantic

Ian Romanick idr at freedesktop.org
Thu Apr 15 11:22:13 PDT 2010


Note: I'm moving this discussion to the piglit list.

Luca Barbieri wrote:
> First of all, according to http://www.opengl.org/registry/api/enumext.spec,
> the correct extension to check is ARB_vertex_shader.
> 
> Second, if that is not exposed, the OpenGL implementation can't tell us
> exactly what it supports, and thus some things that are allowed to fail
> could succeed instead.
> 
> I don't think we should hold that against them: extra functionality is good.
> 
> If the extension is exposed, we still check that errors are raised.
> 
> Even this is a bit dubious: sometimes extra features may be supported
> only partially, and thus not exposed, but still available in the
> right conditions.

NAK.  A test should either run or not run.  What you have here is two
tests.  One of the tests will run on some drivers, and the other test
will run on other drivers.  That's not good test design.

We already have a lot of tests like that, and they're frustrating beyond
all belief.

My suggestion is to get rid of the 'if (Pedantic)' business, and just
fix the extension tests.  See my comments below.

> ---
>  tests/general/texunits.c |   15 ++++++++++++---
>  1 files changed, 12 insertions(+), 3 deletions(-)
> 
> diff --git a/tests/general/texunits.c b/tests/general/texunits.c
> index 01a5e27..47c2540 100644
> --- a/tests/general/texunits.c
> +++ b/tests/general/texunits.c
> @@ -36,7 +36,7 @@
>  
>  static int Width = 128, Height = 128;
>  static int Automatic = 0;
> -
> +static GLboolean Pedantic = GL_FALSE;
>  
>  /** random number for checking state */
>  static GLfloat Random[128][4];
> @@ -145,6 +145,7 @@ test_rasterpos(void)
>     }
>  
>     /* this should generate an error */
> +   if(Pedantic)
>     {
>        GLfloat v[4];
>        glActiveTexture(GL_TEXTURE0 + MaxTextureCoordUnits);
> @@ -203,6 +204,7 @@ test_texture_matrix(void)
>     }
>  
>     /* this should generate an error */
> +   if(Pedantic)
>     {
>        GLfloat m[16];
>        glActiveTexture(GL_TEXTURE0 + MaxTextureCoordUnits);
> @@ -273,7 +275,7 @@ test_texture_params(void)
>     glActiveTexture(GL_TEXTURE0 + maxUnit);
>     /* INVALID_ENUM is expected */
>     err = glGetError();
> -   if (err != GL_INVALID_ENUM) {
> +   if (Pedantic && err != GL_INVALID_ENUM) {
>        printf("GL failed to raise GL_INVALID_ENUM setting texture unit\n");
>        return GL_FALSE;
>     }
> @@ -316,6 +318,7 @@ test_texture_env(void)
>     }
>  
>     /* this should generate an error */
> +   if(Pedantic)
>     {
>        glActiveTexture(GL_TEXTURE0 + MaxTextureImageUnits);
>        if (MaxTextureImageUnits == MaxTextureCombinedUnits) {
> @@ -402,13 +405,19 @@ key(unsigned char key, int x, int y)
>  static void
>  init(void)
>  {
> -   if (glutExtensionSupported("GL_ARB_fragment_program")) {
> +   if (glutExtensionSupported("GL_ARB_vertex_shader")) {

Neither version of the code is correct.  :(  I suspect the correct thing
is to check for either extension, since both add this query.  From the
GL_ARB_fragment_program spec:

    Accepted by the <pname> parameter of GetBooleanv, GetIntegerv,
    GetFloatv, and GetDoublev:

        ...

        MAX_TEXTURE_COORDS_ARB                          0x8871
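
So something along these lines for the MAX_TEXTURE_COORDS query (just a
sketch of what I mean; it keeps the glutExtensionSupported() check the
test already uses):

   if (glutExtensionSupported("GL_ARB_fragment_program") ||
       glutExtensionSupported("GL_ARB_vertex_shader")) {
      /* Both extensions add the MAX_TEXTURE_COORDS query. */
      glGetIntegerv(GL_MAX_TEXTURE_COORDS, &MaxTextureCoordUnits);
   }
   else {
      /* Fixed-function only: fall back to the classic limit. */
      glGetIntegerv(GL_MAX_TEXTURE_UNITS, &MaxTextureCoordUnits);
   }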

> +      /* a bit dubious: OpenGL may want to give us freebies that can't always
> +       * be supported */
> +      Pedantic = GL_TRUE;
>        glGetIntegerv(GL_MAX_TEXTURE_COORDS, &MaxTextureCoordUnits);
>        glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &MaxTextureImageUnits);
>        glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, &MaxTextureVertexUnits);
>        glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &MaxTextureCombinedUnits);

These queries, however, should be protected by a check for
GL_ARB_vertex_shader.
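
That is, something like this (again only a sketch; the fallback values in
the else branch are my guess at what the rest of the test expects, not
something the specs dictate):

   if (glutExtensionSupported("GL_ARB_vertex_shader")) {
      /* These limits are only defined by GL_ARB_vertex_shader. */
      glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, &MaxTextureVertexUnits);
      glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &MaxTextureCombinedUnits);
   }
   else {
      /* Without the extension, assume no vertex texturing and let the
       * combined limit collapse to the image-unit limit. */
      MaxTextureVertexUnits = 0;
      MaxTextureCombinedUnits = MaxTextureImageUnits;
   }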


>     }
>     else {
> +      /* in this case, the OpenGL implementation may support higher values for
> +       * some of these, and allow them: don't hold it against them... */
> +      Pedantic = GL_FALSE;
>        glGetIntegerv(GL_MAX_TEXTURE_UNITS, &MaxTextureCoordUnits);
>        MaxTextureImageUnits =
>        MaxTextureVertexUnits =

On which drivers do you see this?  I don't think it's correct.  If a
completely non-programmable GPU advertises N texture units, the spec is
pretty clear that using units beyond N-1 is an error.  I'm not aware of
any ambiguity in the spec language here.  Or does this only occur when
GL_ARB_vertex_shader is not supported but GL_ARB_fragment_program is?