[Piglit] [PATCH 13/13] gles-3.0: Add minmax test

Chad Versace chad.versace at linux.intel.com
Wed Nov 28 14:09:46 PST 2012


On 11/28/2012 12:52 PM, Eric Anholt wrote:
> Chad Versace <chad.versace at linux.intel.com> writes:
> 
>> This is the first GLES3 test.
>>
>> Tested against Mesa gles3-861a99d with Intel Sandybridge. All testcases
>> pass except:
>>
>>   token                         min          actual
>>   GL_MAX_ELEMENT_INDEX          16777215     16384
> 
> That sounds like a fun test to add.

Yeah... I can't say why, but I'm not looking forward to fixing that one.



>> Currently, all.tests contains the GL tests, all_es1.tests contains the
>> GLES1 tests, and all_es2.tests contains the GLES2 tests. This is the first
>> GLES3 test, so I added the file all_es3.tests. We should really clean that
>> up and put all tests in all.tests.
> 
> Any reason not to, given that now if the binaries aren't built they'll
> just be skipped?

I had the same thoughts. I plan on doing it after this series lands.


>> +void
>> +piglit_init(int argc, char **argv)
>> +{
>> +	GLint v_blocks;
>> +	GLint v_uniforms;
>> +	GLint f_blocks;
>> +	GLint f_uniforms;
>> +	GLint64 blocksize;
>> +
>> +	piglit_print_minmax_header();
>> +
>> +	glGetIntegerv(GL_MAX_VERTEX_UNIFORM_BLOCKS, &v_blocks);
>> +	glGetIntegerv(GL_MAX_VERTEX_UNIFORM_COMPONENTS, &v_uniforms);
>> +	glGetIntegerv(GL_MAX_FRAGMENT_UNIFORM_BLOCKS, &f_blocks);
>> +	glGetIntegerv(GL_MAX_FRAGMENT_UNIFORM_COMPONENTS, &f_uniforms);
>> +	glGetInteger64v(GL_MAX_UNIFORM_BLOCK_SIZE, &blocksize);
>> +
>> +	/* Table 6.27 */
>> +	piglit_test_min_int64(GL_MAX_ELEMENT_INDEX, (1 << 24) - 1);
>> +	piglit_test_min_int(GL_SUBPIXEL_BITS, 4);
>> +	piglit_test_min_int(GL_MAX_3D_TEXTURE_SIZE, 256);
>> +	piglit_test_min_int(GL_MAX_TEXTURE_SIZE, 2048);
>> +	piglit_test_min_int(GL_MAX_ARRAY_TEXTURE_LAYERS, 256);
>> +	piglit_test_min_float(GL_MAX_TEXTURE_LOD_BIAS, 2.0);
>> +	piglit_test_min_int(GL_MAX_CUBE_MAP_TEXTURE_SIZE, 2048);
>> +	piglit_test_min_int(GL_MAX_RENDERBUFFER_SIZE, 2048);
>> +	piglit_test_min_int(GL_MAX_DRAW_BUFFERS, 4);
>> +	piglit_test_min_int(GL_MAX_COLOR_ATTACHMENTS, 4);
> 
> No testing for viewport dims here.  Can we just use the same code from
> desktop gl?

Ah, I somehow missed that. I'll add it.
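
Something along these lines, I think (untested sketch; it checks
against the piglit_width/piglit_height globals rather than whatever
the desktop minmax test does, so treat the details as an assumption):

	/* GL_MAX_VIEWPORT_DIMS must be at least as large as the
	 * surface we're rendering to.
	 */
	GLint dims[2] = {0, 0};
	glGetIntegerv(GL_MAX_VIEWPORT_DIMS, dims);
	if (dims[0] < piglit_width || dims[1] < piglit_height) {
		printf("GL_MAX_VIEWPORT_DIMS (%d x %d) is smaller than "
		       "the window (%d x %d)\n",
		       dims[0], dims[1], piglit_width, piglit_height);
		piglit_report_result(PIGLIT_FAIL);
	}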

>> +	piglit_test_range_float(GL_ALIASED_POINT_SIZE_RANGE, 1, 1);
>> +	piglit_test_range_float(GL_ALIASED_LINE_WIDTH_RANGE, 1, 1);
> 
>> +	/* Table 6.28 */
>> +	piglit_test_min_int(GL_NUM_COMPRESSED_TEXTURE_FORMATS, 10);
> 
> Might as well also check the NUM_PROGRAM_BINARY_FORMATS,
> NUM_SHADER_BINARY_FORMATS, and MAX_SERVER_WAIT_TIMEOUT here as well --
> even though the min is 0, it will make sure we allow getting the value.
> The minmax tests have caught corner cases like this in the past :)

Sure enough, GL_MAX_SERVER_WAIT_TIMEOUT comes back as -1 here, but both FORMATS queries are correctly 0.
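
The extra checks look roughly like this (a sketch reusing the helpers
the test already calls; using the int64 helper for the 64-bit timeout
query is my assumption):

	/* The spec minimum for all three is 0, but querying them at
	 * all has caught bugs before.
	 */
	piglit_test_min_int(GL_NUM_PROGRAM_BINARY_FORMATS, 0);
	piglit_test_min_int(GL_NUM_SHADER_BINARY_FORMATS, 0);
	piglit_test_min_int64(GL_MAX_SERVER_WAIT_TIMEOUT, 0);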


