[Piglit] [PATCH] max-texture-size: test non-proxy targets with max size from proxy test

Ian Romanick idr at freedesktop.org
Wed Feb 19 00:46:49 CET 2014


On 02/16/2014 03:31 PM, Kenneth Graunke wrote:
> On 02/11/2014 07:59 AM, Brian Paul wrote:
>> On 02/10/2014 07:49 PM, Ian Romanick wrote:
>>> On 02/07/2014 03:34 PM, Brian Paul wrote:
>>>> Save the max texture size found with the proxy targets.  Then
>>>> use that max size when we test the regular/non-proxy targets
>>>> with glTexImage() and glTexSubImage().
>>>> 
>>>> The whole point of proxy textures is to be able to probe the
>>>> maximum texture size.  So let's use that size when we try the
>>>> real textures. That's what an application would typically
>>>> do.
> 
> Right now, the proxy cases are split out as separate subtests...but
> they don't really test much...just that you don't get a GL error.
> I suppose they could also check that the value obtained via proxy
> textures is <= the advertised maximum.  Not sure how valuable that
> would be.
> 
> But maybe we should drop the "test" aspect of the proxy texture
> code and just use it as a mechanism to figure out what size to try
> in the "real" tests...
> 
>>>> As it was, most of the GL_TEXTURE_3D tests were returning
>>>> 'skip' results because we couldn't allocate a 2048^3 or
>>>> 1024^3 texture. Now we should get pass/fail/crash when we try
>>>> creating an N^3 texture when OpenGL told us that N should
>>>> work.
>>> 
>>> Which hardware, if any, have you tried this on?  Any closed
>>> source drivers?
>> 
>> Yes, NVIDIA's driver.  The test behaves the same way before and
>> after this change.  NVIDIA's proxy texture tests always pass for
>> the max advertised texture size.  Ex: it happily says a 2048^3
>> RGBA32F 3D texture is doable.  But then when our call to calloc()
>> fails, the test just reports 'skip'.
> 
> I'm not clear whether we actually want to calloc data to pass to 
> TexImage.  It seems like passing NULL ought to be sufficient, since
> the driver should still allocate storage for it (so it could be
> populated via rendering)...but maybe that could be deferred.

That's a really good suggestion.  Doing a glTexImage2D with NULL
pixels followed by a glTexSubImage2D of a single pixel in the middle
of the texture should do the trick.
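
Something like this (an untested sketch, where 'size' is whatever
maximum the proxy probe reported):

   static const GLfloat texel[4] = { 0.0f, 1.0f, 0.0f, 1.0f };

   /* Allocate storage without a client-side buffer... */
   glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, size, size, 0,
                GL_RGBA, GL_FLOAT, NULL);
   if (glGetError() != GL_NO_ERROR) {
      /* report fail (or out of memory) */
   }

   /* ...then touch one texel in the middle so the driver actually has
    * to back the image with memory. */
   glTexSubImage2D(GL_TEXTURE_2D, 0, size / 2, size / 2, 1, 1,
                   GL_RGBA, GL_FLOAT, texel);
   if (glGetError() != GL_NO_ERROR) {
      /* report fail */
   }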

> It seems like calloc'ing a huge amount of data just increases the 
> likelihood of GL_OUT_OF_MEMORY errors....
> 
>> But I've also found that this test (on NVIDIA) is sensitive to
>> whatever else might be running.  In my first run I also had
>> several VMs running on my system (using a fair amount of RAM and
>> VRAM) and max-texture-size hung my system when it was testing a
>> 16384 RGBA32F cube map.
>> 
>> I'm curious what AMD's driver does.
>> 
>> -Brian
> 
> Whatever you want to do here is probably fine.  I was mostly
> working on it because I needed to raise our driver's limit to make
> an application work, and discovered there was no way I could pass
> the piglit test as written...even though there wasn't a bug in my
> driver.


