[Piglit] [PATCH 00/40] Replace vpfp-generic with shader_runner.

Kenneth Graunke kenneth at whitecape.org
Sun Jun 7 23:01:56 PDT 2015


On Monday, June 08, 2015 03:42:36 PM Dave Airlie wrote:
> On 8 June 2015 at 15:31, Kenneth Graunke <kenneth at whitecape.org> wrote:
> > On Monday, June 08, 2015 06:29:26 AM Dave Airlie wrote:
> >> On 7 June 2015 at 12:11, Kenneth Graunke <kenneth at whitecape.org> wrote:
> >> > Hi all,
> >> >
> >> > This patch series ports all vpfp-generic tests to shader_runner,
> >> > and then deletes vpfp-generic.
> >> >
> >> > A bit of history:
> >> > - vpfp-generic was introduced by Nicolai Hähnle in 2009, as a generic
> >> >   ARB_vertex|fragment_program test runner.
> >> > - shader_runner was introduced by Ian Romanick in 2010, as a generic
> >> >   GLSL shader runner.
> >> > - shader_runner gained ARB program support in 2011 (courtesy of Eric Anholt).
> >> >
> >> > At this point, vpfp-generic is fairly redundant - shader_runner can do
> >> > everything we need, and is much more widespread (12000+ tests).  I've
> >> > been meaning to delete it for a few years, but never got around to it.
> >> >
> >> > One difference is that the new tests don't glClear() before drawing.  Since
> >> > they draw the entire window, it's pretty unnecessary, and just makes the
> >> > tests harder to debug.  Many shader_runner tests don't bother clearing.
> >>
> >> This is actually an annoying feature, especially if all tests use the
> >> same color for success: we render one test and it passes, then we
> >> render another test that doesn't draw anything at all, but it has
> >> inherited the back buffer from the previous test, so it magically
> >> passes.
> >>
> >> This happens a lot more often on GPUs with VRAM.
> >>
> >> Dave.
> >
> > I don't know...the tests probe the entire window...so the only failure
> > mode that will bite you like that is "the driver didn't render anything
> > at all."  And the assumption is that, even with such a broken driver,
> > clearing will actually succeed at drawing...
> 
> Yes, but what happens when all the tests run without clearing?
> Test 1 draws and passes, then drawing fails in test 2; because
> test 2 doesn't clear, it inherits test 1's result frame, where it
> passed.  It looks like test 2 passes when it clearly hasn't.
> 
> You've actually said it yourself: clearing would succeed, but the
> problem is that the tests don't clear.
> 
> And yes, there are many reasons things don't render; the main
> one I see is where an earlier test has locked up the GPU, but
> not totally.
> 
> Dave.

So your thinking is that Test 2 exercises some functionality (e.g.
complex geometry shaders) which hangs the GPU, but the clear is simple,
is done first, and so is likely to succeed?  I suppose that makes
some sense.

Still, if your GPU is locking up, then it seems like tracking that down
(and either fixing or blacklisting those tests) ought to be your top
priority.  Finding the tests that hang is pretty straightforward:

$ piglit run --dmesg -1 -v quick results/hangs

I always assume Piglit results are invalid (or dubious at best) if a run
has hung the GPU.

We should probably check dmesg at least once, even on a normal run, and
log a message indicating that there was a GPU hang.  It wouldn't tell
you which test hung, but you'd know your results were suspect and that
you should re-run with --dmesg to pinpoint it...

At any rate, are you asking me to rework my patches to preserve the
existing color clear?  There are a lot of Piglit tests done both ways.
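If so, that's easy enough; it would just mean keeping a couple of lines
like these at the top of each [test] section (sketched from memory; the
colors are placeholders, not taken from any particular test):

[test]
clear color 0.5 0.5 0.5 0.5
clear
draw rect -1 -1 2 2
probe all rgba 0.0 1.0 0.0 1.0

Clearing to something that isn't any test's pass color would also
address the stale-back-buffer concern you raised.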

--Ken