[Piglit] Update some docstrings v2
Daniel Vetter
daniel at ffwll.ch
Wed Mar 19 10:08:11 PDT 2014
On Sun, Mar 16, 2014 at 01:55:02PM -0700, Dylan Baker wrote:
> On Sunday, March 16, 2014 20:30:15 Daniel Vetter wrote:
> > On Sat, Mar 15, 2014 at 07:39:45PM -0700, Dylan Baker wrote:
> > > On Saturday, March 15, 2014 08:41:15 AM Ilia Mirkin wrote:
> > > > > On Sat, Mar 15, 2014 at 8:29 AM, Daniel Vetter <daniel at ffwll.ch> wrote:
> > > > > On Fri, Mar 14, 2014 at 07:41:04PM -0700, Dylan Baker wrote:
> > > > >> [snip]
> > > > >>
> > > > >> > > I'll throw a patch at the end of the series, do you want me to send
> > > > >>
> > > > >> I'm gonna take it back, sorry. I don't know that dmesg-warn
> > > > >> should be worse than warn (same for fail), since pass ->
> > > > >> dmesg-warn, warn -> dmesg-fail, and fail -> dmesg-fail.
> > > > >> Personally I was never a fan of having special dmesg- statuses;
> > > > >> I feel that a fail is a fail and a warn is a warn, but I'm not
> > > > >> sure that change is correct.
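[For concreteness: the promotion rules described above could be sketched
in Python (the language piglit is written in) roughly as follows. The
table and function names are illustrative, not piglit's actual API.

    # Hedged sketch: how dmesg noise promotes a result status, per the
    # rules quoted above. Not piglit's real code; names are made up.
    DMESG_PROMOTIONS = {
        "pass": "dmesg-warn",
        "warn": "dmesg-fail",
        "fail": "dmesg-fail",
    }

    def promote_for_dmesg(status, saw_new_dmesg_output):
        """Return the status a result should get if dmesg was noisy."""
        if not saw_new_dmesg_output:
            return status
        return DMESG_PROMOTIONS.get(status, status)

    assert promote_for_dmesg("pass", True) == "dmesg-warn"
    assert promote_for_dmesg("fail", False) == "fail"
]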
> > > > >
> > > > > The current ordering seems wrong to me, e.g. if you have a failing
> > > > > test and fix up some dmesg noise you now have a regression.
> > > >
> > > > And if you add dmesg noise, you have a fix :) printk(), here I come!
> > > >
> > > > On a mildly related note, am I the only one who thinks it's weird
> > > > that transitions to/from (skip, notrun) are considered
> > > > fixes/regressions?
> > > >
> > > > -ilia
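[The inversion being joked about falls out of a naive old-vs-new
comparison, assuming a linear status ordering in which the dmesg-
variants rank better than their plain counterparts (a reconstruction of
the ordering under discussion, not necessarily piglit's exact code):

    # Hedged sketch: transition classification under an assumed status
    # ordering. Ordering and names are illustrative.
    STATUS_ORDER = ["pass", "dmesg-warn", "warn", "dmesg-fail", "fail", "crash"]
    RANK = {status: i for i, status in enumerate(STATUS_ORDER)}

    def classify(old, new):
        """Label the transition of one test between two runs."""
        if RANK[new] > RANK[old]:
            return "regression"
        if RANK[new] < RANK[old]:
            return "fix"
        return "unchanged"

    # The inversions discussed above:
    assert classify("dmesg-fail", "fail") == "regression"  # cleaned up dmesg noise
    assert classify("fail", "dmesg-fail") == "fix"         # added printk noise
]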
> > >
> > > I agree; that was changed by someone from my original implementation.
> > > But obviously it was changed, so at least one person feels the current
> > > behavior is correct.
> >
> > As mentioned, such transitions make sense for the kernel, where we never
> > break ABI or disable old features (well, at least until the last user/hw
> > has disappeared). Hence fail->skip is a regression (probably the kernel
> > broke a feature flag) and fail->notrun is a regression (probably the
> > testcase is broken and dropped a subtest somehow).
> >
> > fail->notrun has a bit of a downside when doing a massive testcase
> > renaming for better consistency, but thus far we've only done that kind
> > of large-scale renaming once in the last two years.
> > -Daniel
>
> I don't want to beat a dead horse, so here's the argument from those of us
> who don't like the current behavior, and then I'll let it be:
>
> The big problem is that if your workflow is to do a baseline run, and then
> do smaller focused runs of "Oh, I regressed tests A, B, and C; just run
> those till I fix them, then run the whole thing again", then using the
> regressions/fixes pages is impossible; they're just full of "Not Run" and
> "Skip" results. I know that this is a pretty common workflow for hardware
> bring-up.
>
> I'm working on some patches to add pages to the summary for
> (Skip||NotRun -> Any) transitions and the converse, and to remove them
> from fixes/regressions. I'm hoping to send them out tomorrow. They should
> provide useful data for those who really want to see notrun/skip changes,
> while still letting those who run subsets of the test suite get useful
> information from the fixes/regressions pages.
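[A rough sketch of what that split could look like, with hypothetical
page names ("enabled" and "disabled" are placeholders, not necessarily
what the patches will use) and a simplified status ordering:

    # Hedged sketch: route skip/notrun transitions to their own pages so
    # they stop polluting fixes/regressions. Names are illustrative.
    EXCLUDED = {"skip", "notrun"}
    ORDER = ["pass", "dmesg-warn", "warn", "dmesg-fail", "fail", "crash"]
    RANK = {s: i for i, s in enumerate(ORDER)}

    def bucket_transitions(transitions):
        """Split {test: (old_status, new_status)} into summary pages."""
        pages = {"regressions": [], "fixes": [], "enabled": [], "disabled": []}
        for test, (old, new) in transitions.items():
            if old in EXCLUDED and new in EXCLUDED:
                continue  # skip <-> notrun churn isn't interesting anywhere
            if old in EXCLUDED:
                pages["enabled"].append(test)    # skip/notrun -> any
            elif new in EXCLUDED:
                pages["disabled"].append(test)   # any -> skip/notrun
            elif RANK[new] > RANK[old]:
                pages["regressions"].append(test)
            elif RANK[new] < RANK[old]:
                pages["fixes"].append(test)
        return pages

    # With this split, a focused re-run's skip/notrun noise no longer
    # drowns out the one real regression:
    pages = bucket_transitions({
        "tex-a": ("pass", "fail"),    # real regression
        "tex-b": ("notrun", "pass"),  # picked up again by the full run
        "tex-c": ("pass", "notrun"),  # not in the focused subset
    })
    assert pages["regressions"] == ["tex-a"]
]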
I think having separate skip/notrun changes pages is useful, and a
suitable compromise for the skips that get shuffled around all the time.
So count me in on this.
-Daniel
--
Daniel Vetter
Software Engineer, Intel Corporation
+41 (0) 79 365 57 48 - http://blog.ffwll.ch