[Piglit] [PATCH 3/4] arb_texture_view: Pass correct layer index to shader
Chris Forbes
chrisf at ijw.co.nz
Tue Jan 14 11:42:38 PST 2014
I decided to go back to the spec and verify some of my assumptions --
it turns out I was wrong about out-of-range layer indexes giving
undefined results; I had confused the rules for sampling array textures
with the rules for layered framebuffer attachments.
From the GL4 core profile spec, 3.8 Texturing, Coordinate Wrapping and
Texel Selection, on p212 [228 of the PDF]:
For one- and two-dimensional array textures, the texel is obtained
from image layer l, where
    l = clamp(floor(t + 0.5), 0, h_t - 1), for one-dimensional array textures, or
    l = clamp(floor(r + 0.5), 0, d_t - 1), for two-dimensional array textures.
So your test should work, but it relies on this clamping.
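
For reference, here is a minimal C sketch of that rule (the function name
and parameters are mine, for illustration only -- they come from neither
the spec nor the test):

    #include <math.h> /* floorf */

    /* Layer selection for a two-dimensional array texture, following the
     * rule quoted above: the layer comes from the r coordinate, rounded
     * and then clamped to [0, d_t - 1], so an out-of-range index selects
     * the last layer rather than producing undefined results.
     */
    static int
    select_array_layer(float r, int num_layers)
    {
            int layer = (int) floorf(r + 0.5f);

            if (layer < 0)
                    layer = 0;
            if (layer > num_layers - 1)
                    layer = num_layers - 1;
            return layer;
    }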
I'll drop this patch, but also add some additional cases which try to
sample from a layer within the view, rather than always exercising
clamping.
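
As a rough sketch of what such a case could look like in
rendering_levels.c (assuming, per the discussion below, that the view's
MinLayer is l, so view layer 0 maps to underlying layer l -- this is not
the final patch):

    /* Sample the first layer of the view instead of relying on clamping
     * at the top end: through the view this is underlying layer l, so
     * the expected color comes from Colors[l].
     */
    expectedLayer = l;                       /* for the Colors[] check */
    draw_3d_depth(-1.0, -1.0, 2.0, 2.0, 0);  /* layer 0 within the view */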
-- Chris
On Wed, Jan 15, 2014 at 8:06 AM, Chris Forbes <chrisf at ijw.co.nz> wrote:
> Could do that, but one side or the other *must* account for the view
> having a nonzero MinLayer.
>
> Colors[] is indexed by the layer index relative to the underlying
> texture, and expectedLayer is currently relative to the underlying
> texture, but the value used in the shader needs to be relative to the
> view [i.e. expectedLayer - the view's MinLayer].
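
To make that relationship concrete, a rough sketch in terms of the test's
variables (layerInView is a name made up here for illustration; in this
test the view's MinLayer appears to be l):

    /* Layer index in the underlying texture -- this is what Colors[]
     * is indexed by.
     */
    expectedLayer = l + numLayers[l] - 1;

    /* Layer index as seen through the view, i.e. what the shader's layer
     * coordinate should be: subtract the view's MinLayer (l here).
     */
    layerInView = expectedLayer - l;    /* == numLayers[l] - 1 */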
>
> Did this actually work on nvidia on master?
>
>
> On Wed, Jan 15, 2014 at 5:36 AM, Jon Ashburn <jon at lunarg.com> wrote:
>> see inline
>>
>> On 01/11/2014 05:39 PM, Chris Forbes wrote:
>>>
>>> We want to sample from the last layer in the view, but we were actually
>>> trying to sample from a layer beyond that [by the texture's minimum
>>> layer].
>>>
>>> This might have worked on hardware where the sampler clamps the layer
>>> index before sampling [NV?], but that relies on undefined behavior.
>>>
>>> Signed-off-by: Chris Forbes <chrisf at ijw.co.nz>
>>> ---
>>> tests/spec/arb_texture_view/rendering_levels.c | 2 +-
>>> 1 file changed, 1 insertion(+), 1 deletion(-)
>>>
>>> diff --git a/tests/spec/arb_texture_view/rendering_levels.c b/tests/spec/arb_texture_view/rendering_levels.c
>>> index ef29f78..86bf055 100644
>>> --- a/tests/spec/arb_texture_view/rendering_levels.c
>>> +++ b/tests/spec/arb_texture_view/rendering_levels.c
>>> @@ -205,7 +205,7 @@ test_render_layers(void)
>>> glClear(GL_COLOR_BUFFER_BIT);
>>> expectedLayer = l + numLayers[l] - 1;
>>> - draw_3d_depth(-1.0, -1.0, 2.0, 2.0, expectedLayer);
>>> + draw_3d_depth(-1.0, -1.0, 2.0, 2.0, numLayers[l] - 1);
>>>
>>
>> With this change, the draw depth is inconsistent with the indexing
>> into the Colors array. How about this instead:
>> expectedLayer = numLayers[l] - 1;
>>
>>
>>> expected[0] = Colors[expectedLayer][0] / 255.0;
>>> expected[1] = Colors[expectedLayer][1] / 255.0;
>>
>>