[Piglit] [PATCH 1/3] msaa: Make changes in shared code to accommodate dual-src-blending test cases
Anuj Phogat
anuj.phogat at gmail.com
Wed Aug 15 18:44:46 PDT 2012
[snip]
>> +void
>> +compute_blend_color(float *frag_color, int rect_count,
>> + bool sample_alpha_to_one)
>> +{
>> + float src_blend_factor, dst_blend_factor;
>> + /* Taking into account the alpha values output by the
>> + * fragment shader.
>> + */
>> + float src0_alpha = color[rect_count * num_components + 3];
>> + float src1_alpha = 1.0 - src0_alpha / 2.0;
>> +
>> + if(sample_alpha_to_one && num_samples) {
>> + /* Set fragment src0_alpha to 1.0 and use it for
>> + * blending factors.
>> + */
>> + src0_alpha = 1.0;
>
>
> I believe you also need to set src1_alpha = 1.0 here. From the GL spec
> version 4.3, section 17.3.3 (Multisample Fragment Operations):
>
> "Next, if SAMPLE_ALPHA_TO_ONE is enabled, *each* alpha value is
> replaced by the maximum representable alpha value for fixed-point color
> buffers, or by 1.0 for floating-point buffers. Otherwise, the alpha values
> are not changed." (Emphasis mine)
>
> I interpret the word "each" as meaning that both alpha values should be
> set to 1.0 when doing dual source blending.
>
> This change makes the test "alpha-to-one0-dual-src-blend" (from patch 3/3)
> pass on my nVidia system.
>
Yeah, this makes sense. When sample-alpha-to-one is enabled we already set the
alpha value of all the draw buffers to 1.0 without dual-src-blend; we should
do the same with dual-src-blend. nVidia output also confirms your
interpretation. Now the test fails on i965. I'll post a fix for that.
[snip]
>> @@ -396,11 +497,15 @@ compute_expected_color(bool sample_alpha_to_coverage,
>> }
>> else {
>> expected_color[alpha_idx] =
>> - sample_alpha_to_one ? 1.0 : frag_alpha;
>> + (sample_alpha_to_one) ? 1 : frag_alpha;
>> }
>> +
>> + }
>> + if(is_frag_color_alloc) {
>> + free(frag_color);
>> + is_frag_color_alloc = false;
>> }
>
>
>
> The logic surrounding is_frag_color_alloc looks correct, but it's
> difficult to follow. I would recommend just allocating frag_color
> unconditionally at the top of the function, and then in the case where we're
> not doing dual source blending computations, replace "frag_color = color"
> with "memcpy(frag_color, color, ...)"
>
Ok. I'll fix it.
>>
>> void
>> @@ -427,7 +532,6 @@ compute_expected(bool sample_alpha_to_coverage,
>> if (num_samples &&
>> sample_alpha_to_coverage &&
>> !is_buffer_zero_integer_format) {
>> -
>> for (i = 0; i < num_rects; i++) {
>> /* Coverage value for all the draw buffers comes from
>> * the fragment alpha values of draw buffer zero
>> @@ -453,7 +557,6 @@ compute_expected(bool sample_alpha_to_coverage,
>> }
>> else if (buffer_to_test == GL_DEPTH_BUFFER_BIT)
>> compute_expected_depth();
>> -
>> }
>
>
> I don't mind the whitespace changes, but could you please put them in a
> separate patch?
>
ok.
>> /* This function probes all the draw buffers blitted to downsampled FBO
>> @@ -642,15 +745,36 @@ draw_test_image(bool sample_alpha_to_coverage, bool sample_alpha_to_one)
>> pattern_width, pattern_height + y_offset,
>> buffer_to_test, GL_NEAREST);
>>
>> - if(buffer_to_test == GL_COLOR_BUFFER_BIT)
>> - draw_image_to_window_system_fb(i /* draw_buffer_count */,
>> - false /* rhs */);
>> + if(buffer_to_test == GL_COLOR_BUFFER_BIT) {
>> + if(is_dual_src_blending) {
>> + /* Use blitting in place of visualize_image
>> + * function to test blending.
>> + */
>> + glBindFramebuffer(GL_READ_FRAMEBUFFER,
>> + resolve_fbo.handle);
>> + glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
>> + glBlitFramebuffer(0,
>> + y_offset,
>> + pattern_width,
>> + pattern_height + y_offset,
>> + 0,
>> + y_offset,
>> + pattern_width,
>> + pattern_height + y_offset,
>> + buffer_to_test,
>> + GL_NEAREST);
>> + }
>> + else
>> + /* i: draw_buffer_count */
>> + draw_image_to_window_system_fb(i,
>> + false /* rhs */);
>> + }
>
>
> Can you elaborate on why we need to use blitting here (and in the code
> below) in the case of dual source blending? It's not obvious to me from
> looking at the code.
>
I introduced the blitting code for my convenience while computing the expected
color, and later forgot to remove it. The tests run fine without the blitting
code. I'll fix it in my follow-up patch.
>>
>>
>> /* Expected color values for all the draw buffers are computed
>> * to aid probe_framebuffer_color() and probe_framebuffer_depth()
>> * in verification.
>> */
>> - if(sample_alpha_to_coverage) {
>> + if(sample_alpha_to_coverage || is_dual_src_blending) {
>> /* Expected color is different for different draw
>> * buffers
>> */
>> @@ -713,8 +837,30 @@ draw_reference_image(bool sample_alpha_to_coverage, bool sample_alpha_to_one)
>> pattern_width, pattern_height + y_offset,
>> buffer_to_test, GL_NEAREST);
>>
>> - draw_image_to_window_system_fb(i /* buffer_count */,
>> - true /* rhs */ );
>> + if(buffer_to_test == GL_COLOR_BUFFER_BIT) {
>> + if (is_dual_src_blending) {
>> + /* Use blitting in place of visualize_image
>> + * function to test blending.
>> + */
>> + glBindFramebuffer(GL_READ_FRAMEBUFFER,
>> + resolve_fbo.handle);
>> + glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
>> + glBlitFramebuffer(0,
>> + y_offset,
>> + pattern_width,
>> + pattern_height + y_offset,
>> + pattern_width,
>> + y_offset,
>> + 2 * pattern_width,
>> + pattern_height + y_offset,
>> + buffer_to_test,
>> + GL_NEAREST);
>> + }
>> + else
>> + /* i: draw_buffer_count */
>> + draw_image_to_window_system_fb(i,
>> + true /* rhs */);
>> + }
>> }
>> }
>>
Thanks
Anuj