[Mesa-dev] [PATCH 11/18] i965: Check for alpha channel just like in gen6+.
Rafael Antognolli
rafael.antognolli at intel.com
Mon Jun 19 16:09:33 UTC 2017
On Sat, Jun 17, 2017 at 10:38:26AM -0700, Kenneth Graunke wrote:
> On Friday, June 16, 2017 4:31:24 PM PDT Rafael Antognolli wrote:
> > gen6+ uses _mesa_base_format_has_channel() to check for the alpha
> > channel, while gen4-5 use ctx->DrawBuffer->Visual.alphaBits. By using
> > _mesa_base_format_has_channel() here we keep the same behavior across
> > all gens.
> >
> > While initially both ways of checking the alpha channel seemed correct
> > to me, this change also seems to fix the fbo-blending-formats piglit test on
> > gen4.
> >
> > Signed-off-by: Rafael Antognolli <rafael.antognolli at intel.com>
> > ---
> > src/mesa/drivers/dri/i965/brw_cc.c | 21 ++++++++++++---------
> > 1 file changed, 12 insertions(+), 9 deletions(-)
> >
> > diff --git a/src/mesa/drivers/dri/i965/brw_cc.c b/src/mesa/drivers/dri/i965/brw_cc.c
> > index 78d3bc8..cdaa696 100644
> > --- a/src/mesa/drivers/dri/i965/brw_cc.c
> > +++ b/src/mesa/drivers/dri/i965/brw_cc.c
> > @@ -34,6 +34,7 @@
> >  #include "brw_state.h"
> >  #include "brw_defines.h"
> >  #include "brw_util.h"
> > +#include "main/glformats.h"
> >  #include "main/macros.h"
> >  #include "main/stencil.h"
> >  #include "intel_batchbuffer.h"
> > @@ -122,25 +123,27 @@ static void upload_cc_unit(struct brw_context *brw)
> >        GLenum srcA = ctx->Color.Blend[0].SrcA;
> >        GLenum dstA = ctx->Color.Blend[0].DstA;
> >
> > +      if (eqRGB == GL_MIN || eqRGB == GL_MAX) {
> > +         srcRGB = dstRGB = GL_ONE;
> > +      }
> > +
> > +      if (eqA == GL_MIN || eqA == GL_MAX) {
> > +         srcA = dstA = GL_ONE;
> > +      }
> > +
> >        /* If the renderbuffer is XRGB, we have to frob the blend function to
> >         * force the destination alpha to 1.0. This means replacing GL_DST_ALPHA
> >         * with GL_ONE and GL_ONE_MINUS_DST_ALPHA with GL_ZERO.
> >         */
> > -      if (ctx->DrawBuffer->Visual.alphaBits == 0) {
> > +      const struct gl_renderbuffer *rb = ctx->DrawBuffer->_ColorDrawBuffers[0];
> > +      if (rb && !_mesa_base_format_has_channel(rb->_BaseFormat,
> > +                                               GL_TEXTURE_ALPHA_TYPE)) {
>
> Mesa core in framebuffer.c does:
>
> fb->Visual.alphaBits = _mesa_get_format_bits(fmt, GL_ALPHA_BITS);
>
> which uses the actual Mesa format we chose for the buffer. In contrast,
> _mesa_base_format_has_channel() looks at the GL format enums - the format
> they actually requested. In other words, they might have asked for GL_RGB,
> but we promoted them to MESA_FORMAT_RGBA*.
>
> I think this is a good change.
>
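To make the distinction concrete, here is a rough sketch of the two checks
side by side (illustration only, not the actual driver code; the variables
and helpers are the ones appearing in the hunk above):

   /* gen4-5 previously keyed off the visual, which reflects the mesa format
    * actually allocated for the buffer (possibly RGBA even though the app
    * asked for GL_RGB):
    */
   bool had_alpha = ctx->DrawBuffer->Visual.alphaBits > 0;

   /* The new check, matching gen6+, keys off the GL base format requested
    * for the bound color buffer:
    */
   const struct gl_renderbuffer *rb = ctx->DrawBuffer->_ColorDrawBuffers[0];
   bool has_alpha =
      rb && _mesa_base_format_has_channel(rb->_BaseFormat,
                                          GL_TEXTURE_ALPHA_TYPE);
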
> >           srcRGB = brw_fix_xRGB_alpha(srcRGB);
> >           srcA = brw_fix_xRGB_alpha(srcA);
> >           dstRGB = brw_fix_xRGB_alpha(dstRGB);
> >           dstA = brw_fix_xRGB_alpha(dstA);
> >        }
> >
> > -      if (eqRGB == GL_MIN || eqRGB == GL_MAX) {
> > -         srcRGB = dstRGB = GL_ONE;
> > -      }
> > -
> > -      if (eqA == GL_MIN || eqA == GL_MAX) {
> > -         srcA = dstA = GL_ONE;
> > -      }
> > -
>
> Why are these moved? It seems harmless, but unrelated to what the
> patch claims to do. Is it just for consistency across the code?
> If the two changes were separate patches, they would get my:
>
> Reviewed-by: Kenneth Graunke <kenneth at whitecape.org>

Yeah, just for consistency. OK, I'll split them into separate patches.

> >        cc->cc6.dest_blend_factor = brw_translate_blend_factor(dstRGB);
> >        cc->cc6.src_blend_factor = brw_translate_blend_factor(srcRGB);
> >        cc->cc6.blend_function = brw_translate_blend_equation(eqRGB);
> >
>
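
For reference, the "frobbing" described in the comment above boils down to
remapping the destination-alpha blend factors, roughly like this (a sketch
based on that comment, not necessarily the exact body of brw_fix_xRGB_alpha):

   static GLenum
   fix_xrgb_alpha_sketch(GLenum factor)
   {
      /* With no destination alpha bits, dst alpha reads back as 1.0, so
       * factors involving it can be folded to constants.
       */
      switch (factor) {
      case GL_DST_ALPHA:
         return GL_ONE;
      case GL_ONE_MINUS_DST_ALPHA:
         return GL_ZERO;
      default:
         return factor;
      }
   }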