[Mesa-dev] [PATCH 5/7] mesa: add support for AMD_blend_minmax_factor
Roland Scheidegger
sroland at vmware.com
Fri Jan 3 08:51:21 PST 2014
On 03.01.2014 16:22, Roland Scheidegger wrote:
> On 03.01.2014 02:18, Maxence Le Doré wrote:
>> ---
>> src/mesa/main/blend.c | 3 +++
>> src/mesa/main/extensions.c | 1 +
>> src/mesa/main/mtypes.h | 1 +
>> 3 files changed, 5 insertions(+)
>>
>> diff --git a/src/mesa/main/blend.c b/src/mesa/main/blend.c
>> index 9e11ca7..4995143 100644
>> --- a/src/mesa/main/blend.c
>> +++ b/src/mesa/main/blend.c
>> @@ -326,6 +326,9 @@ legal_blend_equation(const struct gl_context *ctx, GLenum mode)
>>     case GL_MIN:
>>     case GL_MAX:
>>        return ctx->Extensions.EXT_blend_minmax;
>> +   case GL_FACTOR_MIN_AMD:
>> +   case GL_FACTOR_MAX_AMD:
>> +      return ctx->Extensions.AMD_blend_minmax_factor;
>>     default:
>>        return GL_FALSE;
>>     }
>> diff --git a/src/mesa/main/extensions.c b/src/mesa/main/extensions.c
>> index f0e1858..b46c788 100644
>> --- a/src/mesa/main/extensions.c
>> +++ b/src/mesa/main/extensions.c
>> @@ -299,6 +299,7 @@ static const struct extension extension_table[] = {
>>
>>     /* Vendor extensions */
>>     { "GL_3DFX_texture_compression_FXT1", o(TDFX_texture_compression_FXT1), GL, 1999 },
>> +   { "GL_AMD_blend_minmax_factor", o(AMD_blend_minmax_factor), GL, 2009 },
>>     { "GL_AMD_conservative_depth", o(ARB_conservative_depth), GL, 2009 },
>>     { "GL_AMD_draw_buffers_blend", o(ARB_draw_buffers_blend), GL, 2009 },
>>     { "GL_AMD_performance_monitor", o(AMD_performance_monitor), GL, 2007 },
>> diff --git a/src/mesa/main/mtypes.h b/src/mesa/main/mtypes.h
>> index f93bb56..4081e4e 100644
>> --- a/src/mesa/main/mtypes.h
>> +++ b/src/mesa/main/mtypes.h
>> @@ -3433,6 +3433,7 @@ struct gl_extensions
>>     GLboolean EXT_vertex_array_bgra;
>>     GLboolean OES_standard_derivatives;
>>     /* vendor extensions */
>> +   GLboolean AMD_blend_minmax_factor;
>>     GLboolean AMD_performance_monitor;
>>     GLboolean AMD_seamless_cubemap_per_texture;
>>     GLboolean AMD_vertex_shader_layer;
>>
>
> Where did you get the 2009 year from? The earliest I can find is 2010.
> Also, it would be nice if there were a test (piglit) for this.
> And could this be enabled for gallium drivers? Right now the state
> tracker translates away the blend factors for min/max, since the
> gallium interface could already handle this extension without any
> extra effort. That said, I'm not sure all drivers can handle it
> (nvidia in particular), since AFAIR d3d (9 and 10) also requires
> blend factors to be ignored, so it is indeed possible that not
> everyone can do it. In that case a cap bit would be required.
Oh, sorry, that last paragraph didn't really make sense as worded:
you did enable it for gallium drivers (I missed this due to an
aggressive spam filter...).
I also have one more minor complaint, about 6/7: the commit summary
is wrong (ARB_blend_minmax_factor).
Roland