[Mesa-dev] [PATCH] mesa: add GL_UNSIGNED_INT_24_8 to _mesa_pack_depth_span

Tapani Pälli tapani.palli at intel.com
Thu Oct 1 09:50:26 PDT 2015


On 10/01/2015 03:17 PM, Iago Toral wrote:
> On Thu, 2015-10-01 at 08:28 +0300, Tapani Pälli wrote:
>> Patch adds the missing type (used with NV_read_depth) so that it gets
>> handled correctly. Also add the type to the _mesa_problem output to aid
>> debugging.
>>
>> Signed-off-by: Tapani Pälli <tapani.palli at intel.com>
>> ---
>>   src/mesa/main/pack.c | 4 +++-
>>   1 file changed, 3 insertions(+), 1 deletion(-)
>>
>> diff --git a/src/mesa/main/pack.c b/src/mesa/main/pack.c
>> index 7147fd6..54a0c42 100644
>> --- a/src/mesa/main/pack.c
>> +++ b/src/mesa/main/pack.c
>> @@ -1074,6 +1074,7 @@ _mesa_pack_depth_span( struct gl_context *ctx, GLuint n, GLvoid *dest,
>>            }
>>         }
>>         break;
>> +   case GL_UNSIGNED_INT_24_8:
> Is it okay to store 32-bit integers in this case? That's what the code
> below does. The spec says that the 8 stencil bits are undefined, but
> don't we need to convert the depth value to a 24-bit integer scale?
> (i.e. make 1.0 translate to 2^24-1, not 2^32-1).

Oh, I thought stencil was included; I'll check whether there's an existing
app/test that would actually use this.
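For reference, a rough sketch of the conversion described above (an assumption
about what the fix would look like, not the committed code): scale the clamped
depth value so that 1.0 maps to 2^24-1 and place it in the upper 24 bits,
leaving the low 8 stencil bits zero since the spec leaves them undefined.

#include <stdint.h>

/* Sketch only: pack a clamped [0.0, 1.0] depth value into the
 * UNSIGNED_INT_24_8 layout, depth in the upper 24 bits, the low 8
 * (stencil) bits left as zero because they are undefined here. */
static void
pack_depth_24_8(const float *depth, uint32_t *dst, uint32_t n)
{
   for (uint32_t i = 0; i < n; i++) {
      uint32_t z24 = (uint32_t) (depth[i] * 16777215.0f + 0.5f); /* 2^24 - 1 */
      dst[i] = z24 << 8;
   }
}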

> Iago
>
>>      case GL_UNSIGNED_INT:
>>         {
>>            GLuint *dst = (GLuint *) dest;
>> @@ -1124,7 +1125,8 @@ _mesa_pack_depth_span( struct gl_context *ctx, GLuint n, GLvoid *dest,
>>         }
>>         break;
>>      default:
>> -      _mesa_problem(ctx, "bad type in _mesa_pack_depth_span");
>> +      _mesa_problem(ctx, "bad type in _mesa_pack_depth_span (%s)",
>> +                    _mesa_enum_to_string(dstType));
>>      }
>>   
>>      free(depthCopy);
>
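For context on the "existing app/test" question above, the path that reaches
this code would be a GLES application using NV_read_depth to read back the
depth buffer, roughly along these lines (a hypothetical sketch assuming a
GLES3 context with a packed depth/stencil buffer; names and sizes are
placeholders):

#include <GLES3/gl3.h>

/* Hypothetical sketch: with GL_NV_read_depth exposed, reading back the
 * depth buffer with format GL_DEPTH_COMPONENT and type GL_UNSIGNED_INT_24_8
 * is what lands in _mesa_pack_depth_span() with
 * dstType == GL_UNSIGNED_INT_24_8. */
static void
read_depth_24_8(GLsizei width, GLsizei height, GLuint *pixels)
{
   glReadPixels(0, 0, width, height,
                GL_DEPTH_COMPONENT, GL_UNSIGNED_INT_24_8, pixels);
}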


