[Mesa-dev] [PATCH] mesa: Fix GL_LUMINANCE handling for compressed textures in glGetTexImage
Anuj Phogat
anuj.phogat at gmail.com
Fri Nov 16 17:24:47 PST 2012
On Fri, Nov 16, 2012 at 4:15 PM, Brian Paul <brianp at vmware.com> wrote:
> On 11/16/2012 05:06 PM, Anuj Phogat wrote:
>>
>> On Fri, Nov 16, 2012 at 3:28 PM, Ian Romanick<idr at freedesktop.org> wrote:
>>>
>>> On 11/16/2012 01:21 PM, Anuj Phogat wrote:
>>>>
>>>>
>>>> We need to rebase colors (e.g. set G=B=0) when getting GL_LUMINANCE
>>>> textures in the following cases:
>>>> 1. If the luminance texture is actually stored as rgba
>>>> 2. If getting a luminance texture, but returning rgba
>>>> 3. If getting an rgba texture, but returning luminance
>>>>
>>>> A similar fix was pushed by Brian Paul for uncompressed textures
>>>> in commit: f5d0ced.
>>>>
>>>> Fixes intel oglconform pxconv-gettex test:
>>>> https://bugs.freedesktop.org/show_bug.cgi?id=47220
>>>>
>>>> Observed no failures in piglit and ogles2conform due to this fix.
>>>> This patch will cause failures in the intel oglconform pxstore-gettex
>>>> and pxtrans-gettex test cases. The cause of the failures is a bug in
>>>> the test cases: the expected luminance value is calculted incorrectly:
>>>
>>>
>>> calculated
>>>
>>>> L = R+G+B.
>>>
>>>
>>>
>>> This seems weird. We fail pxconv-gettex because we convert the luminance
>>> texture supplied by the application to RGB by replicating the L value to
>>> each of R, G, and B. In this case, and this case only, we need to read
>>> back luminance by just returning R. If the application actually specified
>>> an RGB texture, we need to read back luminance by R+G+B. Right?
>>>
>> If a texture is stored in an RGB/RGBA internal format and we later read it
>> back using glGetTexImage, there is no way in the driver to find out the
>> format of the initial texture data supplied by the application.
>
>
> Mmm, you should be able to look at gl_texture_image::InternalFormat to see
> what the user requested. And then check for luminance with
> _mesa_base_tex_format(texImage->InternalFormat)==GL_LUMINANCE.
>
Brian, this is what I was trying to say:

Case 1:
glTexImage2D(internalformat=GL_RGBA, format=GL_RGBA);
glGetTexImage(GL_LUMINANCE);

Case 2:
glTexImage2D(internalformat=GL_RGBA, format=GL_LUMINANCE);
glGetTexImage(GL_LUMINANCE);

In both cases glGetTexImage() will see the requested texture information as:
texImage->InternalFormat = GL_RGBA
texImage->_BaseFormat = GL_RGBA
texImage->TexFormat = MESA_FORMAT_RGBA8888

So I don't see a way to query the format of the texture data initially
uploaded by the application in glTexImage2D, i.e. GL_RGBA in case 1 and
GL_LUMINANCE in case 2.
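
To make the two sequences concrete, here is a minimal sketch of the actual GL
calls (a 2D RGBA8 texture is assumed; width, height, rgba_pixels, lum_pixels
and result are hypothetical locals, and error checking is omitted):

    /* Case 1: the application uploads RGBA data into an RGBA texture. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba_pixels);

    /* Case 2: the application uploads LUMINANCE data into the same kind of
     * RGBA texture; the L value gets replicated to R, G and B at upload time.
     */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, lum_pixels);

    /* Both textures are read back the same way, and by this point the driver
     * only sees InternalFormat/_BaseFormat = GL_RGBA in either case.
     */
    glGetTexImage(GL_TEXTURE_2D, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, result);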
I said this to explain why we always return L = Tex(R) when getting an RGBA
texture but returning LUMINANCE, irrespective of the texture data format
initially uploaded by glTexImage2D.

Ian, there is a piglit test case, test_rgba() in getteximage-luminance.c,
which tests the exact case you described for uncompressed textures. It
expects L = tex(R).
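
For reference, the way we get L = tex(R) is by rebasing the unpacked RGBA
colors before packing the result. A rough sketch of the same idea for the
compressed path (this is only an illustration of the approach, not the actual
patch; 'format' is the format argument the application passed to
glGetTexImage, and rgba/width/height are hypothetical locals holding the
decompressed float texels):

    /* Zero out G and B when luminance is involved on either side.  The
     * packing code computes L as R+G+B, so with G=B=0 the returned value
     * is L = Tex(R), matching what commit f5d0ced already does for
     * uncompressed textures.
     */
    if (texImage->_BaseFormat == GL_LUMINANCE || format == GL_LUMINANCE) {
       GLuint i;
       for (i = 0; i < width * height; i++) {
          rgba[i][GCOMP] = 0.0F;
          rgba[i][BCOMP] = 0.0F;
       }
    }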
>
>> For glGetTexImage we return L = Tex(R). This is what we do in Mesa for
>> uncompressed textures. It's different from glReadPixels, which returns
>> L = R+G+B.
>>
>>> Does this match what pxconv-gettex and pxconv-trans expect?
>>>
>>> This L vs. RGB crap confuses me every time...
>>>
>> Yeah, this is confusing. pxstore-gettex and pxtrans-gettex expect
>> L = R+G+B when getting an RGBA texture, but returning luminance.
>> Some time back I submitted a fix to change the expected values in
>> pxconv-gettex to L = tex(R). A similar fix is required for the two
>> failing test cases.
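
Just to make the disagreement concrete with made-up numbers (this is not the
real test code, only the arithmetic it encodes):

    GLubyte texel[4] = { 10, 20, 30, 255 }; /* stored RGBA texel             */
    GLubyte old_expected = 60;  /* tests today expect L = R+G+B = 10+20+30   */
    GLubyte new_expected = 10;  /* with the fix, L = tex(R) = R              */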
>
>
> -Brian