[Mesa-dev] [PATCH v3] i965: Fix ETC2/EAC GetCompressed* functions on Gen7 GPUs

Alejandro Piñeiro apinheiro at igalia.com
Tue May 8 06:48:28 UTC 2018



On 07/05/18 14:45, Eero Tamminen wrote:
> Hi,
>
> On 04.05.2018 22:06, Eleni Maria Stea wrote:
>> On Fri, 4 May 2018 18:29:55 +0300
>> Eero Tamminen <eero.t.tamminen at intel.com> wrote:
>>
>>> You mean returning CAVEAT_SUPPORT in params for compressed formats
>>> which are transparently converted to uncompressed data?
>>
>> Well, that would be the best solution I think, if it's possible to
>> modify an existing query in the extension, although I am not certain
>> which is the best query to modify: TEXTURE_COMPRESSED, or
>> INTERNALFORMAT_SUPPORTED (or maybe both?).
>>
>> There's also another solution that we already have, but we are not sure
>> if it is correct:
>>
>> I noticed that both mesa and nvidia drivers return GL_FALSE when the
>> pname is GL_TEXTURE_COMPRESSED and the format is emulated and GL_TRUE
>> for the natively supported formats.
>
> So, in the Nvidia case only the ancient GLES 1.x GL_PALETTE_* formats
> are emulated.  I guess Nvidia HW is newer than HSW. :-)
>
>
>> (Specifically on mesa the code that
>> performs the check is in src/mesa/main/formatquery.c and tests only
>> for native support).
>>
>> But if you take a look at this part of the extension specification:
>>
>> TEXTURE_COMPRESSED: If <internalformat> is a compressed format
>>        that is supported for this type of resource, TRUE is returned in
>>        <params>. If the internal format is not compressed, or the type
>>        of resource is not supported, FALSE is returned.
>>
>> it is not very clear if we should return true or false for an
>> emulated format. Maybe returning false when we provide emulation is a
>> bug in both drivers, just a convenient one in this case. :-)
>>
>> Is there any way to clarify what should be the correct behavior?
>
> Khronos documentation bug tracker?

Sorry for not mentioning it here: I already opened a ticket on the Khronos
issue tracker to try to clarify this. Ideally (at least imho), we would
like TEXTURE_COMPRESSED to be able to return CAVEAT_SUPPORT, instead of
just TRUE/FALSE as it is defined right now.
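
To make that concrete, this is roughly the application-side handling I
have in mind if the query were extended. It is just a sketch of the
proposal, not current behaviour: today this pname is specified to return
only TRUE/FALSE, and GL_CAVEAT_SUPPORT would be a new possible value for
it:

    GLint support = GL_FALSE;
    glGetInternalformativ(GL_TEXTURE_2D, GL_COMPRESSED_RGB8_ETC2,
                          GL_TEXTURE_COMPRESSED, 1, &support);

    if (support == GL_CAVEAT_SUPPORT) {
       /* Proposed/hypothetical value for this pname: the format is
        * accepted, but stored decompressed, so the application may want
        * to upload an uncompressed (or differently compressed) asset. */
    } else if (support == GL_TRUE) {
       /* Natively supported: keep uploading ETC2/EAC data as-is. */
    } else {
       /* GL_FALSE: not compressed for this target, or not supported. */
    }

All the tokens above already exist in ARB_internalformat_query2; the only
new thing would be allowing TEXTURE_COMPRESSED to report the caveat value.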

>
> If you can also test e.g. the AMD driver, that would be good background
> data for the clarification ticket.
>
>
>> Do you think that, even if the current behavior of the
>> TEXTURE_COMPRESSED query is correct and it should keep returning
>> GL_FALSE for the emulated formats, we should nevertheless modify
>> something else, e.g. the INTERNALFORMAT_SUPPORTED query, to return
>> CAVEAT_SUPPORT?
>
> Uh, I'm not the best person to answer that.  Maybe somebody
> on Portland team who has contacts to game engine developers?
>
>
>>> That API's not available for GLES v2, where I think ETC is most widely
>>> used, so it would be more of a solution for GLES v3.x applications
>>> only. Sounds OK to me.
>>>
>>> Hardest part will be propagating use of this query to engines &
>>> toolkits that would benefit from using it. :-)
>>
>> +1 on that :)
>>
>> Thanks a lot for the suggestions and the feedback,
>> Eleni
>>
>> PS: here is some code to clarify the current situation:
>>
>> [1]: https://github.com/hikiko/test-compression is a test program to
>> quickly check the compressed formats supported (see
>> the function print_compressed_formats at the end of main.c)
>>
>> [2]: https://pastebin.com/Qif74fFn is the output of [1] on HSW (with
>> the ETC patch) and on nvidia, where you can see that the natively
>> supported compression formats return GL_TRUE on both cards, whereas
>> the emulated ones return GL_FALSE on both.
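
The kind of check being done there is basically the following. This is
only a sketch, not the actual print_compressed_formats() code from [1],
and the GLEW include is just an assumption about which loader is used:

    #include <stdio.h>
    #include <GL/glew.h>   /* assumption: any loader exposing query2 works */

    static void print_compressed_support(void)
    {
       static const GLenum formats[] = {
          GL_COMPRESSED_RGB8_ETC2,         /* emulated on HSW with the patch */
          GL_COMPRESSED_RGBA8_ETC2_EAC,    /* emulated on HSW with the patch */
          GL_COMPRESSED_RGB_S3TC_DXT1_EXT, /* natively supported */
       };

       for (unsigned i = 0; i < sizeof(formats) / sizeof(formats[0]); i++) {
          GLint compressed = GL_FALSE;
          glGetInternalformativ(GL_TEXTURE_2D, formats[i],
                                GL_TEXTURE_COMPRESSED, 1, &compressed);
          printf("0x%04x: TEXTURE_COMPRESSED = %s\n",
                 formats[i], compressed ? "GL_TRUE" : "GL_FALSE");
       }
    }

With the patch on HSW the emulated ETC2/EAC entries come out as GL_FALSE
while the natively supported formats come out as GL_TRUE, matching what
[2] reports.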
>
> Thanks for checking these!
>
>
>     - Eero


