[Mesa-dev] Mesa (master): mesa: Validate assembly shaders when GLSL shaders are used
Ian Romanick
idr at freedesktop.org
Thu Oct 14 16:51:01 PDT 2010
Tom Fogal wrote:
> Ian Romanick <idr at freedesktop.org> writes:
>> Brian Paul wrote:
>>> On 10/12/2010 07:49 PM, Ian Romanick wrote:
>>>> Brian Paul wrote:
>>>>>> -
>>>>>> +#include<stdbool.h>
>>>>> Ian, could we stick with GLboolean/GL_TRUE/GL_FALSE in core Mesa to be
>>>>> consistent?
>>>> If possible, I'd prefer not. [. . .]
>>> My concern here is people will get lazy and just use int
>>> everywhere. Using GL types such as GLuint, GLint, GLboolean and
>>> GLbitfield convey useful information about the variables declared
>>> with those types.
>> There are a few GL types that actually have useful semantic
>> information. The others were just ways to request types of a
>> specific size (e.g., GLint must be a 32-bit integer). With
>> inttypes.h and stdbool.h those types no longer serve any purpose.
>> They just make code that uses OpenGL look different from code that
>> doesn't use OpenGL.
>>
>>> For example, if I see "int buffers_used" I have to wonder if
>>> it's really a boolean or a bitfield or a count (and what if it's
>>> negative?). I don't have that problem if the GL types are used.
>> GLint vs. GLuint doesn't provide any advantage over int vs. unsigned.
>> Looking at the number of 'comparison between signed and unsigned
>> integer expressions' warnings (over 600 in my build) in Mesa, it's
>> pretty clear that we're doing it very wrong anyway.
>
> One thing that bytes (ha!) us now and again is GL's broken "sizei"
> type, which is *signed*. Ugh.
Oh man... don't remind me. :(
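To make the two points above concrete, here is a minimal sketch; the
my_GL* typedefs and the fits_in_buffer() helper are hypothetical
stand-ins, since the real definitions live in <GL/gl.h> and vary by
platform:

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical typedefs showing the usual mapping of the sized GL
     * types onto C99 fixed-width types. */
    typedef int32_t  my_GLint;      /* GLint:     32-bit signed            */
    typedef uint32_t my_GLuint;     /* GLuint:    32-bit unsigned          */
    typedef uint8_t  my_GLboolean;  /* GLboolean: bool covers the same job */
    typedef int32_t  my_GLsizei;    /* GLsizei:   a *signed* count         */

    static bool fits_in_buffer(my_GLsizei count)
    {
       uint32_t limit = 600;

       /* -Wsign-compare ("comparison between signed and unsigned
        * integer expressions") fires here: count is converted to
        * unsigned, so a negative count compares as a huge value and
        * gives the wrong answer. */
       return count < limit;
    }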
> As an app developer, I find GL types dumb. They don't provide any
> advantages, and they add the additional step of casting your data
> before you send it to the GL. Nobody is going to standardize on
> 'GLfloat', for example, because your I/O code being cognizant of
> graphics is stupid (not to mention the desired abstraction for a DX
> backend).
It gets even worse if you use OpenGL with SDL (which defines its own
sized types) or any other Khronos API (each of which defines its own
sized types). They're all vain attempts at solving a solved problem. :(
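Sketching the application-side point quoted above (the draw_triangle()
helper and the positions array are made up for illustration; the GL
calls are plain GL 1.1): the data can stay in ordinary C types and only
meets the GL type names at the call site.

    #include <GL/gl.h>

    /* Vertex data kept in plain float; no GLfloat in the I/O path. */
    static const float positions[] = {
       -1.0f, -1.0f,
        1.0f, -1.0f,
        0.0f,  1.0f,
    };

    static void draw_triangle(void)
    {
       glEnableClientState(GL_VERTEX_ARRAY);
       /* GLfloat is float on the ABIs Mesa targets, so the plain-float
        * array goes straight through without a conversion pass. */
       glVertexPointer(2, GL_FLOAT, 0, positions);
       glDrawArrays(GL_TRIANGLES, 0, 3);
       glDisableClientState(GL_VERTEX_ARRAY);
    }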
> I guess this is technically broken -- "... for example, GL type int is
> referred to as GLint outside this document, *and is not necessarily
> equivalent to the C type int*." (emph. added, from gl3.1 spec, page
> 16). That said, if they didn't line up I bet so many apps would break
> that GL driver authors don't really have a choice, but that's another
> story...
This is because C89 specifies that int is "at least" 16 bits, and there
was no standard way in C89 to request exactly 32 bits. That's fixed
now, as you point out below.
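As a minimal sketch of that fix, <stdint.h> now provides the exact- and
least-width spellings that C89 lacked:

    #include <stdint.h>

    /* C89 only guarantees that int has at least 16 bits of range; there
     * was no portable spelling for "exactly 32 bits". */
    int legacy_value;

    /* C99 lets the code say exactly what it means. */
    int32_t       exact32;     /* exactly 32 bits, signed     */
    uint32_t      mask32;      /* exactly 32 bits, unsigned   */
    int_least16_t at_least16;  /* the old C89-style guarantee */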
> Back to my main thread: I wish these types would just die, and GL would
> switch to the semi-std int8_t, uint32_t, etc. types. I'd vote the GL
> types exist only at interfaces, for what my vote's worth.
>
> (That said, there are a few types: bitfield, half, clampf, and clampd,
> which convey unique information and might be nice to keep around.)
My sentiments exactly.
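For the record, a tiny sketch of what those semantic types buy the
reader (clear_frame() is a made-up helper; the calls are standard GL):

    #include <GL/gl.h>

    static void clear_frame(void)
    {
       /* GLclampf documents that these values are clamped to [0, 1]. */
       glClearColor(0.2f, 0.2f, 0.2f, 1.0f);

       /* GLbitfield documents that this argument is a mask of flags,
        * not a count or an enum. */
       glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    }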