[Mesa-dev] rationale for GLubyte pointers for strings?

tom fogal tfogal at sci.utah.edu
Tue Jul 19 12:39:23 PDT 2011


I think you have misinterpreted my question.

Why not just have glGetString's prototype be:

  const char* glGetString(GLenum);

? Then (sans the missing const :), your code below would work on *all*
platforms, MIPSpro or not, with or without a cast.
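
For concreteness (an illustrative sketch, not code from this thread): with the
const GLubyte* prototype that GL actually declares, strict compilers want a
cast before the result can be used as a C string; with the prototype above,
the cast could simply be dropped.

  #include <stdio.h>
  #include <GL/gl.h>

  /* Illustration only; assumes a current GL context. */
  static void print_vendor(void)
  {
      /* Today: a cast, because glGetString returns const GLubyte *. */
      const char *vendor = (const char *) glGetString(GL_VENDOR);
      printf("GL_VENDOR: %s\n", vendor);
      /* With const char *glGetString(GLenum), the cast would go away. */
  }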

-tom

Patrick Baggett <baggett.patrick at gmail.com> writes:
> SGI invented OpenGL and offered it first on their IRIX platform. SGI's
> MIPSpro compiler treats the "char" datatype as unsigned by default, so the
> compiler would likely complain about assigning a signed GLbyte pointer to an
> [unsigned] char pointer, whereas an unsigned GLubyte pointer matches. Thus,
> something like
> 
> char* ext = glGetString(GL_VENDOR);
> 
> doesn't require a cast on IRIX, while the same code would require a cast
> with other compilers due to the aforementioned signedness difference.
> 
> Patrick
> 
> 
> On Tue, Jul 19, 2011 at 1:44 PM, Allen Akin <akin at arden.org> wrote:
> 
> > On Tue, Jul 19, 2011 at 12:20:54PM -0600, tom fogal wrote:
> > | glGetString and gluErrorString, plus maybe some other functions, return
> > | GLubyte pointers instead of simply character pointers...
> > | What's the rationale here?
> >
> > I agree, it's odd.  I don't remember the rationale, but my best guess is
> > that it papered over some compatibility issue with another language
> > binding (probably Fortran).  I suppose there's a very slight possibility
> > that it sprang from a compatibility issue with Cray.
> >
> > Allen
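
To make Patrick's point concrete (a hypothetical helper, not something from
the thread or from GL/GLU): portable code usually hides the cast behind a
small wrapper, whereas on IRIX/MIPSpro the plain assignment reportedly
compiled because char was already unsigned there.

  #include <GL/gl.h>

  /* Hypothetical convenience wrapper: centralizes the cast that non-IRIX
     compilers require to treat the GLubyte* result as a C string. */
  static const char *gl_string(GLenum name)
  {
      return (const char *) glGetString(name);
  }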

