[Mesa-users] glGetColorTableParameterivEXT bottleneck in crosscompiled libgl-gdi on windows

Brian Paul brianp at vmware.com
Mon Mar 4 11:20:43 PST 2013


On 03/04/2013 12:59 AM, Christian Gerlach wrote:
> Hi,
>
> I have a 3D viewer on Windows which supports OpenGL software
> rendering as a fallback. As the Windows software renderer only
> supports OpenGL 1.1, I decided to give Mesa a try. Cross-compiling
> libgl-gdi to create the needed opengl32.dll was quite easy, and after
> copying the DLL to the application folder the viewer started without
> any problems and displayed my test scene.
> The only issue was that the performance of the viewer was very poor
> in comparison to the OpenGL software renderer of Windows. Just to give
> you some numbers:
>    Windows GDI: 40 fps
>    Mesa: 0.9 fps
> I used VTune to profile the viewer, which reports that nearly all the
> time is spent in glGetColorTableParameterivEXT.
>
> Do you have any clue whether I'm doing anything wrong? I'm puzzled by
> the apparent use of color tables, and I'm quite sure that I'm not
> calling this function from my source code.
>
> Any hint would be appreciated.

glGetColorTableParameterivEXT is a red herring.  I don't know the 
details, but when debug info is incomplete, debuggers and other 
profiling tools will attribute samples to glGetColorTableParameterivEXT 
instead of the correct function.  You may need to experiment with 
compiler flags to get the right symbol/debug info.  I guess you're 
using MinGW.  Are you compiling with -g?

-Brian
