[gst-devel] ATI + GLSL + setlocale + linux

Filippo Argiolas filippo.argiolas at gmail.com
Sat Nov 29 22:08:24 CET 2008


On Sat, Nov 29, 2008 at 9:30 PM, Julien Isorce <julien.isorce at gmail.com> wrote:
> Hi Filippo,
>
> Yup, and it seems it was present in NVIDIA drivers some years ago, but it is
> fixed now.

Yes, I found some MacSlow forum posts about that issue.

> Anyway the bug is strange, because even when I call setlocale(LC_NUMERIC,
> "C"); before compiling and linking the shader, and then restore it by calling
> setlocale(LC_NUMERIC, "");, it still does not work.
> I think the problem happens at runtime, not when the shader is parsed
> (because there is no error report in the ATI shader compiler log).
> Maybe the compiled shader is not understandable because it was parsed with
> locale "".

Note that locale "" should be your current locale (French).
Unfortunately it is not so easy to understand what's happening, because
the spec only describes *what* the API should do, not *how* to do it.
So it could be that the shader is merely validated when you call
glCompileShader and actually compiled when you use it for the first
time.
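For reference, this is roughly the save/restore pattern you describe
(an untested sketch, not the actual gst-plugins-gl code; the helper
name is made up):

#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <locale.h>
#include <stdlib.h>
#include <string.h>

static GLuint
build_program_locale_safe (const char *vert_src, const char *frag_src)
{
  /* setlocale returns a pointer to a static buffer, so copy it
   * before the next call overwrites it. */
  char *saved = strdup (setlocale (LC_NUMERIC, NULL));
  GLuint vs, fs, prog;

  setlocale (LC_NUMERIC, "C");

  vs = glCreateShader (GL_VERTEX_SHADER);
  glShaderSource (vs, 1, &vert_src, NULL);
  glCompileShader (vs);

  fs = glCreateShader (GL_FRAGMENT_SHADER);
  glShaderSource (fs, 1, &frag_src, NULL);
  glCompileShader (fs);

  prog = glCreateProgram ();
  glAttachShader (prog, vs);
  glAttachShader (prog, fs);
  glLinkProgram (prog);

  /* If the driver really compiles only at first use, restoring the
   * locale here is already too early: "C" would have to stay active
   * until after the first draw with this program. */
  setlocale (LC_NUMERIC, saved);
  free (saved);

  return prog;
}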
It would be interesting to see the output of the shader compilation.
With NVIDIA there are a couple of environment variables that let you
see the result (assembly) of the GLSL parsing. Is there anything
similar with ATI?
Anyway, if vec4 (1.0, 0.9, 0.1, 1.0) becomes vec4 (1,0,0,9,0,1,1,0) it
is still syntactically valid GLSL, which would explain why the info log
contains no error.
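You can see the mechanism from the C runtime side with a tiny
standalone demo (assuming the fr_FR.UTF-8 locale is installed):

#include <locale.h>
#include <stdio.h>
#include <stdlib.h>

int
main (void)
{
  setlocale (LC_NUMERIC, "fr_FR.UTF-8");
  printf ("%.1f\n", 1.0);                 /* prints "1,0" */
  printf ("%g\n", strtod ("1.0", NULL));  /* stops at the '.': prints "1" */

  setlocale (LC_NUMERIC, "C");
  printf ("%.1f\n", 1.0);                 /* prints "1.0" */
  return 0;
}

A driver that builds or reparses shader text through printf/strtod
would hit exactly this.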

> If I call setlocale(LC_NUMERIC, "C"); in gstopengl.c, then the shader works
> :P (Do you remember the problem with the "purple" result? Now I can see the
> right colors with this ATI card.)

I always suspected it was some kind of ATI bug :P, I'm glad it's fixed
now, it was driving me crazy (still, I believe there is something wrong
with the ActiveTexture calls too, because I had some issues even with my
NVIDIA card).
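For the archives, the workaround would look roughly like this at
plugin load (a hypothetical sketch, not the exact gstopengl.c change):

#include <locale.h>
#include <gst/gst.h>

static gboolean
plugin_init (GstPlugin * plugin)
{
  /* Hypothetical workaround: force the "C" numeric locale once, at
   * plugin load, before any shader is built.  Side effect: every
   * printf/strtod in the whole process now uses '.' as the decimal
   * separator, regardless of the user's locale. */
  setlocale (LC_NUMERIC, "C");

  /* ... register the GL elements as gstopengl.c normally does ... */
  return TRUE;
}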

Best Regards,

Filippo Argiolas



