libGL exported __glX* symbols (Was: glucose and xgl progress)
jrfonseca at tungstengraphics.com
Wed Sep 26 12:00:27 PDT 2007
On 9/26/07, Kristian Høgsberg <krh at bitplanet.net> wrote:
> On 9/20/07, José Fonseca <jrfonseca at tungstengraphics.com> wrote:
> > On 9/19/07, Gabor Gombas <gombasg at sztaki.hu> wrote:
> > > On Wed, Sep 19, 2007 at 10:20:49AM +0000, José Fonseca wrote:
> > > > However, when loaded, references to __glXFreeContext *inside*
> > > > libglxgext.so are linked to the external __glXFreeContext in libGL.so:
> > >
> > > If you have multiple definitions of a symbol, it is unspecified
> > > which one a given reference will resolve to; in practice the dynamic
> > > linker binds to the first definition it finds in its search order.
> > >
> > > Now, the two underscores are a good hint that these are internal symbols
> > > and they should not be exported at all or if they have to, one of them
> > > must be renamed.
> > >
> > > > libtool's -export-dynamic flag is not being used. Using libtool's -module
> > > > flag doesn't change anything.
> > >
> > > Does this symbol have to be exported? If not, you should use libtool's
> > > -export-symbols feature to declare explicitly which symbols should be
> > > visible and which should not. In fact, it is always wise to use
> > > -export-symbols when creating shared libraries, to prevent ABI breakage
> > > from accidentally exported private symbols.
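[As a concrete sketch of the mechanism Gabor describes: with GNU ld the same effect comes from a linker version script, which is essentially what libtool generates from an -export-symbols list. The file name and symbol patterns below are illustrative only, not the real libGL export list:]

```
/* libgl.ver -- hypothetical version script, passed to the linker
 * with -Wl,--version-script=libgl.ver.  Only gl* stays exported;
 * everything else, including __glX*, becomes local/hidden. */
{
  global:
    gl*;
  local:
    *;
};
```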
> > >
> > > If __glXFreeContext should be exported, then it should be decided which
> > > library owns this symbol, and the other must be modified not to export
> > > it.
> > >
> > > If it is the case that libGL.so exports __glXFreeContext but
> > > libglxgext.so wants to locally override it, and for some reason you
> > > absolutely cannot rename it, then you must declare __glXFreeContext
> > > in libglxgext.so with gcc's __attribute__((visibility("protected"))),
> > > but that is not portable to non-ELF platforms or other compilers,
> > > and IIRC it also has run-time performance costs.
> > I agree in principle with your suggestions. But I'm not entitled to
> > decide whether the __glX* symbols should be exported or not, nor to
> > say what's the best way to accomplish it.
> > I know that __glX* functions came originally from SGI code. From there
> > derived copies appear on mesa (for libGL), and more than once in
> > xserver code -- I suppose always for indirect rendering purposes (in
> > places such as AIGLX, DMX, and Xgl). It is likely that other vendors
> > also ship libGL exporting those symbols.
> > But it would definitely make things simpler and less likely to break
> > if the __glX* symbols were not exported...
> Ok, big disclaimer here: I haven't looked into glucose or even glitz
> much, so what I'm suggesting here may not apply. But my take is that
> we shouldn't be loading libGL in the X server to begin with. If we
> want to use OpenGL in the X server we should call into the dispatch
> table directly, as the __glXDRIbindTexImage implementation in
> GL/glx/glxdri.c does, for example:
>     CALL_GetIntegerv(GET_DISPATCH(), (glxPixmap->target == GL_TEXTURE_2D ?
>                                       GL_TEXTURE_BINDING_2D :
> and we may have to export some of the GLX code for use inside the X
> server (creating glx drawables and contexts etc) and separate out the
> protocol stuff. Basically if we have to hack around linking issues
> and fudge symbol resolution issues we're doing something wrong.
The thread's subject is a bit misleading: this problem actually only
occurs with Xgl (which is supposed to link against the system
/usr/lib/libGL, which may have nothing to do with the DRI, but which
gets duplicate symbols when it does).
glucose reuses some of the Xgl code, but it bypasses libGL and uses
AIGLX, as you are suggesting.