[Mesa-dev] [PATCH][RFC] dri3: Add support for the GLX_EXT_buffer_age extension
Adel Gadllah
adel.gadllah at gmail.com
Thu Feb 20 01:42:32 PST 2014
On 02/20/2014 02:59 AM, Ian Romanick wrote:
> On 02/19/2014 02:49 PM, Adel Gadllah wrote:
>> Hi,
>>
>> The attached patch adds support for the GLX_EXT_buffer_age extension,
>> which is mostly used by compositors for efficient sub screen updates.
>>
>> The extension should not be reported as supported when running DRI2 but
>> it seems to show up when I try to disable it with LIBGL_DRI3_DISABLE ...
>> not sure why suggestions welcome.
>>
>>
>> P.S: Please CC me when replying as I am not subscribed to the list.
> You'll need to fix that. :)
Done.
> You didn't send this patch with git-send-email. Whatever you used to
> send it also mangled it, so it won't apply.
>
Yeah, sorry about that; git send-email complained about the SSL cert, so I just
used Thunderbird, which didn't work well. I have fixed the SSL issue, so the
next version will be sent using git send-email.
>> + * GLX_EXT_buffer_age
>> + */
>> +#define GLX_BACK_BUFFER_AGE_EXT 0x20F4
>> +
>>
>> typedef struct __GLXcontextRec *GLXContext;
>> typedef XID GLXPixmap;
>> diff --git a/include/GL/glxext.h b/include/GL/glxext.h
>> index 8c642f3..36e92dc 100644
>> --- a/include/GL/glxext.h
>> +++ b/include/GL/glxext.h
>> @@ -383,6 +383,11 @@ void glXReleaseTexImageEXT (Display *dpy,
>> GLXDrawable drawable, int buffer);
>> #define GLX_FLIP_COMPLETE_INTEL 0x8182
>> #endif /* GLX_INTEL_swap_event */
>>
>> +#ifndef GLX_EXT_buffer_age
>> +#define GLX_EXT_buffer_age 1
>> +#define GLX_BACK_BUFFER_AGE_EXT 0x20F4
>> +#endif /* GLX_EXT_buffer_age */
>> +
> We get glxext.h directly from Khronos, so it should not be modified...
Oh OK.
> except to import new versions from upstream. It looks like the upstream
> glxext.h has this, so the first patch in the series should be "glx:
> Update glxext.h to revision 25407." And drop the change to glx.h.
Actually, the version we ship already seems to have it, but it's probably worth updating anyway.
>> #ifndef GLX_MESA_agp_offset
>> #define GLX_MESA_agp_offset 1
>> typedef unsigned int ( *PFNGLXGETAGPOFFSETMESAPROC) (const void *pointer);
>> diff --git a/src/glx/dri2_glx.c b/src/glx/dri2_glx.c
>> index 67fe9c1..007f449 100644
>> --- a/src/glx/dri2_glx.c
>> +++ b/src/glx/dri2_glx.c
>> @@ -1288,6 +1288,7 @@ dri2CreateScreen(int screen, struct glx_display *
>> priv)
>> psp->waitForSBC = NULL;
>> psp->setSwapInterval = NULL;
>> psp->getSwapInterval = NULL;
>> + psp->queryBufferAge = NULL;
>>
>> if (pdp->driMinor >= 2) {
>> psp->getDrawableMSC = dri2DrawableGetMSC;
>> diff --git a/src/glx/dri3_glx.c b/src/glx/dri3_glx.c
>> index 70ec057..07120e1 100644
>> --- a/src/glx/dri3_glx.c
>> +++ b/src/glx/dri3_glx.c
>> @@ -1345,6 +1345,8 @@ dri3_swap_buffers(__GLXDRIdrawable *pdraw, int64_t
>> target_msc, int64_t divisor,
>> target_msc = priv->msc + priv->swap_interval * (priv->send_sbc
>> - priv->recv_sbc);
>>
>> priv->buffers[buf_id]->busy = 1;
>> + priv->buffers[buf_id]->last_swap = priv->swap_count;
>> +
>> xcb_present_pixmap(c,
>> priv->base.xDrawable,
>> priv->buffers[buf_id]->pixmap,
>> @@ -1379,11 +1381,23 @@ dri3_swap_buffers(__GLXDRIdrawable *pdraw,
>> int64_t target_msc, int64_t divisor,
>> xcb_flush(c);
>> if (priv->stamp)
>> ++(*priv->stamp);
>> +
>> + priv->swap_count++;
>> }
>>
>> return ret;
>> }
>>
>> +static int
>> +dri3_query_buffer_age(__GLXDRIdrawable *pdraw)
>> +{
>> + struct dri3_drawable *priv = (struct dri3_drawable *) pdraw;
>> + int buf_id = DRI3_BACK_ID(priv->cur_back);
> Blank line here.
>
> Also maybe use dri3_back_buffer instead?
>
> const struct dri3_buffer *const back = dri3_back_buffer(priv);
>
> if (back->last_swap != 0)
> return 0;
> else
> return priv->swap_count - back->last_swap;
>
OK, you mean "== 0" right? Otherwise this does not make much sense ;)
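For what it's worth, here is a tiny standalone sketch of the semantics as I
understand them (the fake_* names are mine, not the real dri3_glx.c structs): a
buffer that has never been presented reports age 0 (undefined contents),
otherwise the age is the number of swaps since it was last presented.

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical, simplified model of the bookkeeping being discussed:
 * swap_count counts swaps issued on the drawable, and each back buffer
 * records the swap_count value at which it was last presented. */
struct fake_buffer {
   uint64_t last_swap;          /* 0 means "never presented" */
};

struct fake_drawable {
   uint64_t swap_count;         /* incremented once per swap */
   struct fake_buffer back;     /* current back buffer */
};

/* Per EXT_buffer_age, age 0 means the contents are undefined; a buffer
 * that has been presented has age swap_count - last_swap.  Note the
 * "== 0" test (the "!= 0" in the quoted suggestion is inverted). */
static int
fake_query_buffer_age(const struct fake_drawable *priv)
{
   if (priv->back.last_swap == 0)
      return 0;

   return (int) (priv->swap_count - priv->back.last_swap);
}
```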
>
> ...then check whether it's NULL. tsk tsk. :)
>
> I think the declaration of pdraw (with the #ifdef stuff) should get
> moved up to the previous else (just before calling _XRead). Then this
> block could be 'if (pdraw != NULL) {" instead of just "{". Then you can
> delete the other 'pdraw != NULL' checks (below). That refactor should
> go in a separate patch.
I am not sure I understand this. The ifdefs will be needed anyway because
otherwise __GLXDRIdrawable is not defined. Or should I simply do something like:
#ifdef ...
__GLXDRIdrawable *pdraw = ...;
#else
void *pdraw = NULL;
#endif
...
if (pdraw != NULL) ....
?
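i.e. something along these lines (a standalone sketch with made-up fake_*
names, just to illustrate the shape of the refactor, not the actual glxcmds.c
code):

```c
#include <stddef.h>

#define GLX_DIRECT_RENDERING 1   /* toggle to exercise the other branch */

struct fake_glx_drawable {
   int textureTarget;
};

/* The #ifdef stays around the declaration only; the non-direct build
 * gets a void pointer that is always NULL, so the single
 * "pdraw != NULL" block below replaces the scattered NULL checks. */
static int
fake_query_drawable(struct fake_glx_drawable *direct_pdraw,
                    unsigned int *value)
{
#if defined(GLX_DIRECT_RENDERING)
   struct fake_glx_drawable *pdraw = direct_pdraw;
#else
   void *pdraw = NULL;
   (void) direct_pdraw;
#endif

   if (pdraw != NULL) {
      /* All the direct-rendering-only queries live in this one block. */
      *value = ((struct fake_glx_drawable *) pdraw)->textureTarget;
      return 1;
   }

   return 0;
}
```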
>> + psc->driScreen->queryBufferAge != NULL) {
>> +
>> + *value = psc->driScreen->queryBufferAge (pdraw);
> ^
> No space here.
>
OK.
>> + }
>>
>> if (pdraw != NULL && !pdraw->textureTarget)
>> pdraw->textureTarget =
>> diff --git a/src/glx/glxclient.h b/src/glx/glxclient.h
>> index a7118af..9333bdf 100644
>> --- a/src/glx/glxclient.h
>> +++ b/src/glx/glxclient.h
>> @@ -125,6 +125,7 @@ struct __GLXDRIscreenRec {
>> int64_t *msc, int64_t *sbc);
>> int (*setSwapInterval)(__GLXDRIdrawable *pdraw, int interval);
>> int (*getSwapInterval)(__GLXDRIdrawable *pdraw);
>> + int (*queryBufferAge)(__GLXDRIdrawable *pdraw);
> All of the other query functions here are getSomething. This should be
> getBufferAge for consistency.
>
OK.
>> };
>>
>> struct __GLXDRIdrawableRec
>> diff --git a/src/glx/glxextensions.c b/src/glx/glxextensions.c
>> index f186c13..78c4a8d 100644
>> --- a/src/glx/glxextensions.c
>> +++ b/src/glx/glxextensions.c
>> @@ -103,6 +103,7 @@ static const struct extension_info
>> known_glx_extensions[] = {
>> { GLX(SGIX_visual_select_group), VER(0,0), Y, Y, N, N },
>> { GLX(EXT_texture_from_pixmap), VER(0,0), Y, N, N, N },
>> { GLX(INTEL_swap_event), VER(0,0), Y, N, N, N },
>> + { GLX(EXT_buffer_age), VER(1,4), Y, N, Y, Y },
> ^^^
> 0,0
>
> Or GLX 1.4 won't be advertised on drivers that don't support this extension.
OK.
>> { NULL }
>> };
>>
>> diff --git a/src/glx/glxextensions.h b/src/glx/glxextensions.h
>> index 8436a94..37e4ccc 100644
>> --- a/src/glx/glxextensions.h
>> +++ b/src/glx/glxextensions.h
>> @@ -66,6 +66,7 @@ enum
>> SGIX_visual_select_group_bit,
>> EXT_texture_from_pixmap_bit,
>> INTEL_swap_event_bit,
>> + EXT_buffer_age_bit,
>> };
>>
>> /* From the GLX perspective, the ARB and EXT extensions are identical.
>> Use a
>> diff --git a/src/mesa/drivers/x11/glxapi.c b/src/mesa/drivers/x11/glxapi.c
>> index dfa8946..af3d208 100644
>> --- a/src/mesa/drivers/x11/glxapi.c
>> +++ b/src/mesa/drivers/x11/glxapi.c
>> @@ -1181,6 +1181,9 @@ _glxapi_get_extensions(void)
>> #ifdef GLX_INTEL_swap_event
>> "GLX_INTEL_swap_event",
>> #endif
>> +#ifdef GLX_INTEL_buffer_age
>> + "GLX_INTEL_buffer_age",
>> +#endif
> This hunk shouldn't exist. GLX_INTEL_swap_event shouldn't be listed here
> either. This is a fake GLX layer for a software rasterizer build of
> Mesa. These aren't the droids you're looking for...
OK will remove.
>> NULL
>> };
>> return extensions;
> I think you're going to end up with 3 patches:
>
> 1. Import new glxext.h.
> 2. Refactor the pdraw != NULL stuff, delete the stray NULL checks, etc.
> 3. Implement GLX_EXT_buffer_age
>
Will do, thanks for the review.
Adel