[Piglit] [PATCH 08/14] util: Disable piglit_glx_window_set_no_input when using Waffle

Chad Versace chad.versace at linux.intel.com
Tue May 22 17:04:03 PDT 2012



On 05/22/2012 08:06 AM, Pauli Nieminen wrote:
> On Mon, May 21, 2012 at 11:15:16PM -0700, Chad Versace wrote:
>> From: Chad Versace <chad at chad.versace.us>
>>
>> Use #ifdef to disable the function because, when using Waffle and Mesa,
>> XSetWMHints() causes the next emission of DRI2SwapBuffers to fail with
>> BadWindow(X_ChangeProperty). Comment this with FIXME.
>>
>> Signed-off-by: Chad Versace <chad at chad.versace.us>
>> ---
>>  tests/util/piglit-glx-util.c |    6 ++++++
>>  1 file changed, 6 insertions(+)
>>
>> diff --git a/tests/util/piglit-glx-util.c b/tests/util/piglit-glx-util.c
>> index df54c50..ad71790 100644
>> --- a/tests/util/piglit-glx-util.c
>> +++ b/tests/util/piglit-glx-util.c
>> @@ -224,6 +224,11 @@ piglit_iterate_visuals_event_loop(Display *dpy,
>>  void
>>  piglit_glx_window_set_no_input(Display *dpy, GLXDrawable win)
>>  {
>> +	/* FIXME: When using Waffle and Mesa, this function causes the next
>> +	 * FIXME: emission DRI2SwapBuffers to fail with
>> +	 * FIXME: BadWindow(X_ChangeProperty).
>> +	 */
> 
> I checked the protocol error using xtrace.
> 
> Problem is that XChangeProperty gets resource 2 as the window parameter
> when the window is resource 4. The resource is returned from
> glXGetCurrentDrawable, but I didn't try to figure out why it returns the
> wrong id with waffle.
> 
> The protocol log doesn't contain anything that names resource 2, but
> xtrace doesn't know how to decode glx calls. However, DRI2GetBuffers has
> resource 4 as a parameter, which means that the client side at least
> knows about the correct resource id.
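
(As an aside, for anyone reproducing this without xtrace: installing a plain
Xlib error handler also shows which resource id and request the BadWindow is
reported against. This is just generic debugging code, not part of the patch.)

#include <stdio.h>
#include <X11/Xlib.h>

static int
print_x_error(Display *dpy, XErrorEvent *e)
{
    char text[128];
    XGetErrorText(dpy, e->error_code, text, sizeof(text));
    fprintf(stderr, "X error: %s, request %d.%d, resource 0x%lx\n",
            text, e->request_code, e->minor_code, e->resourceid);
    return 0;
}

/* Install it early in the test, e.g. right after XOpenDisplay():
 *     XSetErrorHandler(print_x_error);
 * Since X errors are reported asynchronously, XSynchronize(dpy, True)
 * makes the handler fire at the offending request rather than later.
 */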

When I investigated this, I discovered that the following happens
regardless of whether Waffle is in use.

piglit_glx_window_set_no_input()
{
    GLXDrawable win;
    ...
    win = glXGetCurrentDrawable();
 
    // win == 0x1a00002
    //
    // Digging into Waffle reveals that this is the XID
    // of the X window created with xcb_create_window().
    // However, when Waffle is not used, win still has
    // the same value.
}

// Later...

DRI2SwapBuffers(XID drawable)
{
    // drawable == 0x1a00004
    //
    // Digging into Waffle reveals that this is the XID
    // returned by glXCreateWindow(win=0x1a00002). However,
    // when Waffle is not used, drawable still has the same value.

    ...
    LockDisplay(dpy);
    GetReq(DRI2SwapBuffers, req);
    // BOOM. BadWindow is generated here if Waffle is used.
}
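
For reference, roughly what piglit_glx_window_set_no_input() does with that
win is the following (paraphrased; the exact code in piglit-glx-util.c may
differ slightly). The XSetWMHints() call is what ends up on the wire as the
X_ChangeProperty request named in the error:

void
piglit_glx_window_set_no_input(Display *dpy, GLXDrawable win)
{
    XWMHints *hints = XGetWMHints(dpy, win);
    if (hints == NULL)
        hints = XAllocWMHints();

    hints->flags |= InputHint;
    hints->input = False;

    /* Stores WM_HINTS on `win`; on the wire this is an X_ChangeProperty
     * request carrying `win` (== 0x1a00002 above) as the window. */
    XSetWMHints(dpy, win, hints);
    XFree(hints);
}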

My suspicion is that the problem is caused by a disagreement between
xcb and Xlib. GLUT uses pure Xlib, and Waffle uses xcb whenever possible.
In Waffle, I followed the guidelines at http://xcb.freedesktop.org/opengl/
on how to intermix xcb with glX, but maybe I still did something wrong.
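
For reference, the pattern described there is roughly the sketch below (my
paraphrase, not the actual Waffle code; error handling, fbconfig selection,
and cleanup are all glossed over). It also shows where the two XIDs in the
trace above come from:

#include <X11/Xlib-xcb.h>
#include <xcb/xcb.h>
#include <GL/glx.h>

int
main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    xcb_connection_t *conn = XGetXCBConnection(dpy);
    XSetEventQueueOwner(dpy, XCBOwnsEventQueue);

    int n;
    GLXFBConfig *configs = glXChooseFBConfig(dpy, DefaultScreen(dpy), NULL, &n);
    GLXFBConfig fbconfig = configs[0];

    int visual_id;
    glXGetFBConfigAttrib(dpy, fbconfig, GLX_VISUAL_ID, &visual_id);

    /* Assume a single screen for this sketch. */
    xcb_screen_t *screen = xcb_setup_roots_iterator(xcb_get_setup(conn)).data;

    /* X window created through xcb; this XID corresponds to the 0x1a00002
     * above, i.e. what glXGetCurrentDrawable() ends up returning. */
    xcb_window_t xid = xcb_generate_id(conn);
    xcb_create_window(conn, XCB_COPY_FROM_PARENT, xid, screen->root,
                      0, 0, 256, 256, 0,
                      XCB_WINDOW_CLASS_INPUT_OUTPUT, visual_id, 0, NULL);
    xcb_map_window(conn, xid);

    /* GLX drawable wrapping the X window; this XID corresponds to the
     * 0x1a00004 above, i.e. the drawable DRI2SwapBuffers is issued against. */
    GLXWindow glxwin = glXCreateWindow(dpy, fbconfig, xid, NULL);

    GLXContext ctx = glXCreateNewContext(dpy, fbconfig, GLX_RGBA_TYPE, NULL, True);
    glXMakeContextCurrent(dpy, glxwin, glxwin, ctx);

    return 0;
}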

----
Chad Versace
chad.versace at linux.intel.com