30-bpp mode for dummy - exposes a bug somewhere else?
Antoine Martin
antoine at nagafix.co.uk
Fri Sep 16 09:58:27 UTC 2016
Hi,
Adding support for the 30 bpp mode (10 bits per colour component) to the
dummy driver:
--- a/src/dummy_driver.c
+++ b/src/dummy_driver.c
@@ -313,6 +313,7 @@ DUMMYPreInit(ScrnInfoPtr pScrn, int flags)
     case 15:
     case 16:
     case 24:
+    case 30:
         break;
     default:
         xf86DrvMsg(pScrn->scrnIndex, X_ERROR,
@@ -327,8 +328,8 @@ DUMMYPreInit(ScrnInfoPtr pScrn, int flags)
         pScrn->rgbBits = 8;
 
     /* Get the depth24 pixmap format */
-    if (pScrn->depth == 24 && pix24bpp == 0)
-        pix24bpp = xf86GetBppFromDepth(pScrn, 24);
+    if (pScrn->depth >= 24 && pix24bpp == 0)
+        pix24bpp = xf86GetBppFromDepth(pScrn, pScrn->depth);
 
     /*
      * This must happen after pScrn->display has been set because
This looks simple enough and works very well and is stable. Unfortunately,
it also causes any RandR call to crash the server:
Program received signal SIGSEGV, Segmentation fault.
0x00007fddc2a78b98 in DUMMYLoadPalette (pScrn=<optimized out>,
numColors=<optimized out>, indices=<optimized out>, colors=0xd01a80,
pVisual=<optimized out>)
at dummy_driver.c:513
513 dPtr->colors[index].red = colors[index].red << shift;
(gdb) bt
#0 0x00007fddc2a78b98 in DUMMYLoadPalette (pScrn=<optimized out>,
numColors=<optimized out>, indices=<optimized out>, colors=0xd01a80,
pVisual=<optimized out>)
at dummy_driver.c:513
#1 0x0000000000480fb2 in CMapRefreshColors ()
#2 0x00000000004816e8 in CMapReinstallMap ()
#3 0x00000000004817ca in CMapSwitchMode ()
#4 0x000000000047407a in xf86SwitchMode ()
#5 0x0000000000497a53 in xf86RandRSetMode ()
#6 0x000000000049808a in xf86RandRSetConfig ()
#7 0x00000000004fa398 in RRCrtcSet ()
#8 0x0000000000507a46 in ProcRRSetScreenConfig ()
#9 0x0000000000436daf in Dispatch ()
#10 0x000000000043add3 in dix_main ()
#11 0x00007fddc7fb8731 in __libc_start_main (main=0x424d20 <main>, argc=19,
    argv=0x7ffc98ec5e18, init=<optimized out>, fini=<optimized out>,
    rtld_fini=<optimized out>, stack_end=0x7ffc98ec5e08)
    at ../csu/libc-start.c:289
#12 0x0000000000424d59 in _start ()
(gdb)
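For context, the crashing statement at dummy_driver.c:513 sits in the
per-entry palette copy loop of DUMMYLoadPalette, which does roughly the
following (a reconstruction, not verbatim) - note that the array index
comes from indices[i], not from the loop counter:

    /* Rough sketch of the loop around dummy_driver.c:513 (not verbatim). */
    for (i = 0; i < numColors; i++) {
        index = indices[i];                 /* index supplied by the colormap layer */
        dPtr->colors[index].red   = colors[index].red   << shift;   /* <-- line 513 */
        dPtr->colors[index].green = colors[index].green << Gshift;
        dPtr->colors[index].blue  = colors[index].blue  << shift;
    }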
It is always crashing in palette or colormap functions, i.e.
CMapDestroyColormap, DUMMYLoadPalette, etc.
Could it be that there's a bug somewhere else that is only being
triggered with 30-bpp modes? Trying to use more colours / space than is
available in the current colourmap perhaps?
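If that guess is right, the arithmetic would be roughly this (just a
sketch; the 256-entry size of dPtr->colors[] is an assumption based on
dummy.h, I have not verified it):

    #include <stdio.h>

    /* Sketch of the suspected mismatch: at depth 30 the server hands the
     * driver 10-bit-per-channel colormaps, so palette indices can run up
     * to 1023, while the driver's private palette array is assumed to
     * hold only 256 entries. */
    int main(void)
    {
        int rgb_bits_24 = 8, rgb_bits_30 = 10;  /* bits per colour component      */
        int cmap_24 = 1 << rgb_bits_24;         /* 256 colormap cells at depth 24 */
        int cmap_30 = 1 << rgb_bits_30;         /* 1024 colormap cells at depth 30 */
        int priv_entries = 256;                 /* assumed size of dPtr->colors[] */

        printf("depth 24: %d cells, depth 30: %d cells, private array: %d\n",
               cmap_24, cmap_30, priv_entries);
        /* Any index in [256, 1023] would write past the private array. */
        return 0;
    }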
I had temporarily prevented the crashes for only some (!) RandR resizings
by clamping the "numColors" value to 256 in DUMMYLoadPalette.
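The clamp looks roughly like this (just a band-aid at the top of the copy
loop shown above; it cannot help when an individual indices[i] value is
itself out of range, which may be why it only works for some resizes):

    /* Workaround sketch: stop the loop from walking past an assumed
     * 256-entry dPtr->colors[] array.  Not a real fix. */
    if (numColors > 256)
        numColors = 256;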
If you want to try it for yourself, just apply the 30-bpp patch at the top
of this message and run:
Xorg +extension RANDR -config xorg.conf :100
With this config:
http://xpra.org/trac/raw-attachment/ticket/909/xorg.conf
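(The gist of that config - sketched from memory rather than copied from the
attachment, so identifiers, VideoRam and the mode list here are only
illustrative - is a dummy device with DefaultDepth 30:)

    Section "Device"
        Identifier "dummy_videocard"
        Driver     "dummy"
        VideoRam   192000
    EndSection

    Section "Screen"
        Identifier   "dummy_screen"
        Device       "dummy_videocard"
        DefaultDepth 30
        SubSection "Display"
            Depth 30
            Modes "1024x768" "640x480"
        EndSubSection
    EndSection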
Then you can crash the server reliably with:
DISPLAY=:100 xrandr -s 640x480
I think I am out of my depth (pun intended); how do I debug from here?
Cheers
Antoine