UDL device cannot get its own screen

Böszörményi Zoltán zboszor at pr.hu
Tue Nov 12 14:23:13 UTC 2019


On 2019-11-05 15:22, Böszörményi Zoltán wrote:
> Hi,
> 
> On 2019-10-23 15:32, Ilia Mirkin wrote:
>> On Wed, Oct 23, 2019 at 2:41 AM Böszörményi Zoltán <zboszor at pr.hu> wrote:
>>>
>>> On 2019-10-22 22:57, Ilia Mirkin wrote:
>>>> On Tue, Oct 22, 2019 at 11:50 AM Böszörményi Zoltán <zboszor at pr.hu> wrote:
>>>>> Section "Device"
>>>>>           Identifier      "UDL"
>>>>>           Driver          "modesetting"
>>>>>           Option          "kmsdev" "/dev/dri/card0"
>>>>>           Screen          2
>>>>>           Option          "Monitor-DVI-I-1-1" "DVI-I-1-1"
>>>>
>>>> I think you have an extra -1 in here (and the monitor name doesn't
>>>> exist as per above). And I think the "Screen" index is wrong -- it's
>>>> not what one tends to think it is, as I recall. I think you can just
>>>> drop these lines though.
>>>
>>> Without "Screen N" lines, all the outputs are assigned to :0
>>> so the screen layout setup in the ServerLayout section is not
>>> applied properly.
>>>
>>
>> As I remember it, the Screen here is for ZaphodHeads-type
>> configurations, and it indicates which head you're supposed to use of
>> the underlying device. My suggestion was to only remove it here, not
>> everywhere.
> 
> Okay, but it still doesn't create a working setup.

So, I finally got back to experimenting with this.

I have read "man 5 xorg.conf" more closely and found the GPUDevice
option in Section "Screen". Here's the configuration I came up
with, but it still doesn't work:

==============================================
Section "ServerFlags"
	Option		"AutoBindGPU" "false"
EndSection

Section "Monitor"
	Identifier	"Monitor-DP-1"
	Option		"AutoServerLayout" "on"
	Option		"Rotate" "normal"
EndSection

Section "Monitor"
	Identifier	"Monitor-VGA-1"
	Option		"AutoServerLayout" "on"
	Option		"Rotate" "normal"
EndSection

Section "Monitor"
	Identifier	"Monitor-HDMI-1"
	Option		"AutoServerLayout" "on"
	Option		"Rotate" "normal"
EndSection

Section "Monitor"
	Identifier	"Monitor-DVI-I-1"
	Option		"AutoServerLayout" "on"
	Option		"Rotate" "normal"
EndSection

Section "Device"
	Identifier	"Intel0"
	Driver		"modesetting"
	BusID		"PCI:0:2:0"
	Screen		0
	Option		"Monitor-DP-1" "DP-1"
	Option		"ZaphodHeads" "DP-1"
EndSection

Section "Device"
	Identifier	"Intel1"
	Driver		"modesetting"
	BusID		"PCI:0:2:0"
	Screen		1
	Option		"Monitor-VGA-1" "VGA-1"
	Option		"ZaphodHeads" "VGA-1"
EndSection

Section "Device"
	Identifier	"Intel2"
	Driver		"modesetting"
	BusID		"PCI:0:2:0"
	Screen		2
	Option		"Monitor-HDMI-1" "HDMI-1"
	Option		"ZaphodHeads" "HDMI-1"
EndSection

Section "Device"
	Identifier	"UDL"
	Driver		"modesetting"
	Option		"kmsdev" "/dev/dri/card0"
	# Suggestion of Ilia Mirkin: Don't set Screen here
	#Screen		2
	Option		"Monitor-DVI-I-1" "DVI-I-1"
	Option		"ZaphodHeads" "DVI-I-1"
EndSection

Section "Screen"
	Identifier	"SCREEN"
	Option		"AutoServerLayout" "on"
	Device		"Intel0"
	Monitor		"Monitor-DP-1"
	SubSection	"Display"
		Modes	"1024x768"
		Depth	24
	EndSubSection
EndSection

Section "Screen"
	Identifier	"SCREEN1"
	Option		"AutoServerLayout" "on"
	Device		"Intel1"
	Monitor		"Monitor-VGA-1"
	SubSection	"Display"
		Modes	"1024x768"
		Depth	24
	EndSubSection
EndSection

Section "Screen"
	Identifier	"SCREEN2"
	Option		"AutoServerLayout" "on"
	Device		"UDL"
	GPUDevice	"Intel2"
	Monitor		"Monitor-DVI-I-1"
	SubSection	"Display"
		Modes	"1024x768"
		Depth	24
	EndSubSection
EndSection

Section "ServerLayout"
	Identifier	"LAYOUT"
	Option		"AutoServerLayout" "on"
	Screen		0 "SCREEN"
	Screen		1 "SCREEN1" RightOf "SCREEN"
	Screen		2 "SCREEN2" RightOf "SCREEN1"
EndSection
==============================================

Obviously, I want *some* GPU acceleration that does its work
over the UDL framebuffer.

With the above setup, I get these:

# DISPLAY=:0 xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x40 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 1 outputs: 1 associated providers: 0 name:modesetting
Provider 1: id: 0xac cap: 0x2, Sink Output crtcs: 1 outputs: 1 associated providers: 0 name:modesetting

# DISPLAY=:0.1 xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x72 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 1 outputs: 1 associated providers: 0 name:modesetting

# DISPLAY=:0.2 xrandr --listproviders
Can't open display :0.2

According to /var/log/Xorg.0.log, I have:

[  1917.884] (II) modeset(0): using drv /dev/dri/card1
[  1917.884] (II) modeset(1): using drv /dev/dri/card1
[  1917.884] (II) modeset(G0): using drv /dev/dri/card1
[  1917.884] (II) modeset(G1): using drv /dev/dri/card0

modeset(0) is the Intel DP-1 output, monitor attached, EDID detected
modeset(1) is the Intel VGA-1 output, monitor attached, EDID detected
modeset(G0) is the Intel HDMI-1 output, no monitor, no EDID
modeset(G1) is the UDL device, monitor attached, EDID detected

However:

[  1918.521] (II) modeset(G0): Damage tracking initialized
[  1918.525] (II) modeset(0): Damage tracking initialized
[  1918.525] (II) modeset(0): Setting screen physical size to 270 x 203
[  1918.528] (II) modeset(1): Damage tracking initialized
[  1918.528] (II) modeset(1): Setting screen physical size to 270 x 203

Most notably, there is no "modeset(G1): Setting screen physical size" message,
even though EDID was detected, and apparently correctly, since the monitor
name is right.

FYI, inverting the roles of UDL and Intel2 does not work either, i.e. using:

	Device		"Intel2"
	GPUDevice	"UDL"

This way, at least there is a DISPLAY=:0.2 screen (Intel2), but still
no working UDL. Also, xrandr --listproviders insists on showing the UDL
provider line for DISPLAY=:0 instead of DISPLAY=:0.2.

> In the meantime I switched to the GIT version of Xorg, but
> it didn't make a difference (for now).

Current GIT commit 562c7888be538c4d043ec1f374a9d9afa0b305a4, plus applied MRs:
* 155 (USB device prefix handling),
* 325 (reorder ScreenInit) and
* 326 (more robust hotplug GPU handling).

I also applied this patch to see whether the auto-bound GPUs actually have
their screen numbers set correctly according to the ServerLayout section:

diff --git a/hw/xfree86/common/xf86Init.c b/hw/xfree86/common/xf86Init.c
index 6cc2f0b01..3e21644fe 100644
--- a/hw/xfree86/common/xf86Init.c
+++ b/hw/xfree86/common/xf86Init.c
@@ -210,9 +210,29 @@ xf86AutoConfigOutputDevices(void)
      if (!xf86Info.autoBindGPU)
          return;

-    for (i = 0; i < xf86NumGPUScreens; i++)
+    xf86ErrorFVerb(0, "xf86AutoConfigOutputDevices: xf86NumScreens %d xf86NumGPUScreens %d\n", xf86NumScreens, xf86NumGPUScreens);
+    for (i = 0; i < xf86NumGPUScreens; i++) {
+        xf86ErrorFVerb(0,
+        	"xf86AutoConfigOutputDevices: GPU #%d driver '%s' '%s' "
+        	"scrnIndex %d origIndex %d pScreen->myNum %d confScreen->screennum %d "
+        	"confScreen->device->identifier '%s' "
+        	"confScreen->device->screen %d confScreen->device->myScreenSection->screennum %d "
+        	"confScreen->device->myScreenSection->device->screen %d\n",
+        				i,
+        				xf86GPUScreens[i]->driverName,
+        				xf86GPUScreens[i]->name,
+        				xf86GPUScreens[i]->scrnIndex,
+        				xf86GPUScreens[i]->origIndex,
+        				xf86GPUScreens[i]->pScreen->myNum,
+        				xf86GPUScreens[i]->confScreen->screennum,
+        				xf86GPUScreens[i]->confScreen->device->identifier,
+        				xf86GPUScreens[i]->confScreen->device->screen,
+        				xf86GPUScreens[i]->confScreen->device->myScreenSection->screennum,
+        				xf86GPUScreens[i]->confScreen->device->myScreenSection->device->screen
+        				);
          RRProviderAutoConfigGpuScreen(xf86ScrnToScreen(xf86GPUScreens[i]),
                                        xf86ScrnToScreen(xf86Screens[0]));
+    }
  }

  static void

But no, all GPU devices (now only one, the UDL device) have screen 0
(a.k.a. DISPLAY=:0.0) set when AutoBindGPU is true:

[  2444.576] xf86AutoConfigOutputDevices: xf86NumScreens 2 xf86NumGPUScreens 1
[  2444.576] xf86AutoConfigOutputDevices: GPU #0 driver 'modesetting' 'modeset' scrnIndex 256 origIndex 257 pScreen->myNum 256 confScreen->screennum 0 confScreen->device->identifier 'Intel0' confScreen->device->screen 0 confScreen->device->myScreenSection->screennum 0 confScreen->device->myScreenSection->device->screen 0

Somehow, the Device entry in Section "Screen" should ensure that the UDL
device is actually treated as a framebuffer that can be rendered into
(i.e. become modeset(2) instead of modeset(Gn)), and it should be woken
up automatically.

This is what AutoBindGPU is supposed to do, isn't it?

But instead of being bound to screen 0, the GPU device should be assigned
to whichever screen number it is configured for.

I know it's not a common use case nowadays, but I really want separate
fullscreen apps on their independent screens, including a standalone UDL
device, instead of having the latter as a Xinerama extension of some
other device.

Best regards,
Zoltán Böszörményi
