Two Radeon Cards, Dell Dimension, Linux and x.org
Thorsten Becker
thorsten.becker at gmx.de
Tue Mar 21 12:04:32 PST 2006
On Thursday 16 March 2006 at 16:00, Thorsten Becker wrote:
> Xorg 7.0.0, kernel 2.6.15-gentoo-r1
>
> Hello,
>
> To create a dual-seat system I bought a Radeon 7000 PCI card for my Dell
> Dimension 4550, which already has a Radeon 9700 installed. The problem is:
> I can only get one card to work at a time. Trying to start an X server on
> the "wrong" card locks up the whole system (it doesn't even write an
> Xorg.0.log logfile).
>
> What I tried so far:
>
> BIOS settings: The somewhat crippled BIOS in the Dell system has only a few
> settings that could be relevant; "Primary Video Controller" is one of them,
> and it can be set to AGP or Auto.
> If it is set to AGP, the AGP card works: the VGA console appears on the
> monitor connected to that card, and I can start a working X server for it.
> If it is set to Auto, the PCI card works, and I can likewise start an X
> server for that card.
>
> But if it is set to AGP and I try to start an X server for the PCI card,
> the result is a complete system lockup, and the same happens the other way
> round.
>
> I tried to get some useful debugging output by starting X for the "wrong"
> card via ssh and capturing information with the -verbose option. Such a log
> can be found here:
> http://www.tuxdesk.de/Xremote.log
>
> The xorg.conf can be found here:
> http://www.tuxdesk.de/xorg.conf.2120
>
> One time a short Xorg log was written; I put it here:
> http://www.tuxdesk.de/Xorg.0.log
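For context, a dual-seat setup like this one needs one Device section per
card, selected by BusID, and one ServerLayout per seat. A rough sketch (the
identifiers and BusID values here are made-up placeholders; the real ones
come from lspci):

===snip
# One Device section per card; BusID selects the card.
# BusID values are placeholders - check `lspci | grep VGA` for yours.
Section "Device"
    Identifier "RadeonAGP"    # Radeon 9700 in the AGP slot
    Driver     "radeon"
    BusID      "PCI:1:0:0"
EndSection

Section "Device"
    Identifier "RadeonPCI"    # Radeon 7000 in the PCI slot
    Driver     "radeon"
    BusID      "PCI:2:9:0"
EndSection
===snip

Each seat then gets its own ServerLayout, and each server is started
separately, e.g. "X :1 -layout seatPCI vt8".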
Now I managed to get a longer log:
http://www.tuxdesk.de/Xorg.1.log
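These logs are captured by starting the server from a second machine over
ssh, so the output survives the lockup. Roughly like this (the hostname and
layout name are placeholders):

===snip
# From a second machine, so the output is not lost when the box freezes.
ssh user@dimension
X :1 -layout seatPCI -verbose 6 2>&1 | tee Xremote.log
===snip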
Last things I see:
===snip
(II) Loading /usr/lib/xorg/modules/libvgahw.so
(II) Module vgahw: vendor="X.Org Foundation"
compiled for 7.0.0, module version = 0.1.0
ABI class: X.Org Video Driver, version 0.8
(II) RADEON(0): vgaHWGetIOBase: hwp->IOBase is 0x03b0, hwp->PIOOffset is
0x0000
(==) RADEON(0): RGB weight 565
(II) RADEON(0): Using 6 bits per RGB (8 bit DAC)
(II) Loading sub module "int10"
(II) LoadModule: "int10"
(II) Loading /usr/lib/xorg/modules/libint10.so
(II) Module int10: vendor="X.Org Foundation"
compiled for 7.0.0, module version = 1.0.0
ABI class: X.Org Video Driver, version 0.8
(II) RADEON(0): initializing int10
(**) RADEON(0): Option "NoINT10" "true"
(--) RADEON(0): Chipset: "ATI Radeon VE/7000 QY (AGP/PCI)" (ChipID = 0x5159)
(--) RADEON(0): Linear framebuffer at 0xe0000000
(--) RADEON(0): VideoRAM: 8192 kByte (64 bit SDR SDRAM)
(II) RADEON(0): PCI card detected
(**) RADEON(0): Forced into PCI mode
(II) RADEON(0): Color tiling enabled by default
(II) Loading sub module "ddc"
(II) LoadModule: "ddc"
(II) Loading /usr/lib/xorg/modules/libddc.so
(II) Module ddc: vendor="X.Org Foundation"
compiled for 7.0.0, module version = 1.0.0
ABI class: X.Org Video Driver, version 0.8
(II) Loading sub module "i2c"
(II) LoadModule: "i2c"
(II) Loading /usr/lib/xorg/modules/libi2c.so
(II) Module i2c: vendor="X.Org Foundation"
compiled for 7.0.0, module version = 1.2.0
ABI class: X.Org Video Driver, version 0.8
(II) RADEON(0): I2C bus "DDC" initialized.
===snip
So I think something strange might be happening when the I2C bus is
initialized. If I understand it correctly, I2C is only needed to communicate
with the monitor, so to rule that out I would like to disable it. But
setting the Option "NoDDC" to "yes" did not help.
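In other words, the Device section for the PCI card currently looks roughly
like this (the BusID is a placeholder, as above):

===snip
Section "Device"
    Identifier "RadeonPCI"
    Driver     "radeon"
    BusID      "PCI:2:9:0"        # placeholder - real value from lspci
    Option     "NoINT10" "true"   # already set, see the log above
    Option     "NoDDC"   "yes"    # did not prevent the lockup
EndSection
===snip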
Any ideas?
Thorsten