Apple 23" Cinema HD Display vs. radeon
Benjamin Herrenschmidt
benh at kernel.crashing.org
Thu Mar 3 21:22:03 PST 2005
Ok, so I finally found some of the causes for the problems with the
Apple 23" Cinema HD Display and X.org.
Some are a bit weird, though (maybe Hui can shed some light here), and
some are more "classical".
- First, a weird one (does somebody have a DDC HW spec?): the monitor
switches to a power-managed state after the DDC probing done by radeonfb
or X.org. After tweaking around, it seems that our DDC code leaves
the clock and data lines "asserted", i.e. driven low (the DDC GPIO
register contains 0x00030000). Just "releasing" them (writing 0) and
suddenly the monitor comes back (and we then read 0x00000300 in there).
So I wonder if there is something wrong in our i2c DDC code...
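For reference, a rough sketch in C of what "releasing" means here. It
assumes the usual Radeon GPIO bit layout (output-enable bits at 16/17 hold
the lines low, the read-back bits at 8/9 show them high again once
released); the register pointer and macro names are placeholders, not the
actual radeonfb or X.org code:

#include <stdint.h>

#define GPIO_EN_CLK   (1u << 16)  /* output enable: drive clock low */
#define GPIO_EN_DATA  (1u << 17)  /* output enable: drive data low  */
#define GPIO_Y_CLK    (1u << 8)   /* read-back: clock line is high  */
#define GPIO_Y_DATA   (1u << 9)   /* read-back: data line is high   */

static int ddc_release_lines(volatile uint32_t *gpio_reg)
{
    /* If the EN bits are still set (0x00030000), clear them so the
     * lines float and the monitor's pull-ups bring them back high. */
    if (*gpio_reg & (GPIO_EN_CLK | GPIO_EN_DATA))
        *gpio_reg = 0;

    /* Once released we expect 0x00000300: both Y bits reading high. */
    return (*gpio_reg & (GPIO_Y_CLK | GPIO_Y_DATA)) ==
           (GPIO_Y_CLK | GPIO_Y_DATA);
}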
- Then, a nasty one: I tested on 2 machines, a G5 with an rv350 (or
rv360 if you believe the X PCI IDs) and a Mac Mini with an rv280. On
both, X seems to put different values in TMDS_PLL_CNTL (based on the
table built into the driver) than what the firmware uses. On the rv280
it doesn't make that much difference (and we always get green noise on
the edge of the screen, even in MacOS btw...), but on the R300 it
definitely makes one. X writes 0x1fbb01cb there, causing the display to
regularly "jump", while the firmware sets 0x1fbb0155. Looking at the
table in X, I see:
{{15000, 0xb0155}, {0xffffffff, 0xb01cb}, {0, 0}, {0, 0}}, /*CHIP_FAMILY_RV350*/
Obviously here, the first value is correct, the second one is not. The
actual pixel clock obtained from the EDID, afaik, is 154 MHz:
(**) RADEON(0): *Mode "1920x1200": 154.0 MHz (scaled from 0.0 MHz), 74.0 kHz, 59.9 Hz
(II) RADEON(0): Modeline "1920x1200" 153.97 1920 1968 2000 2080 1200 1203 1209 1235
So I wonder... is the "15000" in the table a bit too small?
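Assuming the lookup picks the first entry whose bound is above
mode->Clock / 10, the 153.97 MHz dot clock gives 15397, which is past the
15000 bound and lands on the 0xb01cb entry; raising the first bound above
15400 would make it pick 0xb0155 again. A rough reconstruction (the table
values are from the driver, the lookup function and names are my own
sketch, not the driver code verbatim):

#include <stdint.h>

struct tmds_pll {
    uint32_t freq;   /* upper pixel-clock bound, in 10 kHz units     */
    uint32_t value;  /* TMDS_PLL_CNTL value to use below that bound  */
};

static const struct tmds_pll rv350_tmds_pll[4] = {
    { 15000,      0xb0155 },  /* right value for the 23" panel...       */
    { 0xffffffff, 0xb01cb },  /* ...but 15397 > 15000, so this one wins */
    { 0, 0 },
    { 0, 0 },
};

static uint32_t tmds_pll_value(uint32_t dot_clock_khz)
{
    uint32_t freq = dot_clock_khz / 10;   /* 153970 kHz -> 15397 */
    int i;

    for (i = 0; i < 4; i++) {
        if (rv350_tmds_pll[i].freq == 0)
            break;
        if (freq < rv350_tmds_pll[i].freq)
            return rv350_tmds_pll[i].value;
    }
    return 0;
}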
- Finally, last but not least... on the rv280 Mac Mini, X.org won't see
the monitor EDID at all... because it's using DDC_CRT2! The machine has a
single DVI connector, which seems to have analog on TV_DAC (weird, not the
internal DAC), DDC on DDC_CRT2, and digital on the internal TMDS. On the
G5 it works thanks to the "ReverseDDC" option I added, which flips DVI_DDC
and VGA_DDC since they seem to be the opposite of the default. That leads
to my main problem lately: I need a way to know the output mapping on the
"Mac" cards (either motherboard chips or PCI/AGP cards). Can ATI help
here? Should I ask devrel (again)? Or build the table myself by trial &
error, having users test various configurations until I get something
sensible? The table should be based on the card's Open Firmware name
(which I can retrieve easily from radeonfb, and I'm thinking about a way
to pass that to X).
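For illustration, a rough sketch of what such a table could look like; the
Open Firmware names below are placeholders (not real card names) and the
field layout is just a guess based on the two machines above, the real
entries would have to come from ATI or from user reports:

enum ddc_line { DDC_NONE, DDC_DVI, DDC_VGA, DDC_CRT2 };
enum dac_type { DAC_NONE, DAC_INTERNAL, DAC_TVDAC };

struct mac_output_map {
    const char *of_name;     /* card's Open Firmware name, from radeonfb */
    enum ddc_line dvi_ddc;   /* which DDC line the DVI connector uses    */
    enum dac_type dvi_dac;   /* which DAC drives the DVI analog pins     */
    int dvi_int_tmds;        /* digital pins on the internal TMDS ?      */
};

static const struct mac_output_map mac_output_maps[] = {
    /* rv280 Mac Mini: analog on TV_DAC, DDC on DDC_CRT2, internal TMDS */
    { "ATY,MiniPlaceholder", DDC_CRT2, DAC_TVDAC,    1 },
    /* rv350 G5: DVI_DDC and VGA_DDC swapped vs. the driver defaults */
    { "ATY,G5Placeholder",   DDC_VGA,  DAC_INTERNAL, 1 },
};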
- Ok, that was not really "finally" :) But this one is more a fact than a
question: on the Mac Mini with the rv280, I always get some green noise on
the right edge of the display, whether the mode is set by the firmware
driver, by us, or in MacOS X. The noise is visible on black, and tends to
disappear if you display "light" things (but still, it comes back as soon
as you settle down on black). I haven't yet tried mucking around with the
pixel clock; I suspect we are simply at the limit of the internal TMDS
transmitter...
Regards,
Ben.