xf86-video-ati-6.7.195, no DVI output

James C. Georgas jgeorgas at georgas.ca
Fri Oct 26 07:39:05 PDT 2007


On Fri, 2007-26-10 at 09:32 -0400, Alex Deucher wrote:
> Is the edid mode actually bad or is it a result of the edid sync
> polarity parsing bug in xserver 1.3?  If your edid is wrong, we should
> add a quirk to it in the xserver so the bad mode(s) can be fixed up
> automatically.  FWIW, I've fixed the IgnoreEDID option in the radeon
> driver.
> 

Well, I'm using git xorg, so I think it must be the EDID mode. I don't
know how to read a hex EDID block, so I can't tell if the data is wrong,
or if the driver is just parsing it wrong. It's only an issue with a
digital signal. Analog is fine.
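For what it's worth, the declared max pixel clock is easy to pull out of a raw EDID dump by hand. The sketch below (assumptions: a full 128-byte base EDID block, offsets per the VESA EDID 1.3 layout; the function name and the synthetic test block are my own, not from the VP201s):

```python
def max_pixclock_mhz(edid):
    """Return the max pixel clock in MHz from the EDID monitor
    range-limits descriptor, or None if no such descriptor exists."""
    assert len(edid) >= 128, "need the 128-byte base EDID block"
    for off in (54, 72, 90, 108):      # the four 18-byte descriptor slots
        d = edid[off:off + 18]
        # A display descriptor starts 00 00 00 <tag>; tag 0xFD marks
        # the monitor range limits descriptor.
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[9] * 10           # byte 9 is stored in 10 MHz units
    return None
```

Note that byte 9 is quantized to 10 MHz steps, so an EDID block can declare 170 MHz but not 155 MHz exactly; a driver reporting 155 MHz must be getting that limit from somewhere else (or applying its own derating).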

When I first got the monitor, I tested it on a friend's Windows box, and
I still saw the static and sync issues. I also tested a different brand
of monitor, and that one had no trouble with the same modeline. Finally,
I tested the monitor with an nvidia card using the proprietary nvidia
driver, and the nvidia driver reported an EDID max pixclock of 155MHz,
compared to the 170MHz reported by both radeon and fglrx.

I would expect the EDID data to have different pixclock ranges for
analog and digital signals, but I haven't seen any server log output
that indicates that this is the case.

In any case, I already posted some requested log output to the "Is
IgnoreEdid broken?" thread, so that a quirk could be added for the
Viewsonic VP201s.
