xf86-video-ati-6.7.195, no DVI output
alexdeucher at gmail.com
Fri Oct 26 06:32:56 PDT 2007
On 10/26/07, James C. Georgas <jgeorgas at georgas.ca> wrote:
> On Thu, 2007-25-10 at 21:26 -0400, Alex Deucher wrote:
> > On 10/25/07, James C. Georgas <jgeorgas at georgas.ca> wrote:
> > > By the way, I tried his xvidtune fix, but I kept getting bad modeline
> > > errors, even with numbers that match the current mode. I guess this is
> > > another randr change?
> > Probably. I haven't looked at xvidtune, but I suspect it's not randr
> > aware. In theory it should work with the compatibility crtc, but I
> > haven't tested it.
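[Since xvidtune predates RandR 1.2, the RandR-aware way to test a custom mode at runtime is xrandr itself. A rough sketch; the output name DVI-1 matches this thread, but the modeline numbers below are illustrative, not taken from the message:]

```
# Define a new mode from a modeline (timings here are made up;
# generate real ones with e.g. `cvt 1600 1200 60`).
xrandr --newmode "1600x1200_test" 130.25 1600 1648 1680 1760 1200 1203 1207 1235
# Attach the mode to the DVI output and switch to it.
xrandr --addmode DVI-1 "1600x1200_test"
xrandr --output DVI-1 --mode "1600x1200_test"
```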
> Anyway, I got my low bandwidth mode to be recognized. I found that I had
> to explicitly associate my monitor section with the video output:
> Option "Monitor-DVI-1" "VP201s"
> If I instead let the server associate the monitor section by default,
> then it would ignore my PreferredMode directive and my custom modeline
> in the monitor section. Go figure.
> Also, I had to give my custom modeline a custom name. If I tried to name
> it "1600x1200", in an attempt to override the bad EDID mode, the server
> would freeze on startup, black screen, remote kill -9 to recover. So I
> can augment the modelist, but not override it in the config file.
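[For anyone hitting the same problem, the working configuration described above would look roughly like this. The option name Monitor-DVI-1 and the identifier VP201s come from the message itself; the modeline timings and the mode name are illustrative placeholders:]

```
Section "Monitor"
    Identifier "VP201s"
    # Custom low-bandwidth mode. Note the distinct name: reusing
    # "1600x1200" to override the EDID mode froze the server on startup.
    Modeline "1600x1200_lowbw" 130.25 1600 1648 1680 1760 1200 1203 1207 1235
    Option "PreferredMode" "1600x1200_lowbw"
EndSection

Section "Device"
    Identifier "ATI"
    Driver "radeon"
    # Explicitly tie the DVI-1 output to the monitor section above;
    # with the default association the server ignored PreferredMode
    # and the custom modeline.
    Option "Monitor-DVI-1" "VP201s"
EndSection
```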
Is the EDID mode actually bad, or is it a result of the EDID sync
polarity parsing bug in xserver 1.3? If your EDID is wrong, we should
add a quirk for it in the xserver so the bad mode(s) can be fixed up
automatically. FWIW, I've fixed the IgnoreEDID option in the radeon
driver.