[ANNOUNCE] xf86-video-nv 2.1.9
Aaron Plattner
aplattner at nvidia.com
Mon May 12 22:05:35 PDT 2008
On Mon, May 12, 2008 at 03:25:09PM -0400, Adam Jackson wrote:
> On Fri, 2008-05-09 at 19:06 -0700, Aaron Plattner wrote:
>
> > This release adds some new product names to the list, fixes startup hangs on a
> > couple of GPUs, and adds an option -- AllowDualLinkModes -- to enable validation
> > of dual-link DVI modes. Note that not all GPUs are configured at boot to
> > support dual-link modes, so enable this option at your own risk.
>
> In fact, I've yet to see any G80 at all that powers up in dual-link
> mode. Even my FX4600, which you'd think has enough other power domains
> that dual link would be the least of its worries.
It varies a lot, and you'd be surprised at which devices can and can't do
dual-link out of the box. For example, it works on my MacBook and not on
my GeForce 8800 GTX. It depends heavily on the clock configuration at
boot, which doesn't really correspond in any meaningful way to how beefy
the chip is.
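
If you want to try it anyway, the option goes in the Device section of
xorg.conf, along these lines (the option name is the one from the
announcement; the Identifier is just a placeholder for your own setup):

    Section "Device"
        Identifier "nv"
        Driver     "nv"
        Option     "AllowDualLinkModes" "true"
    EndSection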
> AFAICT, the SOR setup path doesn't actually change output timings, just
> a scaler in front of the output itself. Which would mean you never
> actually transition between single-link and dual-link. So it would be
> pleasant if we could just check whether the output backend is already in
> dual-link mode, rather than have to play with magic config options. Of
> course my understanding of the SOR path could be completely wrong.

Untrue (for TMDS outputs, at least). The mode->Crtc* values are used as
the back-end timings and the non-Crtc versions are the front-end timings.
See, for example, ComputeAspectScale in g80_display.c.
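
Roughly, the relationship looks like this. This is only a sketch of the
idea, not the actual ComputeAspectScale code, and the struct below just
stands in for the relevant DisplayModeRec fields:

/* Sketch only, not the driver's ComputeAspectScale: the non-Crtc fields
 * are the front-end timings the CRTC scans out, the Crtc* fields are the
 * back-end timings the output drives, and the scaler maps the former onto
 * the latter while preserving the aspect ratio. */
struct mode_sketch {
    int HDisplay, VDisplay;         /* front-end (scan-out) size */
    int CrtcHDisplay, CrtcVDisplay; /* back-end (output) size */
};

static void aspect_scale_sketch(const struct mode_sketch *m, int *w, int *h)
{
    /* Compare aspect ratios by cross-multiplying to stay in integers. */
    if (m->HDisplay * m->CrtcVDisplay > m->VDisplay * m->CrtcHDisplay) {
        /* Source is relatively wider: fill the back-end width, letterbox. */
        *w = m->CrtcHDisplay;
        *h = m->VDisplay * m->CrtcHDisplay / m->HDisplay;
    } else {
        /* Source is relatively taller or equal: fill the height instead. */
        *w = m->HDisplay * m->CrtcVDisplay / m->VDisplay;
        *h = m->CrtcVDisplay;
    }
}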