[Intel-gfx] possible quirks addition
Adam Jackson
ajax at redhat.com
Mon Oct 19 21:03:35 CEST 2009
On Mon, 2009-10-19 at 16:53 +0000, The Fungi wrote:
> On Mon, Oct 19, 2009 at 12:05:43PM -0400, Adam Jackson wrote:
> > What version of the X server and intel driver are you using?
>
> From /var/log/Xorg.0.log:
>
> X.Org X Server 1.6.4
> Release Date: 2009-9-27
> <snip>
> (II) Loading /usr/lib/xorg/modules/drivers//intel_drv.so
> (II) Module intel: vendor="X.Org Foundation"
> 	compiled for 1.6.4, module version = 2.9.0
> 	Module class: X.Org Video Driver
> 	ABI class: X.Org Video Driver, version 5.0
Those should be new enough to get the extended EDID blocks. The kernel
support could be broken, I suppose. Do you get a fuller EDID block in
RANDR when running UMS (user modesetting)?
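(One quick way to check, assuming your driver exposes the EDID as an
output property; the exact formatting varies by driver:

  $ xrandr --verbose | grep -A16 EDID

A display with an extension block should show 256 bytes of hex there,
not just the bare 128-byte base block.)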
> I might not have been clear enough in my description here. I've
> actually broken into the "secret" Sony service technician menu on
> this set to readjust the picture inward so there is no overscanning
> for the modes I use. What seems to be happening here is that the
> kernel believes the 720x480 mode the TV advertises in its EDID
> block, but the television is then throwing away the left and right
> 40 pixel columns (leaving wide black "pillar box" bars in their
> place) to trim the width to 640 pixels, implying it thinks that 480p
> is only a 4:3 aspect format. Its 640x480 mode occupies the exact
> same region of the screen. Definitely can't blame overscanning for
> this one, I'm afraid.
That's pretty awesome. I repeat what I said earlier about TVs being
hateful though.
> > This is the real problem with your display though. It claims 1080i
> > as the preferred mode, but then that's getting filtered away for
> > some reason. I can't see any obvious reason for that in the kernel
> > code; does your X log say anything about "Not using mode [...]" ?
> >
> > If we got this right, then the 720x480 bug would be more or less
> > moot.
>
> No mention of the string "not using" in Xorg.0.log:
>
> kiosk at hastur:~$ grep -ci "not using" /var/log/Xorg.0.log
> 0
Ugh, you need to set Option "ModeDebug" "on" in xorg.conf to get that
logging back. I still think turning that off by default was a mistake.
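For the archives, that's a one-line addition to your Device section,
something like this (the Identifier string is whatever your existing
config already uses):

  Section "Device"
          Identifier "Configured Video Device"
          Driver     "intel"
          Option     "ModeDebug" "on"
  EndSection

With that set, the log spells out every mode the driver throws away
and why. However...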
> > It's almost certain we're getting this wrong in more than one way
> > though. We're not correcting for the way TVs encode interlaced
> > modes to match the way X expects it internally (your TV is saying
> > "540 lines, interlaced up by 2", which is technically more honest,
> > but X expects it to be "1080 lines interlaced down by 2").
>
> Ahh, so interlacing *is* supposed to work? I had tried adding a
> bunch of ATSC standard interlaced modes the TV appears to support
> from other consumer A/V equipment, but was never able to get the
> driver/chipset to output anything the TV felt like displaying.
I don't know of any intrinsic reason why it shouldn't work on intel kit,
particularly for HDMI; but it does appear that the X driver never
explicitly turns on interlace support, and the kernel driver explicitly
turns it off. Lovely. So ModeDebug would just tell you that interlace
is forbidden; don't bother trying to get it working as things stand.
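For what it's worth, the field-versus-frame fixup I described above
would look something like this sketch; the struct and flag names are
made up for illustration, not the real server or kernel types:

  /*
   * Convert an interlaced timing as the TV's EDID encodes it (per-field
   * line counts: 540 active lines for 1080i) into the frame-based form
   * X expects internally (1080 active lines plus the interlace flag).
   * Hypothetical types; the real fix belongs in the EDID parser.
   */
  struct timing {
      int vdisplay, vsync_start, vsync_end, vtotal;
      unsigned int flags;
  };

  #define TIMING_INTERLACE (1 << 0)

  static void frame_from_field_timing(struct timing *t)
  {
      if (!(t->flags & TIMING_INTERLACE))
          return;
      t->vdisplay    *= 2;      /* 540 -> 1080 */
      t->vsync_start *= 2;
      t->vsync_end   *= 2;
      t->vtotal      *= 2;
      t->vtotal      |= 1;      /* odd total: the extra half-line per frame */
  }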
It doesn't look too hard to add though. The description of the
PIPEACONF register in the 965 docs looks complete enough.
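Something along these lines, register-wise; the PIPEACONF offset is
real, but I'm writing the field encoding from memory of the PRM, so
double-check the bit values against the docs before believing them:

  #include <stdint.h>

  #define PIPEACONF               0x70008
  #define PIPECONF_INTERLACE_MASK (7 << 21)
  #define PIPECONF_PROGRESSIVE    (0 << 21)
  #define PIPECONF_INTERLACED     (6 << 21)  /* per my reading of the 965 docs */

  /* MMIO helpers assumed to exist elsewhere in the driver. */
  extern uint32_t reg_read(uint32_t reg);
  extern void reg_write(uint32_t reg, uint32_t val);

  static void pipe_a_set_interlace(int enable)
  {
      uint32_t conf = reg_read(PIPEACONF);

      conf &= ~PIPECONF_INTERLACE_MASK;
      conf |= enable ? PIPECONF_INTERLACED : PIPECONF_PROGRESSIVE;
      reg_write(PIPEACONF, conf);
  }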
- ajax