[Intel-gfx] [i845G] stuck in 1024x768
Alan W. Irwin
irwin at beluga.phys.uvic.ca
Sun Jun 13 17:40:17 PDT 2010
On 2010-06-13 19:13-0400 Felix Miata wrote:
> On 2010/06/13 23:10 (GMT+0100) Andy Lutomirski composed:
>> On Jun 13, 2010, at 9:49 PM, Felix Miata <mrmazda at earthlink.net> wrote:
>>> Using openSUSE 11.3M7 (1.8.0 server/2.11.0 AFAICT) I've been unable
>>> to figure
>>> out how to get the server to obey xorg.conf entries for NoDDC,
>>> PreferredMode or DisplaySize.
>>> xorg.conf as last modified by me:
Here is how I configured PreferredMode _for an old server_ (Debian Lenny)
in the Monitor section:
#gtf 1024 768 85
# 1024x768 @ 85.00 Hz (GTF) hsync: 68.60 kHz; pclk: 94.39 MHz
Modeline "1024x768_85.00" 94.39 1024 1088 1200 1376 768 769 772 807 -HSync +Vsync
Option "PreferredMode" "1024x768_85.00"
I found in the past that PreferredMode would not work with Intel if you used
a standard modeline name such as "PreferredMode" "1600x1200", as you do in
your xorg.conf. Instead, I suggest you use gtf to calculate a 1600x1200
modeline and use the generated non-standard modeline name, which carries a
suffix corresponding to the vertical refresh rate. No guarantees, but
specifying a special modeline like the one above, with a non-standard
modeline name, was the only way I could get PreferredMode to work in the
past, and it is possible those constraints on PreferredMode still apply for
modern X servers. Anyhow, it is worth a try.
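Putting that recipe together for your 1600x1200 case, the Monitor section
would look something like the sketch below. The Identifier is a placeholder
(use whatever name your Device/Screen sections reference), and the Modeline
numbers must come from gtf's actual output on your system, so I have not
invented them here:

Section "Monitor"
    Identifier "Monitor0"
    # Generate the mode with:  gtf 1600 1200 85
    # Then paste the exact Modeline line that gtf prints, which has the form:
    # Modeline "1600x1200_85.00" <pclk> <hdisp> <hsyncstart> <hsyncend> <htotal>
    #          <vdisp> <vsyncstart> <vsyncend> <vtotal> -HSync +Vsync
    # Note the non-standard name with the refresh-rate suffix; in my
    # experience a plain "1600x1200" did not satisfy PreferredMode on Intel.
    Option "PreferredMode" "1600x1200_85.00"
EndSection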
> Just as importantly, I've not yet figured out why anyone should have to do
> manually (presuming they can even figure out how) what used to work
> automatically. For years, no modelines in xorg.conf were required, and X
> just used the first usable entry on the applicable modes line in 'Section
> "Screen"'. Later someone decided a PreferredMode entry in 'Section "Monitor"'
> was required to perform the same function, but now it no longer works.
> Supposedly the overhaul of X begun two years ago was to make
> operation/startup/configuration (?more?) automatic, not less, but I, always
> using Trinitron CRTs, have only observed quite the contrary so far. X for me
> has regressed from the jet age back to piston engined biplanes without
> electric starters.
Like you, I hope the Intel jet age comes back soon.
By the way, my 15-year-old Trinitron finally gave up the ghost earlier this
year and I replaced it with an LED-backlit ASUS LCD for $130. That was a
superb deal, and I can say that the new monitor is better in all respects
(brightness, colours, resolution, and size) except for width of viewing
angle. However, I doubt very much it will last as long as Trinitrons do; the
Trinitrons are not that much worse in quality; and the "use it up, wear it
out" philosophy helps the environment (and the bank balance). Thus, I am
hanging on to my remaining 10-year-old Trinitron monitor until it also dies.
particular monitor is attached to a computer with SIS video chipset (ugh),
but when that computer fails I won't replace it with Intel unless
PreferredMode works properly. So I hope my suggestion above works for you,
but if not, I hope the Intel developers get PreferredMode working.
Alan W. Irwin
Astronomical research affiliation with Department of Physics and Astronomy,
University of Victoria (astrowww.phys.uvic.ca).
Programming affiliations with the FreeEOS equation-of-state implementation
for stellar interiors (freeeos.sf.net); PLplot scientific plotting software
package (plplot.org); the libLASi project (unifont.org/lasi); the Loads of
Linux Links project (loll.sf.net); and the Linux Brochure Project