[Intel-gfx] i915GM 2D+3D intel driver regression

Alan W. Irwin irwin at beluga.phys.uvic.ca
Fri Apr 30 20:28:03 CEST 2010


On 2010-04-30 08:45-0700 SD wrote:

> Dear all.
>
> I have been using Linux for two years now, on a Lenovo laptop with an Intel 915GM video chipset. With:
> (II) Loading /usr/lib/xorg/modules//drivers/intel_drv.so
> (II) Module intel: vendor="X.Org Foundation"
> 	compiled for 1.5.2, module version = 2.5.0
> 	Module class: X.Org Video Driver
> 	ABI class: X.Org Video Driver, version 4.1
> With the XAA acceleration method, when I watch an Xvid movie full screen, according to top:
> X uses 4% of CPU
> SMPlayer uses 10-12% of CPU.
>
> glxgears gives me:
> 3216 frames in 5.0 seconds = 643.117 FPS
>
>
>
> Now I tried Fedora13 (test) with intel driver:
> [    27.854] (II) Loading /usr/lib/xorg/modules/drivers/intel_drv.so
> [    27.855] (II) Module intel: vendor="X.Org Foundation"
> [    27.855]  compiled for 1.8.0, module version = 2.11.0
> [    27.855]  Module class: X.Org Video Driver
> [    27.855]  ABI class: X.Org Video Driver, version 7.0
>
> And can you imagine, glxgears gives me:
> 165 frames in 5.1 seconds = 32.612 FPS
>
> Watching an Xvid movie:
> X uses ~40% of CPU
> SMPlayer uses ~10-20% of CPU.
> Even when I just switch workspaces, X uses ~20% of CPU - there is no 2D acceleration at all.
>
> After all of this I would like to ask:
> Do you respect the customers who use Linux?
> Does anyone test your driver and UXA on the i915?
> Why, just why, did the developers throw XAA out of the driver? Your UXA performs the same as EXA did - awful. Awful with 3D and, more importantly, awful with 2D.
>
> Why can't the developers just leave alone what worked well for the i915?
> Why was it necessary to break everything? It looks like you just renamed EXA to UXA.
>
> I do not know about other Intel chipsets, but the i915 is really slow on UXA with both the new and the previous driver.
>
> So, for the i915GM the new driver is a BIG, BIG REGRESSION and a big step backward.

Personally, I think you were a little hard on the Intel developers.  We
should all cut them some slack so they have the freedom to get on with the
huge X stack changes that have been necessary over the last several years to
support the capabilities of modern video chipsets (including Intel ones).

However, I think those developers entirely agree with you that _eventually_
these large X stack changes should be refined to the point where they do not
severely hurt performance on older hardware.  The Intel developers have
given reassurances on exactly this point in the past.  Clearly, judging from
your Xvid and smplayer numbers (they will dismiss the glxgears numbers for
reasons that have been stated many times), they are currently doing poorly
at this, and that is quite worrying.  For my part, I am sticking with XAA
for my older G33 Intel chipset on the Debian stable X stack because of speed
and stability concerns with the new X stack and new intel driver, and your
post has reaffirmed that decision.  But neither of us (nor the other users
of older Intel hardware out there) can run old distributions forever, so I
hope the Intel developers will reassure us once again, in answer to your
post, that _soon_ (rather than "eventually") they will address the
real-world (as opposed to glxgears) performance regressions relative to the
old X stack and XAA.
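
For anyone in the same position who wants to stay on the old acceleration
path while their driver still supports it, the method can be pinned
explicitly in xorg.conf.  A minimal sketch, assuming an intel driver series
(2.x or earlier) that still accepts XAA as an AccelMethod; the Identifier
string is arbitrary:

Section "Device"
    Identifier  "Intel Graphics"
    Driver      "intel"
    # Force the old XAA acceleration path.  Newer intel drivers that
    # have dropped XAA will ignore or reject this and use UXA instead.
    Option      "AccelMethod" "XAA"
EndSection

Note also that on newer stacks glxgears is often synced to vblank by
default; running it as `vblank_mode=0 glxgears` (a Mesa environment
variable) removes that cap, which is one more reason raw glxgears numbers
are not treated as a meaningful benchmark.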

Alan
__________________________
Alan W. Irwin

Astronomical research affiliation with Department of Physics and Astronomy,
University of Victoria (astrowww.phys.uvic.ca).

Programming affiliations with the FreeEOS equation-of-state implementation
for stellar interiors (freeeos.sf.net); PLplot scientific plotting software
package (plplot.org); the libLASi project (unifont.org/lasi); the Loads of
Linux Links project (loll.sf.net); and the Linux Brochure Project
(lbproject.sf.net).
__________________________

Linux-powered Science
__________________________


