ATI Radeon video performance drop when laptop heats up

Kirby C. Bohling kbohling at birddog.com
Wed Jun 29 14:12:31 PDT 2005


On Wed, Jun 29, 2005 at 10:57:48PM +0200, Jean-Michel POURE wrote:
> Dear all,
> 
> I am using xorg 2.6.8 on an ASUS laptop with a Radeon Mobility M7 LW graphics 
> card. The laptop is configured with Debian + Ubuntu xorg packages. 
> 
> I was too lazy to install xorg from source.
> 
> xorg is configured with DRI and XV. Playing a DVD under xine only uses about 
> 40% of the CPU. But when the laptop warms up, CPU usage climbs to 
> 100%. 
> 
> The performance of OpenGL software (like glxgears) stays the same, but the 
> performance of video playback drops dramatically.
> 
> Are you aware of any relation between laptop temperature and the performance 
> of a video card? Is there a way to play back video with good quality, even 
> when the laptop heats up?

P4 chips have an internal temperature sensor that drops the CPU speed
by roughly 50% as soon as the chip gets too hot.

This link talks about it (I have no idea whether I trust them, but they
describe what I remember reading elsewhere back when the P4s first
came out):
http://www.hardwaresecrets.com/article/104

I'm not sure whether that applies to all chips (including the Pentium
M) or not.  However, what you describe seems plausible.  The jump from
40% to 100% CPU usage as the temperature changes sounds like exactly
that issue, especially if glxgears isn't CPU intensive: throttling
would hurt the DVD decode, which is CPU intensive, but would leave
glxgears alone if, as I'm assuming, it mostly exercises the GPU.  I've
never run glxgears myself, so I can't say for sure.
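If you want to see whether the clock really is dropping when the
laptop gets hot, something like the little Python sketch below might
help.  It's untested and just an illustration: it assumes a Linux
/proc/cpuinfo that reports a "cpu MHz" line, and on some CPUs that
value doesn't change when the thermal monitor duty-cycles the clock,
so a flat reading doesn't rule throttling out.

    #!/usr/bin/env python
    # Rough sketch, not tested: sample the "cpu MHz" lines from
    # /proc/cpuinfo once a second so you can watch whether the
    # reported clock drops as the laptop heats up.  Assumes Linux.
    import re
    import time

    def cpu_mhz():
        # Return the "cpu MHz" value reported for each CPU.
        f = open("/proc/cpuinfo")
        try:
            text = f.read()
        finally:
            f.close()
        return [float(m.group(1))
                for m in re.finditer(r"cpu MHz\s*:\s*([\d.]+)", text)]

    if __name__ == "__main__":
        while True:
            print("%s %s" % (time.strftime("%H:%M:%S"), cpu_mhz()))
            time.sleep(1)

Run it in a terminal while you play the DVD; if the MHz numbers fall
as playback starts stuttering, thermal throttling is the likely
culprit.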

HTH
    Kirby


