[Bug 207693] amdgpu: RX 5500 XT boost frequency out of spec

bugzilla-daemon at bugzilla.kernel.org bugzilla-daemon at bugzilla.kernel.org
Tue May 12 13:26:32 UTC 2020


--- Comment #2 from Jan Ziak (http://atom-symbol.net) (0xe2.0x9a.0x9b at gmail.com) ---
(In reply to Alex Deucher from comment #1)
> The vbios defines the clock frequencies and nominal voltages, not the
> driver.  The voltage is changed dynamically at runtime based on frequency
> and power and individual board leakage so you will see slight variations at
> runtime depending on the board.

particlefire from the Vulkan demos (https://github.com/SaschaWillems/Vulkan) is
an app with relatively high power consumption (higher than the Aida64 GPU
stability test). On my machine and display it runs at about 1000 FPS in a
maximized window. I let it run for about 20 minutes, during which I manipulated
the GPU's fan speed.

According to /usr/bin/sensors, the GPU's junction/hotspot critical temperature
is 99°C, so I lowered the fan speed to below 1000 RPM in order to reach higher
temperatures. Even when the hotspot temperature was 105°C (6°C above critical)
and the GPU edge temperature was 86°C, the FPS of particlefire was unaffected
(still about 1000 FPS).

radeontop (https://github.com/clbr/radeontop) showed 1885 MHz the entire time
during the testing.

In summary, I am unable to confirm your claim that the GPU is self-adjusting
its voltage or frequency on Linux.

If you know of an alternative approach (other than the one described above) to
verify that the GPU is dynamically changing voltage and frequency on Linux in
response to temperature and power consumption, please let me know.
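For reference, one way to watch the active clock and temperatures directly is to
poll amdgpu's standard sysfs interface while the workload runs. The sketch below
assumes the usual paths (card0, hwmon0, pp_dpm_sclk, temp1/temp2 for edge and
junction); the exact card/hwmon numbers vary per system, so treat the paths as
placeholders rather than a definitive tool:

```python
# Sketch: poll amdgpu's sysfs files to watch the active shader clock (sclk)
# and temperatures during a workload. Paths assume card0/hwmon0 -- adjust.
import re
import time
from pathlib import Path

def active_sclk_mhz(pp_dpm_sclk_text: str) -> int:
    """Parse pp_dpm_sclk output; the active DPM level is marked with '*'.
    Example line: '2: 1885Mhz *'"""
    for line in pp_dpm_sclk_text.splitlines():
        if line.strip().endswith("*"):
            m = re.search(r"(\d+)\s*mhz", line, re.IGNORECASE)
            if m:
                return int(m.group(1))
    raise ValueError("no active DPM level marked with '*'")

def poll(card="card0", hwmon="hwmon0", seconds=20, interval=1.0):
    dev = Path(f"/sys/class/drm/{card}/device")
    hw = dev / "hwmon" / hwmon
    for _ in range(int(seconds / interval)):
        sclk = active_sclk_mhz((dev / "pp_dpm_sclk").read_text())
        # hwmon temperatures are reported in millidegrees Celsius
        edge = int((hw / "temp1_input").read_text()) / 1000
        junction = int((hw / "temp2_input").read_text()) / 1000
        print(f"sclk={sclk} MHz  edge={edge:.0f}°C  junction={junction:.0f}°C")
        time.sleep(interval)
```

If the driver is throttling, the '*' in pp_dpm_sclk should move to a lower level
as temperatures rise; in the test above it stayed at 1885 MHz throughout.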
