<html>
<head>
<base href="https://bugs.freedesktop.org/" />
</head>
<body>
<p>
<div>
<b><a class="bz_bug_link
bz_status_NEW "
title="NEW - bfgminer --scrypt on 7xxx+"
href="https://bugs.freedesktop.org/show_bug.cgi?id=72785#c36">Comment # 36</a>
on <a class="bz_bug_link
bz_status_NEW "
title="NEW - bfgminer --scrypt on 7xxx+"
href="https://bugs.freedesktop.org/show_bug.cgi?id=72785">bug 72785</a>
from <span class="vcard"><a class="email" href="mailto:linux-user-2015@yandex.com" title="Linux User <linux-user-2015@yandex.com>"> <span class="fn">Linux User</span></a>
</span></b>
<pre>Got similar issues when trying scrypt. I have relatively recent bfgminer (~1
month old git build) and more or less recent graphics stack: 3.17 kernel, mesa
10.4 pre-release from Oibaf PPA and LLVM 3.5.1
I have bunch of HD5000 based cards, mostly HD5750/5770 and somesuch and R9 270.
Tests show bfgminer computes SHA256 correctly, reaching about 80% of
Catalyst's speed on the 57xx cards and doing even better than that on the
R9 270.
However, things are much worse when it comes to scrypt. My impression is
that the open driver runs into trouble when something uses GPU VRAM
aggressively.
Some observations about the open driver stack so far:
- If the scrypt intensity is set too high, there is a high risk the GPU
locks up fatally and crashes (this can happen on both the 57xx and the R9).
- It is also exceptionally unsafe to try SHA256 with vectors=4 on the HD
57xx. That setting is much slower than the other vector widths anyway, but
the instability is still an indication of some technical problem lurking
around.
- At reasonable intensity levels it does build the kernels and compute, BUT
--benchmark ***NEVER*** accepts the computed blocks: a 100% reject rate.
Reducing the intensity lowers the error rate, but there are still no
accepted blocks. If I run the CPU at the same time, it computes several
blocks in a time frame where the GPU gives no results at all. This
indicates the computations are simply going wrong.
P.S. IMO bfgminer is a really worthy program to add to automated
tests/regression checks, etc.
P.P.S. What about better fan/intensity control? The problem is that when
the ambient temperature is warm, around 25-26C, the 57xx cards make really
annoying noise. Heat can be reduced somewhat by lowering the intensity, but
that works poorly and is not fine grained. There is also an overheat +
hysteresis setting that can be (ab)used to cool the GPU a bit, but it makes
the fan speed oscillate up and down, which is also really annoying to hear.
Is there any proper tooling to hint DPM about a maximum acceptable fan rate
or a maximum allowed GPU frequency, comparable to what Catalyst offers
through ADL? The preferred behavior would be to reduce GPU core clocks when
the ambient temperature is high and raise them when it is low and the
TDP/fan setup permits. This is possible with Catalyst, but Catalyst really
stinks and I want to get rid of it.</pre>
</div>
</p>
<hr>
<span>You are receiving this mail because:</span>
<ul>
<li>You are the assignee for the bug.</li>
</ul>
</body>
</html>