#45
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
1010011001101₂ Posts
#46
Einyen
Dec 2003
Denmark
D7C₁₆ Posts
It does not really say anything about fan settings. "Maximum: 88 C" might mean that it shuts down at 88 °C.

You should really try MSI Afterburner; it is very easy: https://www.msi.com/Landing/afterburner http://download.msi.com/uti_exe/vga/...etup_4.6.0.zip

Just install it, hit the cog icon for settings, go to the "Fan" tab, tick "Enable user defined software automatic fan control", and set your fan speed curve the way you want it. I have mine hit 100% fan speed at 82 °C and 80% at 73 °C, but you can experiment a bit.

I do not recommend setting the fan update interval too short, since updating too frequently might wear out your fan; I have it at 30000 ms (30 sec). I have "Temperature hysteresis" at 2 °C, which acts as a buffer so the fan speed also does not change too often: if the temperature hits, say, 70 °C and the fan speed is set according to the curve, the fan speed will not change again until the temperature reaches either 72 °C or 68 °C. So if it flips back and forth between 69-70 °C or 70-71 °C, it will not continually change fan speed all the time.

Last fiddled with by ATH on 2018-12-01 at 07:23
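The hysteresis behaviour described above can be sketched in a few lines of Python. The curve points and the controller below are illustrative, not Afterburner's actual implementation; only the 2 °C hysteresis idea comes from the post:

```python
# Minimal sketch of a fan curve with temperature hysteresis.
# Curve points are hypothetical: idle 30%, 80% at 73 C, 100% at 82 C.
CURVE = [(0, 30), (73, 80), (82, 100)]  # (temp C, fan %)
HYSTERESIS = 2                           # degrees C, as in the post

def curve_speed(temp):
    """Linearly interpolate the fan % from the curve."""
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp <= t1:
            if temp <= t0:
                return s0
            return s0 + (s1 - s0) * (temp - t0) / (t1 - t0)
    return CURVE[-1][1]  # above the last point: pin at max

class FanController:
    def __init__(self):
        self.last_temp = None  # temperature at the last fan change
        self.speed = None

    def update(self, temp):
        # Only re-evaluate the curve when the temperature has moved
        # at least HYSTERESIS degrees since the last change, so small
        # flips (69-70, 70-71) leave the fan speed alone.
        if self.last_temp is None or abs(temp - self.last_temp) >= HYSTERESIS:
            self.last_temp = temp
            self.speed = curve_speed(temp)
        return self.speed
```

With this controller, a reading of 70 °C sets a speed, readings of 69 or 71 °C leave it untouched, and only 68 or 72 °C trigger a new lookup, which is exactly the buffering the post describes.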
#47
Undefined
"The unspeakable one"
Jun 2006
My evil lair
6,793 Posts
My fan runs at 0% at only 80 °C. I push it to 10% at 85 °C, 50% at 90 °C and 100% at 95 °C. It has been like that for the last 15 years, and it still works fine.
#48
Aug 2002
2×3²×13×37 Posts
#49
Einyen
Dec 2003
Denmark
3452₁₀ Posts
I'm not 100% sure it does. I had a graphics card whose fan wore out fairly quickly; it was the first card where I used a temperature curve to set the fan speed automatically. That was with Nvidia System Tools, which was not as advanced as Afterburner, and I noticed the fan was speeding up and slowing down often, so I assumed that was what caused it to wear out quicker. Granted, it was many years ago (I think it was a GeForce GTX 460), but I'm not the only one with that theory; I have read it on other forums and websites.

If you could cool it as well as you wanted, I think the best operating temperature in terms of speed would probably be around 30-50 °C.
#50
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
3×5²×71 Posts
GHz-days. It took 25 days.
#51
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
3·5²·71 Posts
RTX-20xx, that is, or a Titan RTX. I'd like to know how it is performing, and about any learnings or recommendations.

Last fiddled with by petrw1 on 2018-12-31 at 18:16
#52
"Sam Laur"
Dec 2018
Turku, Finland
317 Posts
I got an RTX 2080 at work yesterday. Officially it is for simple AI stuff, but it remains to be seen whether the application even makes sense in the end. Of course, that hasn't stopped me from running some, ahem, commissioning tests before actual work begins, and I guess there will be plenty of spare GPU cycles available in the future too.

Anyway, the platform I have is Debian Linux, and currently there isn't even a display connected to the machine. Maybe another choice would have been easier, but it was another learning experience. (In this case, any flavour of Windows wasn't an option.) Even in the "testing" branch of Debian, the NVidia drivers on offer are not new enough: 390.87 for the display driver and 9.1.85 for CUDA. So at the moment the only solution, besides changing distros, is to hack and slash the NVidia-supplied Linux drivers and install them into what they call an unsupported distribution (Ubuntu is supported, for example, and even then only some versions), and to make sure that Debian doesn't mess up the installation at some point. So now I have driver version 410.93 and CUDA 10.0.130. The CUDA package included a 410.x display driver (earlier than .93), but there were all sorts of problems getting it to install properly; luckily, the separate display driver package caused no problems of its own.

I haven't had much time running it yet, so I really haven't learned much and can't recommend many, if any, specific things. What I have compiled so far is CUDALucas 2.06Beta (svn 102 from Sourceforge) and mfaktc 0.21. Both perform pretty much as expected, and in the future I expect to concentrate on running mfaktc, because the CUDALucas performance just isn't that good.

A short session of power tweaking also produced the expected results: it is possible to get about 3000 GHz-d/d (GPUSieveSize=128) when running the card at the default 215 W power limit, but even a slight reduction in the performance target reduces the power consumption quite a bit, and the closer you get to the default setting, the less you gain. So at the moment I have set a power limit of 200 W and locked the SM clock at 1725 MHz. In actual mfaktc use it then draws about 170 W and produces a bit over 2800 GHz-d/d. It's fun for sure watching shorter assignments fly by in the terminal window.
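A quick back-of-the-envelope check of those throughput and power figures shows why backing off from the default power target pays. This is just illustrative arithmetic on the numbers quoted in the post:

```python
# mfaktc throughput/power figures quoted above for an RTX 2080.
default = {"power_w": 215, "ghzd_per_day": 3000}  # stock 215 W power limit
tuned   = {"power_w": 170, "ghzd_per_day": 2800}  # 200 W cap, SM clock locked at 1725 MHz

def efficiency(cfg):
    """GHz-days per day produced per watt drawn."""
    return cfg["ghzd_per_day"] / cfg["power_w"]

print(f"default: {efficiency(default):.2f} GHz-d/d per W")  # ~13.95
print(f"tuned:   {efficiency(tuned):.2f} GHz-d/d per W")    # ~16.47
```

The tuned setting gives up about 7% throughput for roughly 21% less power draw, which matches the observation that the last few watts toward the stock limit buy the least performance.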
#53
"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest
17220₈ Posts
Have you run a thorough (maximum footprint and multiple passes) cudalucas -memtest yet? (I saw a huge difference in tests a year apart on the same GPU. Date-stamp and log your test results.) Have you started or completed an LL double-check assignment?
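The date-stamp-and-log suggestion is easy to automate. In this sketch the helper is generic; the CUDALucas binary name and `-memtest` flag in the commented example are taken from the post, not verified here:

```python
import datetime
import pathlib
import subprocess

def logged_run(cmd, logdir="memtest-logs"):
    """Run a command and save its combined output to a timestamped log file."""
    stamp = datetime.datetime.now().strftime("%Y-%m-%d_%H%M%S")
    path = pathlib.Path(logdir)
    path.mkdir(exist_ok=True)
    logfile = path / f"memtest_{stamp}.log"
    result = subprocess.run(cmd, capture_output=True, text=True)
    logfile.write_text(result.stdout + result.stderr)
    return logfile

# Example invocation (binary name and flag assumed from the post):
# logged_run(["./CUDALucas", "-memtest"])
```

Keeping one file per run makes it trivial to diff this year's memtest results against last year's, which is the point of the advice above.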
#54
"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest
7824₁₀ Posts
#55
"Sam Laur"
Dec 2018
Turku, Finland
317 Posts
But the official status for NVidia OpenCL has been, for a long time now, "1.2 with some 2.0 features in beta". I guess they don't put much effort behind it, probably for marketing/business reasons rather than technical ones...
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post
Nvidia GTX 745 4GB ??? | petrw1 | GPU Computing | 3 | 2016-08-02 15:23
Nvidia Pascal, a third of DP | firejuggler | GPU Computing | 12 | 2016-02-23 06:55
AMD + Nvidia | TheMawn | GPU Computing | 7 | 2013-07-01 14:08
Nvidia Kepler | Brain | GPU Computing | 149 | 2013-02-17 08:05
What can I do with my nvidia GPU? | Surge | Software | 4 | 2010-09-29 11:36