mersenneforum.org > Great Internet Mersenne Prime Search > Hardware > GPU Computing
Old 2017-08-26, 19:51   #1
ric
 
 
Jul 2004
Milan, Ita

2²·3²·5 Posts
mfaktc, linux & laptop temperatures

Got a new laptop, a "gaming" one (discrete GPU + reinforced fans), with an Nvidia GTX 1050 Ti, running a Debian derivative (Mint 18.2) with Nvidia drivers 375.82 (CUDA 8.0). I've been playing around with mfaktc, with the intention of giving CUDALucas a try as well.
The laptop sits on one of those aluminium stands to improve airflow. With default mfaktc.ini settings, I'm seeing temps around 54-56°C for the CPU and 70-71°C for the GPU, with the fans spinning at full throttle and no other applications running.
If it were a desktop machine, I'd probably be fine with these readings.
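For the record, I'm reading these values with nvidia-smi and lm-sensors (assuming both are installed; `sensors` comes from the lm-sensors package):

```shell
# GPU temperature and utilization, refreshed every 5 seconds
nvidia-smi --query-gpu=temperature.gpu,utilization.gpu --format=csv -l 5

# CPU core temperatures (lm-sensors)
sensors
```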

I've then been playing with mfaktc.ini settings to reduce average temps, since I wouldn't want to fry this new toy too (more on this in a soon-to-be Unhappy Me posting):
  • as a first attempt, I set NumStreams=1 (down from the default 3) and managed to lower GPU temps by 1-2°C - not enough for my taste
  • I then tried relocating the sieving to the CPU (SieveOnGPU=0), which lowered GPU temps to around 60-65°C but at the same time raised CPU temps to around 78-80°C - most probably unsafe in the mid-to-long term
  • finally, I haven't found a parameter equivalent to mprime's undocumented "Throttle=n", so my educated guess is that no such parameter exists for mfaktc
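For reference, the relevant mfaktc.ini lines I touched (the comments are mine, not from the stock file):

```ini
# Fewer concurrent CUDA streams -> slightly lower GPU load/temperature
NumStreams=1
# 1 = sieve on the GPU (default), 0 = sieve on the CPU
# (moves heat from GPU to CPU, as observed above)
SieveOnGPU=0
```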

So I'm looking for recommendations: is there another way to trade off throughput for more reasonable CPU/GPU temperatures? Or is my idea of running GPU crunching on a laptop, albeit a "gaming" one, a recipe for a soon-to-be silicon BBQ?

TIA, r.
Old 2017-08-26, 22:27   #2
Mark Rose
 
 
"/X\(‘-‘)/X\"
Jan 2013

3²·11·29 Posts

70°C is not unreasonable for the GPU.
Old 2017-08-27, 01:26   #3
thyw
 
Feb 2016
! North_America

2×5×7 Posts

It's fine, but if it worsens over time (a few years) you can try downclocking the GPU. I don't know the relative impact of the memory vs. core clocks on mfaktc, but the memory clock may not matter much (I'm not sure where the bottlenecks are).
I'm pretty sure CUDALucas is memory-bound and generates far more heat than mfaktc.

Last fiddled with by thyw on 2017-08-27 at 01:28
Old 2017-08-27, 02:03   #4
Mark Rose
 
 
"/X\(‘-‘)/X\"
Jan 2013

3²×11×29 Posts

Quote:
Originally Posted by thyw
It's fine, but if it worsens over time (few years) you can try downclocking the gpu. I don't know the impact of memory or core on mfaktc, but memory may not have a big impact on it. (not sure about the bottlenecks)
I'm pretty sure CudaLucas is memory dependent and generates way more heat than mfaktc.
mfaktc is entirely core-clock bound. Memory makes no difference.
Old 2017-08-27, 08:53   #5
LaurV
Romulan Interpreter
 
 
Jun 2011
Thailand

3⁴×109 Posts

Quote:
Originally Posted by thyw
I'm pretty sure CudaLucas is memory dependent and generates way more heat than mfaktc.
Actually, it's the other way around. CuLu waits for memory, and in that time the silicon cools. On all my cards where CuLu can run, the temperature is 1-5 degrees lower than what mfaktc can push.
Old 2017-08-27, 10:13   #6
ric
 
 
Jul 2004
Milan, Ita

2²·3²·5 Posts

Quote:
Originally Posted by Mark Rose
70° is not unreasonable for the GPU.
Yes, that's what I understand from other posts around here. I'm just a bit concerned about 24x7 sustainability (the keyboard, just above the GPU, is between warm and hot).

Quote:
Originally Posted by thyw
It's fine, <snip> you can try downclocking the gpu
This is what I'd be looking for: how do you do it? (There are no options in the BIOS, and the Nvidia control panel has no clear options for this; the closest is PowerMizer, with "adaptive" and "max performance" modes, currently set to "adaptive".)
I understand the Windows version might be more customizable in this respect.
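From what I've gathered so far, the driver-level knobs on Linux would be something like the following - untested here, and the power-limit call is reportedly often unsupported on laptop GPUs, so treat this as a sketch:

```shell
# Enable clock controls in the X driver (writes xorg.conf; needs an X restart)
sudo nvidia-xconfig --cool-bits=8

# Cap the board power limit, in watts (root; frequently "Not Supported" on laptops)
sudo nvidia-smi -pl 50

# Or apply a negative core-clock offset at the highest performance level
nvidia-settings -a "[gpu:0]/GPUGraphicsClockOffset[3]=-200"
```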

Quote:
Originally Posted by LaurV
For all my cards where CuLu can run, the temperature will be 1-2-5 degrees lower than what mfaktc can push.
I'll definitely give it a try.
Old 2017-08-27, 18:53   #7
kladner
 
 
"Kieren"
Jul 2011
In My Own Galaxy!

5·1,997 Posts

I find CuLu runs cooler, but had a different theory as to why. On a GTX 1060, the (overclocked) GPU only reaches 80% power, regardless of the power limit set in Afterburner. This suggests to me that the throttled FPU keeps the whole card in check.
Old 2017-08-31, 15:03   #8
kriesel
 
 
"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest

2²·3²·127 Posts

Quote:
Originally Posted by ric
Yes, that's what I understand from other posts around here. I'm just a bit concerned about sustainability 24x7 <snip>
This is what I'd be looking after: how do you do it? <snip>
I'll definitively give it a try.
On Windows, try MSI Afterburner or EVGA Precision XOC to make adjustments. GPU-Z or HWMonitor are possibilities for monitoring your GPUs without making changes. Note: I have occasionally seen GPUs revert to the factory default clock from the underclocked value I set, perhaps on a program restart, even while XOC was still running. Keep an eye on it.

The Linux side seems less promising. https://devtalk.nvidia.com/default/t...-under-linux-/
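On Linux, even without clock control, a small watchdog around nvidia-smi can at least keep an eye on things. A minimal sketch (assumes nvidia-smi is on the PATH; the 80°C threshold is an arbitrary example, not a vendor spec):

```python
#!/usr/bin/env python3
"""Poll GPU temperatures via nvidia-smi and flag any that run hot."""
import subprocess
import time


def parse_temps(csv_output):
    """Parse the output of
    `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader`
    into a list of integers, one per GPU."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]


def read_temps():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        text=True,
    )
    return parse_temps(out)


def monitor(threshold=80, interval=30):
    """Print each GPU's temperature every `interval` seconds,
    flagging readings at or above `threshold` degrees C."""
    while True:
        for idx, temp in enumerate(read_temps()):
            flag = "  <-- above threshold" if temp >= threshold else ""
            print("GPU {}: {} C{}".format(idx, temp, flag))
        time.sleep(interval)

# Call monitor() to start polling; it runs until interrupted.
```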
Old 2017-08-31, 20:01   #9
kladner
 
 
"Kieren"
Jul 2011
In My Own Galaxy!

5·1,997 Posts

There are folks around here who have gotten extended control under Linux via a GPU BIOS mod.