mersenneforum.org > Great Internet Mersenne Prime Search > Hardware > GPU Computing
Old 2015-01-10, 16:01   #78
Roy_Sirl
Sep 2002
Cornwall, UK
2·5 Posts

For anyone interested here are the GPU-Z screenshots for my GTX 980 running mfaktc and then running CudaLucas.
Attached Thumbnails: mfaktc-gtx980.gif, culu-gtx980.gif
Old 2015-01-10, 16:18   #79
jasonp
Tribal Bullet
Oct 2004
6721₈ Posts

Quote:
Originally Posted by Mark Rose View Post
Perhaps code changes could be made to reduce CUDALucas's FP64 usage.
Sorry, the LL test has got to have double precision if you're using floating point transforms. We (George and I) briefly considered using integer number-theoretic FFTs which would get the full throughput of the new GPUs, but I haven't seen a hardware configuration where that would be faster, in 15 years of trying. There's also the danger that in the time it takes to build GPU-optimized number-theoretic FFTs, the next model of GPU will be released which makes them unnecessary :)
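To make the trade-off concrete, here is a minimal sketch of what an integer number-theoretic FFT looks like. This is not CUDALucas or Prime95 code; the prime P = 2^64 − 2^32 + 1 and the generator 7 are common NTT-friendly choices assumed here purely for illustration. Every operation is exact integer arithmetic mod P, so there is no double-precision rounding error at all — the catch jasonp describes is that GPUs push this kind of 64-bit integer/modular arithmetic through at far lower throughput than their floating-point units.

```python
# Illustrative radix-2 number-theoretic transform over Z/PZ.
# P = 2^64 - 2^32 + 1 and generator G = 7 are assumed common choices,
# not anything taken from CUDALucas.
P = 0xFFFFFFFF00000001  # 2^64 - 2^32 + 1; (P-1) is divisible by 2^32
G = 7                   # a primitive root mod P

def ntt(a, invert=False):
    """In-place iterative NTT; len(a) must be a power of two <= 2^32."""
    n = len(a)
    # Bit-reversal permutation.
    j = 0
    for i in range(1, n):
        bit = n >> 1
        while j & bit:
            j ^= bit
            bit >>= 1
        j |= bit
        if i < j:
            a[i], a[j] = a[j], a[i]
    # Butterfly passes.
    length = 2
    while length <= n:
        w_len = pow(G, (P - 1) // length, P)   # primitive length-th root
        if invert:
            w_len = pow(w_len, P - 2, P)       # its modular inverse
        for start in range(0, n, length):
            w = 1
            for k in range(start, start + length // 2):
                u = a[k]
                v = a[k + length // 2] * w % P
                a[k] = (u + v) % P
                a[k + length // 2] = (u - v) % P
                w = w * w_len % P
        length <<= 1
    if invert:
        n_inv = pow(n, P - 2, P)
        a[:] = [x * n_inv % P for x in a]
    return a
```

Squaring via pointwise products of the transform — `ntt([v*v % P for v in ntt(digits)], invert=True)` — gives an exact convolution, which is exactly what an LL squaring needs and what the floating-point FFT only approximates within its error bounds.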
Old 2015-01-10, 18:16   #80
stars10250
Jul 2008
San Francisco, CA
3·67 Posts

Quote:
Originally Posted by Roy_Sirl View Post
For anyone interested here are the GPU-Z screenshots for my GTX 980 running mfaktc and then running CudaLucas.
So it's a voltage limit that's capping the power consumption at 72% when running CUDALucas. I don't run into that.
Old 2015-01-11, 04:46   #81
kladner
"Kieren"
Jul 2011
In My Own Galaxy!
10011110101110₂ Posts

Quote:
Originally Posted by Roy_Sirl View Post
For anyone interested here are the GPU-Z screenshots for my GTX 980 running mfaktc and then running CudaLucas.
That is very interesting. I do note that the memory controller is working much harder for CuLu.
Old 2015-01-11, 04:54   #82
LaurV
Romulan Interpreter
Jun 2011
Thailand
2·3·5·313 Posts

Hm.. What expo were you doing, which uses only 50% of the memory? Or is this normal for 980 cards?
My (unlocked) Titans maximize both the gpu and the memory when LL, when set to double precision.
Old 2015-01-11, 05:23   #83
axn
Jun 2003
11·449 Posts

Quote:
Originally Posted by LaurV View Post
Hm.. What expo were you doing, which uses only 50% of the memory? Or is this normal for 980 cards?
My (unlocked) Titans maximize both the gpu and the memory when LL, when set to double precision.
Maxwells have a large L2 cache (2MB vs 256KB) and much more crippled DP throughput.
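A rough back-of-envelope shows why cache size matters so little and memory bandwidth so much at these FFT lengths. The buffer count of two here (data plus scratch) is an assumption for illustration, not CUDALucas's actual allocation:

```python
# Rough working-set estimate for a double-precision FFT of length n.
# buffers=2 (data + scratch) is an assumed figure, not CUDALucas's
# real buffer layout.
def fft_working_set_bytes(fft_len, bytes_per_elem=8, buffers=2):
    """Approximate bytes touched per squaring pass."""
    return fft_len * bytes_per_elem * buffers

mib = fft_working_set_bytes(4 * 1024 * 1024) / 2**20
print(mib, "MiB")  # 64.0 MiB
```

At roughly 64 MiB per squaring, even a 2MB L2 holds only a sliver of the working set, so every iteration streams the transform through DRAM — consistent with the heavy memory-controller load visible in the GPU-Z screenshots.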

Last fiddled with by axn on 2015-01-11 at 05:24
Old 2015-01-12, 00:19   #84
stars10250
Jul 2008
San Francisco, CA
311₈ Posts

780Ti CUDALucas GPU-Z, 69.5M, 4M FFT, 3.6 ms/iter
Attached Thumbnails: 780Ti_GPUZ_CUDALucas.jpg
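For scale, those figures pin down the total test time: an LL test of 2^p − 1 takes p − 2 squarings, so (a quick check, using the rounded numbers from the post above):

```python
# Total LL test time from the figures reported above.
# p is rounded to 69.5M; the actual exponent differs slightly.
p = 69_500_000       # exponent, approximate
ms_per_iter = 3.6    # per-squaring time reported for the 780Ti
total_seconds = (p - 2) * ms_per_iter / 1000
print(round(total_seconds / 86400, 1), "days")  # 2.9 days
```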

Last fiddled with by stars10250 on 2015-01-12 at 00:44
Old 2015-01-12, 10:04   #85
LaurV
Romulan Interpreter
Jun 2011
Thailand
10010010101110₂ Posts

Quote:
Originally Posted by stars10250 View Post
780Ti CUDALucas GPU-Z, 69.5M, 4M FFT, 3.6 ms/iter
For this card, can you switch to DP? (i.e. do you have the option in nvidia control center/panel?)
I don't believe that the clock is for a DP-enabled card. For Titan, when I enable DP, the clock goes down from ~1200 to ~850.
Just curious, thanks in advance.

Last fiddled with by LaurV on 2015-01-12 at 10:04 Reason: s/for/from/
Old 2015-01-12, 16:11   #86
stars10250
Jul 2008
San Francisco, CA
3·67 Posts

Quote:
Originally Posted by LaurV View Post
For this card, can you switch to DP?
No. There is a CUDA setting, but it's just for specifying which GPUs can be used by CUDA applications. Mine is set to use both video cards by default. I've read elsewhere that you do have to turn on double precision with the Titan; it is disabled by default because it reduces performance in non-DP applications.
Old 2015-01-12, 20:38   #87
stars10250
Jul 2008
San Francisco, CA
3×67 Posts

Quote:
Originally Posted by LaurV View Post
I don't believe that the clock is for a DP-enabled card.
It was running CUDALucas at the time of the screenshot and that makes extensive use of DP. I get the same clock numbers in MSI Afterburner overclock software, ASUS GPU tweak software, and in CPU-Z. I've seen the clock jump down when I unload it by turning off the CUDALucas calculation, but as soon as I launch the program it jumps back up to full speed.
Old 2015-01-12, 20:42   #88
kladner
"Kieren"
Jul 2011
In My Own Galaxy!
2×3×1,693 Posts

Quote:
Originally Posted by stars10250 View Post
It was running CUDALucas at the time of the screenshot and that makes extensive use of DP. I get the same clock numbers in MSI Afterburner overclock software, ASUS GPU tweak software, and in CPU-Z. I've seen the clock jump down when I unload it by turning off the CUDALucas calculation, but as soon as I launch the program it jumps back up to full speed.
This is power saving kicking in: the clocks go down, and the voltage along with them.
Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.