mersenneforum.org > Great Internet Mersenne Prime Search > Hardware > GPU Computing
Old 2017-06-07, 21:07   #1
evoflash
 
Dec 2012

2×13 Posts
Default CudaLucas Residual

Hello everyone, I'm starting out with CudaLucas, and ran a trial-factoring test to see if it was working OK. So far so good. However, when I ran an LL double-check I had a residue of 0x0000000000000002 and received an error message at 100%, which was confounding.

I put this down to high overclocking and restarted another check with a milder overclock that I use for stable gaming. I'm now getting patches of similar residues (see attachment). Is this normal, or am I wasting time/GPU watts?

Thanks for looking.
Attached Thumbnails: Capture.PNG (136.7 KB)
Old 2017-06-07, 21:23   #2
GP2
 
Sep 2003

5×11×47 Posts
Default

Quote:
Originally Posted by evoflash
Hello everyone, I'm starting out with CudaLucas, and ran a trial-factoring test to see if it was working OK. So far so good. However, when I ran an LL double-check I had a residue of 0x0000000000000002 and received an error message at 100%, which was confounding.

I put this down to high overclocking and restarted another check with a milder overclock that I use for stable gaming. I'm now getting patches of similar residues (see attachment). Is this normal, or am I wasting time/GPU watts?

Thanks for looking.
A residue of 0x0000000000000002 is going to remain at that value until the end of the test, so there's no point in continuing. You should abort the test.

Except yours didn't: it actually switched to something else before reverting to 0x0000000000000002 again. (!)

There is a whole lot of wrong happening here. You should get your computer checked out.
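Why a residue of 2 sticks: in the Lucas–Lehmer iteration s → s² − 2 (mod Mp), the value 2 is a fixed point, so once a hardware error drives the working value to 2, every later iteration leaves it there. A minimal plain-Python sketch (illustrative only; CUDALucas itself does FFT-based multiplication on the GPU):

```python
# Standard Lucas-Lehmer primality test for M_p = 2^p - 1, seed s0 = 4.
def lucas_lehmer(p):
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s  # final residue: 0 if and only if M_p is prime

# 2 is a fixed point of s -> s^2 - 2, so a state corrupted to 2 never recovers:
m = (1 << 11) - 1   # M_11 = 2047, composite; just an example modulus
s = 2
for _ in range(5):
    s = (s * s - 2) % m
print(s)  # still 2
```

This is why a single interim residue of 0x0000000000000002 is enough to know the rest of the run is wasted work.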
Old 2017-06-07, 23:39   #3
kladner
 
"Kieren"
Jul 2011
In My Own Galaxy!

10158₁₀ Posts
Default

What hardware are you running, and what frequencies for GPU and VRAM?
Old 2017-06-08, 07:00   #4
evoflash
 
Dec 2012

32₈ Posts
Default

Yeah, I suspected it wasn't looking good; thanks for the advice.

I'm running a GTX 970 at ~1490 MHz core clock and 3900 MHz VRAM. I'll kill the test, drop the overclock to base clocks, and try again.
Old 2017-06-08, 07:26   #5
VBCurtis
 
"Curtis"
Feb 2005
Riverside, CA

12FD₁₆ Posts
Default

Typical experience around here is that the memory clock is more correlated with errors; you may even have to underclock the memory frequency a bit to get GPGPU stability.

Don't be surprised if you find a stable setting that overclocks the processor a bit but runs memory at stock or a bit under.
Old 2017-06-08, 08:24   #6
evoflash
 
Dec 2012

32₈ Posts
Default

Great, yes, I'll do that. Thanks for the tips. I'll set it running tonight with underclocked memory and see where it gets to.
Old 2017-06-08, 15:27   #7
kladner
 
"Kieren"
Jul 2011
In My Own Galaxy!

2·3·1,693 Posts
Default

Quote:
Originally Posted by VBCurtis
Typical experience around here is that the memory clock is more correlated with errors; you may even have to underclock the memory frequency a bit to get GPGPU stability.

Don't be surprised if you find a stable setting that overclocks the processor a bit but runs memory at stock or a bit under.
+1
Old 2017-06-10, 11:53   #8
evoflash
 
Dec 2012

1A₁₆ Posts
Default

Quick update to say all looks good. Dropped the GPU clock to the base setting and bottomed out the memory clock; no errors so far. Thanks for the advice.
Attached Thumbnails: Capture.PNG (548.4 KB)
Old 2017-06-10, 13:05   #9
Mark Rose
 
"/X\(‘-‘)/X\"
Jan 2013

2²·733 Posts
Default

Quote:
Originally Posted by evoflash
Quick update to say all looks good. Dropped the GPU clock to the base setting and bottomed out the memory clock; no errors so far. Thanks for the advice.
You'll want to run your memory clock faster than that, but it's a matter of finding out how high you can go before you get failures.
Old 2017-06-10, 17:13   #10
Dubslow
Basketry That Evening!
 
"Bunslow the Bold"
Jun 2011
40<A<43 -89<O<-88

3·29·83 Posts
Default

Just to emphasize the point, many consumer-grade graphics cards are shipped with defective memory, i.e. memory that will cause CUDALucas errors at stock speed. This is because for *graphics* it doesn't matter if a bit or ten is wrong; it won't noticeably affect the on-screen rendered visuals that these cards are *assumed* to be used for.

CUDALucas, of course, like any LL testing software, is quite the opposite of error-tolerant, and so any single-bit failure *will* render the test completely useless.

Hence the need to sometimes underclock consumer graphics card memory. If you buy one of the professional cards, which are a lot more expensive, you can and should expect utterly flawless memory, since they are designed for computation, unlike consumer graphics cards.
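How fragile the test is can be shown with a toy experiment in plain Python (illustrative only; a real GPU run uses FFT arithmetic on much larger exponents): flip a single bit of the running value at one iteration and the final residue bears no relation to the correct one.

```python
def ll_residue(p, flip_at=None, bit=0):
    """Lucas-Lehmer iteration for M_p = 2^p - 1; optionally flip one bit
    of the state at iteration flip_at, mimicking a single memory error."""
    m = (1 << p) - 1
    s = 4
    for i in range(p - 2):
        s = (s * s - 2) % m
        if i == flip_at:
            s ^= 1 << bit   # the simulated single-bit error
    return s

good = ll_residue(127)                    # M_127 is prime, so this is 0
bad = ll_residue(127, flip_at=60, bit=3)  # one flipped bit mid-run
print(good, bad)  # the corrupted run ends at an unrelated nonzero residue
```

One flipped bit gets squared into every subsequent iteration, so there is no "mostly right" outcome: the run is either exact or worthless.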
Old 2017-06-11, 04:42   #11
LaurV
Romulan Interpreter
 
Jun 2011
Thailand

7×1,373 Posts
Default

Quote:
Originally Posted by Dubslow
Just to emphasize the point, many consumer-grade graphics cards are shipped with defective memory, i.e. memory that will cause CUDALucas errors at stock speed. This is because for *graphics* it doesn't matter if a bit or ten is wrong; it won't noticeably affect the on-screen rendered visuals that these cards are *assumed* to be used for.

CUDALucas, of course, like any LL testing software, is quite the opposite of error-tolerant, and so any single-bit failure *will* render the test completely useless.

Hence the need to sometimes underclock consumer graphics card memory. If you buy one of the professional cards, which are a lot more expensive, you can and should expect utterly flawless memory, since they are designed for computation, unlike consumer graphics cards.
+1. You just explained, very well and in fewer words, what I have been trying to say for a long time.