mersenneforum.org  

2019-11-20, 01:03   #1
xx005fs ("Eric", Jan 2018, USA, 2·97 posts)

TFLOPS madness

I noticed that the TFLOPS figure is abnormally high today (at 286,000), and it was due to Ryan submitting a bunch of ECM results. Is that normal, or did the server have some problems processing his results?

2019-11-20, 01:35   #2
Prime95 (P90 years forever!, Aug 2002, Yeehaw, FL, 2²·7·241 posts)

Quote:
Originally Posted by xx005fs
I noticed that the TFLOPS figure is abnormally high today (at 286,000), and it was due to Ryan submitting a bunch of ECM results. Is that normal, or did the server have some problems processing his results?
Normal. Using GMP-ECM on small Mersenne numbers gives inflated CPU credit.

Remember, CPU credit is given based on how long it would take prime95 to perform a task. GMP-ECM uses a superior stage 2 algorithm which gives much higher B2 bounds. This is similar to the inflated CPU credit for TF - GPUs are a better tool than prime95 on a CPU.
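
As a toy illustration (not PrimeNet's actual formula; the rates and the linear-in-B2 cost model below are made up purely to show the effect):

Code:
# Toy model only: credit is whatever time prime95 *would* have needed for the
# reported bounds, no matter which program actually did the work. The rates
# and the linear-in-B2 stage 2 cost are invented for illustration; the real
# PrimeNet credit formula is more involved.

def prime95_equivalent_effort(b1, b2, stage1_rate=1e7, stage2_rate=1e9):
    """Hypothetical prime95 cost in 'hours': stage 1 ~ B1, stage 2 ~ B2."""
    return b1 / stage1_rate + b2 / stage2_rate

b1 = 1_000_000
standard = prime95_equivalent_effort(b1, 100 * b1)      # typical prime95 B2
gmp_ecm = prime95_equivalent_effort(b1, 100_000 * b1)   # much larger GMP-ECM B2

print(f"standard-B2 credit: {standard:.1f}")  # ~0.2
print(f"GMP-ECM-B2 credit:  {gmp_ecm:.1f}")   # ~100.1, roughly 500x more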

2019-11-21, 04:49   #3
petrw1 ("Wayne", 1976 Toyota Corona years forever!, Nov 2006, Saskatchewan, Canada, 2×5×7×61 posts)

Ryan has found a few dozen ECM factors for small exponents over the past year or so.
I assume he has only recently made known exactly how many curves he ran for these.

The "Done" boxes are way more plentiful, especially for the lower exponents.

https://www.mersenne.org/report_ecm/...=1&ecm_hi=3000

2019-11-21, 15:33   #4
storm5510 (Random Account, Aug 2009, U.S.A., 2·7·79 posts)

Quote:
Originally Posted by Prime95
...Using GMP-ECM on small Mersenne numbers gives inflated CPU credit...
Has something changed with GMP-ECM? The last time I used it, the results were not in a format compatible with PrimeNet.

2019-11-21, 15:49   #5
Gordon (Nov 2008, 111110001₂ posts)

Quote:
Originally Posted by petrw1
Ryan has found a few dozen ECM factors for small exponents over the past year or so.
I assume he has only recently made known exactly how many curves he ran for these.

The "Done" boxes are way more plentiful, especially for the lower exponents.

https://www.mersenne.org/report_ecm/...=1&ecm_hi=3000
Ohh... guess I should give up on M3049 now... even though there haven't been all *that* many curves at B1 in the 850M-1B range, the large number of lower-bound curves puts the likelihood of a missed factor at 65 digits below 45%.

2019-11-21, 16:25   #6
axn (Jun 2003, 2³×3⁴×7 posts)

Quote:
Originally Posted by Prime95
Normal. Using GMP-ECM on small Mersenne numbers gives inflated CPU credit.

Remember, CPU credit is given based on how long it would take prime95 to perform a task. GMP-ECM uses a superior stage 2 algorithm which gives much higher B2 bounds.
Probably time to change the way credit is calculated. One way would be to calculate the equivalent curve count at the standard B2 = 100*B1 and grant the corresponding credit. IIRC, the default GMP-ECM B2 is roughly equivalent to 1.5x-3x as many curves at B2 = 100*B1 (depending on the B1 level).

This way, ECM done with GMP-ECM still receives higher credits, but not absurdly so.
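
A minimal sketch of that adjustment (the 2x factor below is just the middle of the 1.5x-3x range, and credit_per_standard_curve is a placeholder for whatever one prime95 curve at B2 = 100*B1 would earn; neither is an actual PrimeNet value):

Code:
# Sketch of the suggestion above: convert GMP-ECM curves into an equivalent
# number of "standard" curves (B2 = 100*B1) and grant credit for those.
# The 1.5x-3x range is the rough figure quoted above; the exact factor per
# B1 level and the per-curve credit are placeholders.

def equivalent_standard_curves(curves_run, factor=2.0):
    """Treat each default-B2 GMP-ECM curve as roughly `factor` standard curves."""
    return curves_run * factor

def adjusted_credit(curves_run, credit_per_standard_curve, factor=2.0):
    return equivalent_standard_curves(curves_run, factor) * credit_per_standard_curve

# Example: 300 GMP-ECM curves, with one standard curve worth 0.5 GHz-days
print(adjusted_credit(300, 0.5))  # 300.0 -- still more than 150, but not absurd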

2019-11-21, 18:26   #7
Gordon (Nov 2008, 7·71 posts)

Quote:
Originally Posted by axn
Probably time to change the way credit is calculated. One way would be to calculate the equivalent curve count at the standard B2 = 100*B1 and grant the corresponding credit. IIRC, the default GMP-ECM B2 is roughly equivalent to 1.5x-3x as many curves at B2 = 100*B1 (depending on the B1 level).

This way, ECM done with GMP-ECM still receives higher credits, but not absurdly so.
I typically use B2 = 100,000 x B1

2019-11-21, 20:18   #8
VBCurtis ("Curtis", Feb 2005, Riverside, CA, 4,001 posts)

Quote:
Originally Posted by Gordon
I typically use B2 = 100,000 x B1
Why? B2 should scale more like B1-squared than B1; your ratio is correct for one particular B1, but much too small for larger B1 values.

2019-11-21, 22:45   #9
Gordon (Nov 2008, 7×71 posts)

Quote:
Originally Posted by VBCurtis
Why? B2 should scale more like B1-squared than B1; your ratio is correct for one particular B1, but much too small for larger B1 values.
Too small?

I did one curve at each B1 level (Prime95 for stage 1), then ran gmp-ecm in verbose mode, timing each B2 going up in powers of 10 until I reached the point where the run times were roughly equal for stage 1 and stage 2.
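
In script form, that loop looks roughly like this (a sketch only: it assumes a standalone ecm binary on the PATH and its usual "Step 1 took ...ms" / "Step 2 took ...ms" output lines, and it leaves out the Prime95 stage 1 save-file hand-off, so each pass redoes stage 1):

Code:
# Rough sketch: for a fixed B1, keep multiplying B2 by 10 until GMP-ECM's
# stage 2 time roughly matches its stage 1 time.
import re
import subprocess

def step_times_ms(number: str, b1: int, b2: int):
    out = subprocess.run(
        ["ecm", "-v", str(b1), str(b2)],
        input=number, capture_output=True, text=True,
    ).stdout
    found = dict(re.findall(r"Step (\d) took (\d+)ms", out))
    return int(found.get("1", 0)), int(found.get("2", 0))

number = "2^3049-1"       # exponent discussed earlier in this thread
b1 = 260_000_000          # a single curve at this B1 takes a long while
b2 = 100 * b1
while True:
    s1, s2 = step_times_ms(number, b1, b2)
    print(f"B2 = {b2:.2e}: stage 1 {s1} ms, stage 2 {s2} ms")
    if s2 >= s1:          # stop once stage 2 costs about as much as stage 1
        break
    b2 *= 10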

Previous comments mentioned B2 = 100*B1...

...if I understand correctly, then for, say, B1 = 260M I should have B2 = 6.76e16?
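
(That figure is just B1 squared with no constant factor, i.e. taking "scale more like B1-squared" literally, which is surely an oversimplification; but the arithmetic checks out:)

Code:
b1 = 260e6
print(f"{b1 ** 2:.3g}")         # 6.76e+16, the B2 quoted above
print(f"{100_000 * b1:.3g}")    # 2.6e+13, the fixed-ratio B2 for comparison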

2019-11-22, 00:07   #10
VBCurtis ("Curtis", Feb 2005, Riverside, CA, 7641₈ posts)

I think we're both a bit mistaken; I had a ratio of 10,000 in mind (not the 100k you mentioned - I simply goofed when I glanced at the B2/B1 ratios in my logs and compared them to your claim) when I said your ratio is too small for large B1, but it's not too small until enormous B1 values. I'm also not familiar with your use of P95 for stage 1, and I don't have much experience finding the optimal B2/B1 ratio for runs that use P95 for stage 1 and GMP-ECM for stage 2.
Regardless, the ratio should depend on B1. B2 = 100000 * B1 is too big for small B1s, and too small for (quite) large B1s.
For instance, I'm presently running curves on a C251 at B1 = 6e9, and B2 = 1.8e15 yields a stage 2 time around 45% of the stage 1 time. If you want stage 2 time to equal stage 1 time, you'd need a ratio close to 1 million. Yet at B1 = 6e7, a B2 around 3e10 yields a stage 2 time somewhere near the stage 1 time; that's a ratio of just 5000.
So, if you determined B2 = 100000 * B1 is optimal for B1 = 260M, I believe you; but I suggest you not use that ratio for B1 = 1M!
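
Taking the two "stage 2 time roughly equals stage 1 time" ratios above at face value (about 5000 at B1 = 6e7 and about 1 million at B1 = 6e9; note the B2 = 3e10 figure actually works out to a ratio of 500, so this is very rough), a quick back-of-the-envelope fit shows the trend:

Code:
# Fit the "equal stage times" ratio B2/B1 as a power of B1 from the two data
# points quoted above. Illustrative only; real optimal bounds come from
# GMP-ECM's own tuning, not a two-point fit.
from math import log

b1_a, ratio_a = 6e7, 5e3   # "a ratio of just 5000"
b1_b, ratio_b = 6e9, 1e6   # "you'd need a ratio close to 1 million"

alpha = log(ratio_b / ratio_a) / log(b1_b / b1_a)
print(f"ratio grows like B1^{alpha:.2f}")       # about B1^1.15
print(f"so B2 grows like B1^{1 + alpha:.2f}")   # about B1^2.15, nearer B1^2 than B1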

Last fiddled with by VBCurtis on 2019-11-22 at 00:10 Reason: Fixed post to reflect Gordon's statement of P95 for stage 1, GMP stage 2