20210306, 12:40  #12 
"Tucker Kao"
Jan 2020
Head Base M168202123
1412_{8} Posts 

20210306, 14:50  #13 
"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest
2·19·181 Posts 
TF depth decisions in GIMPS are properly based on the relative performance of TF, P-1, and PRP on the device doing the TF, which should, whenever practical, be a GPU with high TF performance relative to the other computation types.
100M exponent TF from 77 to 78 bits: 306.08 GHD https://www.mersenne.ca/credit.php?exponent=100000000&frombits=77&tobits=78&worktype=TF
100M exponent PRP or LL: 381.39 GHD https://www.mersenne.ca/credit.php?e...78&worktype=LL

Radeon VII: TF 1113 GHD/day, 3.712 GHD/day/W, 300 W TDP https://www.mersenne.ca/mfaktc.php
PRP 296 GHD/day, 1.053 GHD/day/W, 300 W TDP https://www.mersenne.ca/cudalucas.php
Ratio TF/LL per day = 1113/296 = 3.76; log2(3.76) ~ +2 bits more TF depth than for CPU factoring.
306.08/1113 = 0.275 days = 6.6 hours estimated to TF a 100M exponent from 77 to 78 bits.
381.39/296 = 1.29 days estimated for an LL test on this GPU.
This GPU should be used on PRP/GEC/proof or P-1 with a recent version of gpuowl, not on TF.

GTX 780: TF 344 GHD/day, 1.377 GHD/day/W, 250 W TDP https://www.mersenne.ca/mfaktc.php
LL 38.7 GHD/day, 0.166 GHD/day/W, 250 W TDP https://www.mersenne.ca/cudalucas.php
Ratio TF/LL per day = 344/38.7 = 8.89; log2(8.89) ~ +3 bits more TF depth than for CPU factoring.
306.08/344 = 0.89 days = 21.35 hours estimated to TF from 77 to 78 bits.
381.39/38.7 = 9.85 days estimated for an LL test on this GPU.
PRP/GEC/proof with a recent gpuowl version may be a better use of this GPU than TF. The low TF and LL GHD/day/W figures make this ~8-year-old GPU a candidate for replacement or nonuse on GIMPS. https://www.techpowerup.com/gpuspec...gtx780.c1701
Perhaps replace it with a GTX 1650 (75 W TDP) and save a lot on power costs; it has a similar TF/LL ratio to the RTX 2080. At $0.12/kWh, running 24/7, each watt of load reduction returns about $1/year, paying for the upgrade in a bit over a year, a very good return on investment.
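A minimal sketch of the payback arithmetic above (the $0.12/kWh rate, 24/7 duty, and the GTX 780 to GTX 1650 swap are the post's stated assumptions):

```python
# Rough ROI arithmetic for the GPU-swap suggestion, assuming $0.12/kWh
# and 24/7 operation as stated in the post.
rate = 0.12                             # $ per kWh
per_watt_year = rate / 1000 * 24 * 365  # $ per watt of sustained load per year
print(f"${per_watt_year:.2f}/W/year")   # about $1.05

watts_saved = 250 - 75                  # GTX 780 TDP minus GTX 1650 TDP
print(f"~${per_watt_year * watts_saved:.0f}/year saved")
```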
GTX 1070: TF 747.6 GHD/day, 4.88 GHD/day/W, 150 W TDP
LL 47.8 GHD/day, 0.34 GHD/day/W, 150 W TDP
Ratio TF/LL per day = 747.6/47.8 = 15.64; log2(15.64) ~ +4 bits more TF depth than for CPU factoring.
This GPU probably should be used on TF, not P-1 or primality testing.

RTX 2080: TF 2623.5 GHD/day, 12.202 GHD/day/W, 215 W TDP
LL 60.5 GHD/day, 0.301 GHD/day/W, 215 W TDP
Ratio TF/LL per day = 2623.5/60.5 = 43.36; log2(43.36) ~ +5.4 bits more TF depth than for CPU factoring.
306.08/2623.5 = 0.117 days (2.8 hours) estimated to TF a 100M exponent from 77 to 78 bits.
381.39/60.5 = 6.3 days estimated for an LL test.
This GPU should be used on TF, not P-1 or primality testing.

The LL performance above and at mersenne.ca reflects CUDALucas, clLucas, and very early gpuowl. Recent gpuowl versions run PRP faster, which shifts the tradeoff point toward less TF depth than the above, by almost one bit level. (In the case of the Radeon VII, log2(525/296) ~ 0.83 bits lower.)

Last fiddled with by kriesel on 20210306 at 14:51
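The per-GPU tradeoff can be sketched in a few lines; the GHD figures are the ones quoted from mersenne.ca in the post above, and 525 GHD/day is the post's figure for the Radeon VII's PRP rate under recent gpuowl:

```python
import math

TF_77_78_GHD = 306.08  # credit (GHD) for TF of a 100M exponent from 77 to 78 bits
LL_GHD = 381.39        # credit (GHD) for one LL/PRP test of a 100M exponent

# name: (TF GHD/day, LL GHD/day), values quoted from mersenne.ca in the post
gpus = {
    "Radeon VII": (1113.0, 296.0),
    "GTX 780":    (344.0,  38.7),
    "GTX 1070":   (747.6,  47.8),
    "RTX 2080":   (2623.5, 60.5),
}

# Extra TF depth (bits) each GPU justifies relative to CPU-based factoring
extra_bits = {name: math.log2(tf / ll) for name, (tf, ll) in gpus.items()}

for name, (tf, ll) in gpus.items():
    print(f"{name}: ~+{extra_bits[name]:.1f} bits; "
          f"TF 77-78 in {TF_77_78_GHD / tf * 24:.1f} h; LL in {LL_GHD / ll:.1f} d")

# Faster PRP in recent gpuowl shifts the tradeoff toward less TF depth;
# e.g. Radeon VII at the quoted 525 GHD/day PRP rate:
shift = math.log2(525 / 296)
print(f"tradeoff shift: -{shift:.2f} bits")
```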
20210306, 15:20  #14  
Jun 2003
5,407 Posts 
Quote:
Last fiddled with by axn on 20210306 at 15:21 

20210308, 11:04  #15  
"Tucker Kao"
Jan 2020
Head Base M168202123
2×389 Posts 
Quote:
M103,353,143 https://www.mersenne.org/report_expo...exp_hi=&full=1
M103,378,153 https://www.mersenne.org/report_expo...exp_hi=&full=1
M103,427,143 https://www.mersenne.org/report_expo...exp_hi=&full=1
M103,674,113 https://www.mersenne.org/report_expo...exp_hi=&full=1
M103,737,103, M103,737,143, and M103,737,173 https://www.mersenne.org/report_expo...3737173&full=1

Last fiddled with by tuckerkao on 20210308 at 11:09

20210308, 11:13  #16  
"Viliam Furík"
Jul 2018
Martin, Slovakia
2^{2}×3×5×13 Posts 
Quote:
He is willing to run PRP tests for GIMPS. Not for you. 

20210308, 11:15  #17 
"Tucker Kao"
Jan 2020
Head Base M168202123
2×389 Posts 
Ben Delo skipped M103,737,157 and M103,737,167; this cannot be a coincidence. As far as I understand, he goes in numerical order rather than jumping around in the normal situation, especially when these 2 exponents are not assigned to other people.
Okay, I finished the P-1 factoring for M103,358,341 about 6 minutes ago, and Ben Delo picked it up just seconds before I edited this comment; he didn't take any other exponents between M103,358,000 and M103,359,000. https://www.mersenne.org/report_expo...exp_hi=&full=1
If these were random server assignments, he would be on numerous exponents across the several-thousand-exponent blocks, not only on my guesses. Unless you are telling me that I have invisible fingers which can magically type those exponents into Ben Delo's computers through the Space Impaler, thousands of miles away.

Last fiddled with by tuckerkao on 20210308 at 11:57
20210308, 14:05  #18 
"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest
2·19·181 Posts 
https://en.wikipedia.org/wiki/Grandiose_delusions
https://rationalwiki.org/wiki/Don%27t_feed_the_Troll

Ben Delo and many others run mprime or prime95 via a PrimeNet connection. It's likely Ben does not even see, much less select, his many, many assignments. Billionaires have far more enjoyable ways to spend their time. For throughput on the order of Ben Delo's or curtisc's level, lots of scripting runs the show.
20210308, 15:08  #19  
"Viliam Furík"
Jul 2018
Martin, Slovakia
2^{2}·3·5·13 Posts 
Quote:
He gets what PrimeNet gives him. Please correct your understanding.

20210308, 15:14  #20 
"Alexander"
Nov 2008
The Alamo City
839_{10} Posts 
Unless any GIMPS users happen to know you personally, the only contributors to the project who know anything about you are the unfortunate forumites who have to sit here and watch you bloviate about your supposed hold on our top crunchers, who, needless to say, fall into neither category.

Nobody in GIMPS is your personal PRP tester. We contribute because we want to, and we help others (when we do) because we want to. Ben Delo just does a ton of PRP tests, which happen to include your exponents. He has no idea he's doing your exponents; he doesn't know who you are. He's just doing a huge service to GIMPS and the mathematical community.

To PrimeNet, you are just another user (don't get me wrong, every useful GHz-day helps), and the fact that you did part of the TF has no bearing on the future assignments of those exponents.
Last fiddled with by Happy5214 on 20210308 at 15:24 Reason: Expand 
20210308, 21:53  #21  
"Tucker Kao"
Jan 2020
Head Base M168202123
2×389 Posts 
Quote:
https://www.mersenne.org/report_expo...exp_hi=&full=1

I'll try to finish the P-1 factoring for more of my exponent guesses, so the server will assign them for more billionaire PRP tests; I can't miss these opportunities.

Last fiddled with by tuckerkao on 20210308 at 22:09

20210308, 22:08  #22  
"Viliam Furík"
Jul 2018
Martin, Slovakia
2^{2}·3·5·13 Posts 
Quote:

