2021-06-09, 22:28  #542
6809 > 6502
"""""""""""""""""""
Aug 2003
101×103 Posts
5×7×281 Posts 
List has been updated with a few new exponents.

2021-06-11, 19:32  #543
6809 > 6502
"""""""""""""""""""
Aug 2003
101×103 Posts
266B_{16} Posts 
A new quad check of a 332M exponent is needed (or a PRP on it). 1 LL is suspect, the other 2 are not.
List is up to date. 
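Incidentally, the "Posts" counts in the sidebars above (5×7×281, 266B₁₆, 23153₈, and so on) are the same count rendered in different bases each time, a long-standing forum habit. A quick sketch to confirm the renderings agree:

```python
# The same post count written four ways: factored decimal,
# hexadecimal, octal, and binary (as seen in the sidebars).
representations = [
    5 * 7 * 281,               # factored decimal
    int("266B", 16),           # hexadecimal
    int("23153", 8),           # octal
    int("10011001101011", 2),  # binary
]

# All four renderings denote one and the same number.
assert len(set(representations)) == 1
print(representations[0])  # 9835
```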
2021-06-11, 20:30  #544
"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest
61·89 Posts 
FWIW (not much), the zero-shift LL run on M332250197 had no Jacobi check errors indicated in the gpuowl.log. I'd like to see a matching LL on this one to help better define the 100M-digit LL error rate. But it is a substantial effort at ~4950 GHD.
Took, Cat 1:
DoubleCheck=57106099,74,1
DoubleCheck=57974659,74,1
ETA within a week barring system or power issues.

Last fiddled with by kriesel on 2021-06-11 at 20:31
2021-06-11, 21:18  #545
6809 > 6502
"""""""""""""""""""
Aug 2003
101×103 Posts
23153_{8} Posts 
Quote:
[When I looked up the details on this machine on PrimeNet to post it, I found out it (and another borg) had 8 cores vs. the 6 I thought. I changed the settings and am now employing all 8 of those.]

Last fiddled with by Uncwilly on 2021-06-11 at 21:27

2021-06-11, 22:06  #546
"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest
61·89 Posts 
30 TF from 78 to 81 bits at the 100M-digit level? Or 30 P-1 to adequate bounds?

Last fiddled with by kriesel on 2021-06-11 at 22:28
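For background on the TF option mentioned above: trial factoring a Mersenne number M(p) = 2^p − 1 only needs to try candidate divisors of the form q = 2kp + 1 with q ≡ ±1 (mod 8), and a candidate divides M(p) exactly when 2^p ≡ 1 (mod q). A minimal sketch of the idea (the function name is mine, and it is demonstrated on tiny M11 rather than a 100M-digit exponent; real TF software also sieves candidates and runs on GPUs):

```python
def tf_mersenne(p, max_bits):
    """Trial-factor M(p) = 2^p - 1 by candidates up to max_bits bits.

    Candidate factors must have the form q = 2*k*p + 1 and satisfy
    q % 8 in (1, 7); q divides M(p) iff pow(2, p, q) == 1.
    """
    k = 1
    while True:
        q = 2 * k * p + 1
        if q.bit_length() > max_bits:
            return None  # no factor below 2^max_bits
        if q % 8 in (1, 7) and pow(2, p, q) == 1:
            return q
        k += 1

# M11 = 2047 = 23 * 89; the smallest factor 23 = 2*1*11 + 1 is found at k=1.
print(tf_mersenne(11, 20))  # 23
```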
2021-06-11, 22:46  #547
6809 > 6502
"""""""""""""""""""
Aug 2003
101×103 Posts
5·7·281 Posts 
Quote:
https://www.mersenne.org/report_fact...=1&tftobits=81 

2021-06-11, 23:46  #548
"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest
61×89 Posts 
Quote:
I've reserved these 31 in P-1 and will also TF the survivors as below.
Code:
Factor=332309569,78,81
Factor=332327231,78,81
Factor=332327243,78,81
Factor=332327371,78,81
Factor=332327581,78,81
Factor=332327629,78,81
Factor=332327659,78,81
Factor=332327683,78,81
Factor=332327729,78,81
Factor=332327833,78,81
Factor=332327899,78,81
Factor=332328103,78,81
Factor=332328287,78,81
Factor=332328439,78,81
Factor=332328527,78,81
Factor=332328539,78,81
Factor=332328547,78,81
Factor=332328551,78,81
Factor=332329031,78,81
Factor=332329391,78,81
Factor=332329493,78,81
Factor=332330993,78,81
Factor=332331001,78,81
Factor=332331059,78,81
Factor=332331061,78,81
Factor=332331127,78,81
Factor=332331161,78,81
Factor=332331217,78,81
Factor=332331331,78,81
Factor=332331353,78,81
Factor=332331361,78,81
Last fiddled with by kriesel on 2021-06-11 at 23:53
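The reserved lines above use the worktodo entry shape `Factor=exponent,from_bits,to_bits` (here, TF each exponent from 2^78 to 2^81). A minimal parsing sketch, assuming only that three-field form shown in the post (the helper name is mine):

```python
def parse_factor_line(line):
    """Parse a worktodo 'Factor=exponent,from_bits,to_bits' entry
    into an (exponent, from_bits, to_bits) tuple of ints."""
    key, _, fields = line.partition("=")
    if key.strip() != "Factor":
        raise ValueError("not a Factor= entry: " + line)
    exponent, from_bits, to_bits = (int(f) for f in fields.split(","))
    return exponent, from_bits, to_bits

exp, lo, hi = parse_factor_line("Factor=332309569,78,81")
print(exp, lo, hi)  # 332309569 78 81
```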

2021-06-12, 00:13  #549
6809 > 6502
"""""""""""""""""""
Aug 2003
101×103 Posts
10011001101011_{2} Posts 
Cool. I expect the estimate of when the LL gets done will be better by Tuesday. This is a relatively new borg, so the rolling average is not settled yet. I moved the Cat 1 DCs it was doing to another machine.

2021-06-14, 13:39  #550
"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest
61·89 Posts 
Took, Cat 1:
DoubleCheck=57826103,74,1
DoubleCheck=57836777,74,1
The estimated run time for the 100M-digit P-1 batch of 31 is ~11 days in gpuowl on a Radeon VII, and one factor has already been found (not yet reported). TF on a GTX 1080 Ti is projected at about a month, so it will be split with another GPU.

Last fiddled with by kriesel on 2021-06-14 at 13:46