2014-12-27, 09:15  #320 
May 2007
Kansas; USA
2×5^{2}×7×29 Posts 

2014-12-27, 09:34  #321 
Jun 2009
3·223 Posts 

2014-12-28, 03:19  #322 
May 2007
Kansas; USA
2×5^{2}×7×29 Posts 
Discussion about Riesel base 3 suggestions, sieving, and testing moved to that thread.

2015-01-17, 15:13  #323 
Dec 2010
Ava, Missouri
28_{16} Posts 
Reserving R138 (8 k's remaining)
Sieving for n = 100K-250K to 500e9. Neo AtP 
2015-01-17, 23:13  #324 
May 2007
Kansas; USA
2·5^{2}·7·29 Posts 

2015-01-18, 12:23  #325  
Dec 2010
Ava, Missouri
2^{3}×5 Posts 
Quote:
It's my intention to LLR. I'm 84% done bringing the sieve to 500e9 and itching, just itching, to start LLR'ing. I'm at a 6 seconds/factor removal rate; at least that's what sr2sieve is reporting. My hope was to hit a prime or two relatively early in the search and then continue sieving once tests started to get longer. I was thinking that sr2sieve would run faster if it had a few fewer k's to sieve for, especially because I'm sieving 8 k's over a wide range of n (100K-250K). I will take your advice, though. To what depth should I take it? I am only using one core to sieve; are there any drawbacks/warnings against using the "-t" switch to add some cores? Neo 

2015-01-18, 12:50  #326  
Quasi Admin Thing
May 2005
393_{16} Posts 
Quote:
950 seconds per test / 6 seconds per factor * 420G = 158.33 * 420G = 66.5T. You might want to check the testing time for n=250K for the highest k; however, this will be very close to your optimal sieve depth (based on my experience-based assumptions about how long a 1.23M-bit test takes). If you choose in the future to do your own calculations of optimal sieve depth, this is a pretty good way to calculate it. And yes, you're right, sr2sieve will run faster if a k is removed from the sieve file, but you still have to sieve until you at least hit the minimum time an LLR test takes at n=100K, else you might not gain as much progress as you desire. In regards to -t, I honestly have no answer for you, since I never use the -t function, so someone else has to chime in on this. KEP 
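KEP's rule of thumb (sieve until removing one more factor costs about as much as one LLR test, i.e. optimal depth ≈ test time ÷ seconds per factor × current depth) can be sketched as a quick calculation. The 950 s test time, 6 s/factor rate, and 420G depth are the figures from this post; the variable names are just illustrative:

```python
# Rule of thumb from the post above: stop sieving once removing a factor
# takes about as long as an LLR test of one candidate.
test_seconds = 950        # LLR time for a ~1.23M-bit test (figure from the post)
secs_per_factor = 6       # removal rate reported by sr2sieve
current_depth = 420e9     # sieve depth already reached, p = 420G

optimal_depth = test_seconds / secs_per_factor * current_depth
print(f"optimal sieve depth ~ {optimal_depth / 1e12:.1f}T")  # -> 66.5T
```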

2015-01-18, 13:24  #327 
Jun 2009
3×223 Posts 
From my personal experience you get the maximum performance by adding cores "by hand", i.e. starting several instances of sr2sieve, each searching its own range. If you want to save on the manual labor, I found that up to -t 4 the performance drawback is tolerable. For more cores I tend to divide the range I'm sieving. This was evaluated quite some time back and might not be very accurate any more.
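The "one instance per core" approach described above amounts to splitting the p-range into equal chunks. A small sketch that prints one command line per chunk; note the -i/-p/-P switches are assumed from the srsieve family's usual conventions and the input file name is made up, so check your own sr2sieve's help output:

```python
# Split a sieve range into equal per-core chunks and print one hypothetical
# sr2sieve command line per chunk (each instance sieves its range independently).
pmin, pmax, cores = 420 * 10**9, 500 * 10**9, 4
step = (pmax - pmin) // cores
for i in range(cores):
    lo = pmin + i * step
    hi = pmax if i == cores - 1 else lo + step  # last chunk absorbs any remainder
    print(f"sr2sieve -i sr_138.abcd -p {lo} -P {hi}")
```

Running each instance in its own working directory avoids the instances overwriting each other's factor and checkpoint files.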

2015-01-18, 15:35  #328  
Dec 2010
Ava, Missouri
2^{3}×5 Posts 
Quote:
I ran an LLR on 372*138^100000-1 ... testing time was 210 seconds. I ran an LLR on 1742*138^250000-1 ... testing time was 1,460 seconds. So, once the sieve is finished at 500G, and using the above formula: 210 / 6 = 35; 35 * 500,000,000,000 = 17.5T??? Second question for you guys: I'm almost done (97% to 500e9) on the sieve. I've found 12,755 factors. Is there a benefit to using srfile to remove the factored candidates (factors.txt) from the .abcd sieve file? Will the removal of the 12,755 candidates speed up sr2sieve? If so, what command line do I use to remove factored candidates from the abcd file while preserving the abcd file for further sieving? (Edited) srfile -k knownfactors factors.txt? I thank you in advance for your assistance. There are tons of threads and messages dating back to 2009... it's hard to keep all this information in my brain, but I have honestly tried hard by rereading all the sr README's, threads, etc. ;) Last fiddled with by Neo on 2015-01-18 at 16:01 

2015-01-18, 16:23  #329 
"Lennart"
Jun 2007
10001100000_{2} Posts 
srfile -k factors.txt sr_138.abcd -G
-G if you like to have a prp file, -a if you like to have an abcd file. Lennart 
2015-01-18, 16:26  #330  
Quasi Admin Thing
May 2005
3×5×61 Posts 
Quote:
2. You will most definitely benefit from using "srfile -k factors.txt srsieve.out" and then "srfile -a srsieve.out", since removing factors will speed up your sieving. There is no use, for CRUS and other prime-searching projects, in continuing to find factors for candidates already proven composite as a result of sieving, since we don't need factors, only primes. My own addition: optimal sieve depth for the entire range n>100K to n<=250K is 1460 / 6 * 500G = 243.33 * 500G = 121.66T (85.17T for 70% sieve depth). Please notice that 70% is in many instances a desired sieve depth, and due to the removal of candidates from primed k's it can to some extent be justified as the optimal sieve depth for the kind of searching that CRUS does 
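The two depth figures above follow from the same rule of thumb, this time using the LLR time at the top of the n-range rather than Neo's n=100K figure. A sketch with the thread's numbers (1460 s per test, 6 s per factor, 500G sieve; the 70% fraction is KEP's suggestion):

```python
# Same depth formula, applied at n = 250K (the top of the range being sieved).
test_seconds = 1460       # LLR time at n = 250K (figure from the thread)
secs_per_factor = 6       # removal rate reported by sr2sieve
current_depth = 500e9     # planned sieve depth, p = 500G

optimal = test_seconds / secs_per_factor * current_depth   # ~121.7T
practical = 0.70 * optimal                                 # ~85.2T, KEP's 70% depth
print(f"optimal ~ {optimal / 1e12:.2f}T, 70% ~ {practical / 1e12:.2f}T")
```

Using the slowest test in the range makes this an upper bound; candidates removed when a k finds a prime are part of why a lower fraction like 70% can still be near-optimal in practice.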

Similar Threads  
Thread  Thread Starter  Forum  Replies  Last Post 
Bases 33-100 reservations/statuses/primes  Siemelink  Conjectures 'R Us  1663  2020-07-06 20:15 
Bases 251-500 reservations/statuses/primes  gd_barnes  Conjectures 'R Us  2111  2020-07-06 12:06 
Bases 501-1030 reservations/statuses/primes  KEP  Conjectures 'R Us  3683  2020-07-05 08:12 
Riesel base 3 reservations/statuses/primes  KEP  Conjectures 'R Us  1039  2020-06-20 16:35 
Bases 4-32 reservations/statuses/primes  gd_barnes  Conjectures 'R Us  1405  2020-04-04 00:24 