mersenneforum.org > Prime Search Projects > Conjectures 'R Us
Old 2014-12-27, 09:15   #320
gd_barnes
 

Quote:
Originally Posted by rebirther View Post
Reserving S185 to n=1M for BOINC
Be sure to make a note to stop this one if a prime is found; the tests will get extremely long.

This is the type of effort that benefits the most from BOINC.
Old 2014-12-27, 09:34   #321
Puzzle-Peter
 

Quote:
Originally Posted by gd_barnes View Post
This is the type of effort that benefits the most from BOINC.
Either this, or also stuff like R63 at n>25K, which no individual will take on alone.
Old 2014-12-28, 03:19   #322
gd_barnes
 

Discussion about Riesel base 3 suggestions, sieving, and testing moved to that thread.
Old 2015-01-17, 15:13   #323
Neo
 

Reserving R138 (8K's remaining)

Sieving for N 100-250K to 500e9

Neo
AtP
Old 2015-01-17, 23:13   #324
gd_barnes
 

Quote:
Originally Posted by Neo View Post
Reserving R138 (8K's remaining)

Sieving for N 100-250K to 500e9

Neo
AtP
Will you be testing or just sieving? You might want to consider a deeper sieve if testing.
Old 2015-01-18, 12:23   #325
Neo
 

Quote:
Originally Posted by gd_barnes View Post
Will you be testing or just sieving? You might want to consider a deeper sieve if testing.
Gary,

It's my intention to LLR.

I'm 84% done bringing the sieve to 500e9 and itching, just itching, to start LLR'ing. I'm at a 6-second-per-factor removal rate, at least according to sr2sieve. My hope was to hit a prime or two relatively early in the search and then continue sieving once tests started to get longer. I was thinking that sr2sieve would run faster with a few fewer k's to sieve, especially since I'm sieving 8 k's over a wide range of n (100-250K).

I will take your advice though. To what depth should I take it? I am only using one core to sieve; are there any drawbacks/warnings against using the "-t" switch to add some cores?

Neo
Old 2015-01-18, 12:50   #326
KEP

Quote:
Originally Posted by Neo View Post
Gary,

It's my intention to LLR.

I'm 84% done bringing the sieve to 500e9 and itching, just itching, to start LLR'ing. I'm at a 6-second-per-factor removal rate, at least according to sr2sieve. My hope was to hit a prime or two relatively early in the search and then continue sieving once tests started to get longer. I was thinking that sr2sieve would run faster with a few fewer k's to sieve, especially since I'm sieving 8 k's over a wide range of n (100-250K).

I will take your advice though. To what depth should I take it? I am only using one core to sieve; are there any drawbacks/warnings against using the "-t" switch to add some cores?

Neo
An optimal sieve depth, using 70% of the fully optimal depth, will be approximately:

950 seconds per test / 6 seconds per factor * 420G = 158.33 * 420G = 66.5T. You might want to time a test at n=250K for the highest k, but this will be very close to your optimal sieve depth (based on my experience-based assumption of how long a 1.23M-bit test takes).

If you choose in the future to do your own calculations, this is a pretty good way to calculate the optimal sieve depth.
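KEP's rule of thumb is easy to sanity-check in a few lines of Python. Note that the 420G figure is taken from KEP's post above (plausibly the ~84%-complete point of Neo's 500G sieve) and 950 s is his assumed test time, not a measured one:

```python
# Break-even rule of thumb: keep sieving while removing a candidate by
# sieving is cheaper than eliminating it with an LLR test.  Projected
# optimal depth = current depth * (test time / seconds per factor).
def optimal_sieve_depth(test_seconds, seconds_per_factor, current_depth):
    return test_seconds / seconds_per_factor * current_depth

G, T = 10**9, 10**12

depth = optimal_sieve_depth(950, 6, 420 * G)  # KEP's assumed figures
print(round(depth / T, 1))  # 66.5, i.e. 66.5T, matching the estimate above
```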

And yes, you're right: sr2sieve will run faster if a k is removed from the sieve file, but you should still sieve at least until the time per factor reaches the time an LLR test takes at n=100K, or you might not gain as much progress as you'd like.

In regards to -t, I honestly have no answer for you, since I never use the -t function, so someone else will have to chime in on this.

KEP
Old 2015-01-18, 13:24   #327
Puzzle-Peter
 

Quote:
Originally Posted by KEP View Post
In regards to -t, I honestly have no answer for you, since I never use the -t function, so someone else will have to chime in on this.
KEP
From my personal experience you get the maximum performance by adding cores "by hand" i.e. starting several instances of sr2sieve, each searching its own range. If you want to save on the manual labor, I found that up to -t4 the performance drawback is tolerable. For more cores I tend to divide the range I'm sieving. This has been evaluated quite some time back and might not be very accurate any more.
Old 2015-01-18, 15:35   #328
Neo
 

Quote:
Originally Posted by KEP View Post
An optimal sieve depth, using 70% of the fully optimal depth, will be approximately:

950 seconds per test / 6 seconds per factor * 420G = 158.33 * 420G = 66.5T. You might want to time a test at n=250K for the highest k, but this will be very close to your optimal sieve depth (based on my experience-based assumption of how long a 1.23M-bit test takes).
Thanks KEP and Puzzle Peter for your insights. :)

I ran an LLR on 372*138^100000-1 ... testing time was 210 seconds.
I ran an LLR on 1742*138^250000-1 ... testing time was 1,460 seconds.

So, once the sieve is finished at 500G, and using the above formula:
210 / 6 = 35, and 35 * 500e9 = 17.5T ???

Second question for you guys:

I'm almost done (97% to 500e9) on the sieve.
I've found 12,755 factors.

Is there a benefit to using srfile to remove the factored candidates (factors.txt) from the .abcd sieve file?
Will removing those 12,755 candidates speed up sr2sieve?

If so, what command line do I use to remove factored candidates from the abcd file while preserving the abcd format for further sieving?
(Edited) srfile -k --known-factors factors.txt ?

I thank you in advance for your assistance. There are tons of threads and messages dating back to 2009... it's hard to keep all this information in my brain, but I have honestly tried hard by re-reading all the sr README's, threads, etc. ;)

Last fiddled with by Neo on 2015-01-18 at 16:01
Old 2015-01-18, 16:23   #329
Lennart
 

srfile -k factors.txt sr_138.abcd -G

-G if you want a PRP file
-a if you want an abcd file
Lennart
Old 2015-01-18, 16:26   #330
KEP

Quote:
Originally Posted by Neo View Post
Thanks KEP and Puzzle Peter for your insights. :)

I ran an LLR on 372*138^100000-1 ... testing time was 210 seconds.
I ran an LLR on 1742*138^250000-1 ... testing time was 1,460 seconds.

So, once the sieve is finished at 500G, and using the above formula:
210 / 6 = 35, and 35 * 500e9 = 17.5T ???

Second question for you guys:

I'm almost done (97% to 500e9) on the sieve.
I've found 12,755 factors.

Is there a benefit to using srfile to remove the factored candidates (factors.txt) from the .abcd sieve file?
Will removing those 12,755 candidates speed up sr2sieve?

If so, what command line do I use to remove factored candidates from the abcd file while preserving the abcd format for further sieving?
(Edited) srfile -k --known-factors factors.txt ?

I thank you in advance for your assistance. There are tons of threads and messages dating back to 2009... it's hard to keep all this information in my brain, but I have honestly tried hard by re-reading all the sr README's, threads, etc. ;)
1. Your optimal sieve depth is correct for a 100% sieve at n=100K (but you can most likely stop at 12.25T for 70% of the optimal depth).

2. You will most definitely benefit from using "srfile -k factors.txt srsieve.out" and then "srfile -a srsieve.out", since removing factored candidates speeds up the sieving. For CRUS and other prime-searching projects there is no point in continuing to find factors for candidates already proven composite by the sieve; we don't need factors, only primes.

My own addition:

The optimal sieve depth for the entire range n>100K to n<=250K is 1460 / 6 * 500G = 243.33 * 500G = 121.66T (85.17T at 70%).

Please note that 70% is in many instances a desirable sieve depth, and because candidates for primed k's get removed along the way, it can to some extent be justified as the optimal sieve depth for the kind of searching CRUS does.
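Plugging Neo's two measured test times into the same break-even formula reproduces all of the figures in this exchange. A quick check in Python (the 70% discount is KEP's rule of thumb, not a measured constant):

```python
G, T = 10**9, 10**12

def optimal_depth(test_seconds, sec_per_factor, current_depth):
    # Sieve until removing a candidate costs as much as LLR-testing it:
    # scale the current depth by (test time / seconds per factor).
    return test_seconds / sec_per_factor * current_depth

low = optimal_depth(210, 6, 500 * G)    # measured n=100K test time
high = optimal_depth(1460, 6, 500 * G)  # measured n=250K test time

print(low / T)                    # 17.5   -> Neo's figure
print(round(high / T, 2))         # 121.67 -> KEP's whole-range figure
print(round(0.70 * low / T, 2))   # 12.25  -> KEP's 70% figure for n=100K
print(round(0.70 * high / T, 2))  # 85.17  -> KEP's 70% whole-range figure
```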