mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Conjectures 'R Us (https://www.mersenneforum.org/forumdisplay.php?f=81)
-   -   Sieving Drive all base 2/4 k's worked by CRUS (https://www.mersenneforum.org/showthread.php?t=18608)

gd_barnes 2013-09-20 01:20

Sieving Drive all base 2/4 k's worked by CRUS
 
[COLOR=black][FONT=Verdana]This is a sieving drive for all 16 k's for both sides of bases 2 & 4 that are currently being worked on by CRUS. Included in the drive are 11 k's for base 2 even & odd n, 4 k's for base 2 even k, and 1 k for base 4. (The remainder of base 4 k's are effectively included in base 2 even/odd n or are being worked by other projects.) I have spent the last several weeks sieving all 17 k's for the entire range of n=1M to 16777216 to P=100T as an extension to Jean Penne's sieve for the 12 k's for base 2 even/odd n and as a starting point for this effort.[/FONT][/COLOR]

The file contains 3 k's that have effectively already been sieved to a great depth, but for a smaller n-range, by PrimeGrid. I am leaving them in because we are sieving a much wider n-range and the cost of keeping them is minimal. The file also contains all of n=1M-2M, which we have already tested. I am leaving that range in because we will be doing doublechecks in the future and the cost of keeping it is minimal.

As we progress, we will determine testing break-off points based on sieving and testing speed. Testing will be done in a new team drive. This will allow us to at least somewhat bridge the gap between our base 2/4 search depth and other projects' depth.

Like before, sr2sieve is what we will use. It is recommended that you run 64-bit sr2sieve on a 64-bit machine. Let us know if you need the executable or more detailed instructions on using it. Here is an example of the command to execute at the command prompt:

sr2sieve -p 115e12 -P 125e12 -i sieve-riesel-sierp-even-odd-k-n.txt

The above would be for sieving P=115T-125T. The file name follows the "-i" switch and is the actual file name posted in the link below. Feel free to rename it to something shorter if you want, or use the older "srwork" convention, where you don't have to specify a file name.

When complete, you should have a factors.txt file. Just post the file here in this thread, or if it is too big, please email it to me at:
gbarnes017 at gmail dot com

A P=1T range should take ~5-6 CPU days on a modern 64-bit machine. Please reserve ranges in multiples of P=1T and plan to reserve no more than ~10 days of work at a time. When making reservations, please post your estimated completion date. This can be seen in sr2sieve about one minute after you start your sieve.
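
To sanity-check an ETA before reserving, the arithmetic is just range size divided by sieve rate. A minimal Python sketch, assuming the ~2.2M p/sec per-core rate quoted elsewhere in this thread (substitute your own observed rate from sr2sieve's output):

```python
# Estimate CPU days needed to sieve a P-range at a given sr2sieve rate.
# The 2.2e6 p/sec figure is an assumption taken from the per-core rate
# quoted later in this thread; your hardware will differ.

def sieve_days(p_min, p_max, rate_p_per_sec):
    """Days of CPU time to sieve the range [p_min, p_max)."""
    return (p_max - p_min) / rate_p_per_sec / 86400  # 86400 sec/day

# A P=1T range at ~2.2M p/sec works out to roughly 5 CPU days,
# matching the estimate in this post.
days = sieve_days(115e12, 116e12, 2.2e6)
```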

Here is a link to the latest sieve file:
[URL]http://www.noprimeleftbehind.net/crus/sieve-riesel-sierp-even-odd-k-n.zip[/URL]

All factors up to P=760T have been removed. We will remove additional factors as the drive progresses to slightly speed up sieving.

Reservations:
[code]
P-range     reserved by   status     est. completion date
100T-119T   Lennart       complete
119T-129T   KEP           complete
129T-140T   Lennart       complete
140T-144T   KEP           complete
144T-178T   Lennart       complete
178T-188T   KEP           complete
188T-410T   Lennart       complete
410T-415T   TheCount      complete
415T-420T   Lennart       complete
420T-507T   gd_barnes     complete
507T-580T   Lennart       complete
580T-582T   mdettweiler   complete
582T-760T   Lennart       complete
[/code]All help is greatly appreciated as we attempt to bring our bases 2 & 4 up to a level near other projects' search depth! :smile:


Thank you,
Gary

c10ck3r 2013-09-20 05:06

[STRIKE]Yea, so, stupid question time...why are "we" (not myself, the 5T range would take almost a month on my laptop) including such small n values in this sieve. I have it on good authority (read: I ran LLR) that there are no primes for any of the n values below 25k, and likely much higher than that. Is there any merit for including these in a sieve, or could they be, say, removed by concurrent LLR testing?
[/STRIKE]

Someone should really teach me how to read.

[STRIKE]Sidebar: [/STRIKE] New main point: For anyone that's wondering on quick stats before they try it, running sr2sieve on 1 core of my i5-2450M @ 2.50 GHz laptop, the 5T range from 195-200T would take until Oct 17. There are abt. 1.3M candidates in the sieve file, and the above range has ~1063 expected factors.

gd_barnes 2013-09-20 06:31

[QUOTE=c10ck3r;353533][STRIKE]Yea, so, stupid question time...why are "we" (not myself, the 5T range would take almost a month on my laptop) including such small n values in this sieve. I have it on good authority (read: I ran LLR) that there are no primes for any of the n values below 25k, and likely much higher than that. Is there any merit for including these in a sieve, or could they be, say, removed by concurrent LLR testing?
[/STRIKE]

Someone should really teach me how to read.

[STRIKE]Sidebar: [/STRIKE] New main point: For anyone that's wondering on quick stats before they try it, running sr2sieve on 1 core of my i5-2450M @ 2.50 GHz laptop, the 5T range from 195-200T would take until Oct 17. There are abt. 1.3M candidates in the sieve file, and the above range has ~1063 expected factors.[/QUOTE]

I would recommend at least 4 cores for sieving 5T on this. It took me a little over 3 weeks on 24 cores (2.4-2.6 GHz) to sieve 100T (70+ CPU weeks). Yep, it's a HUGE effort but it will pay off in the long run. On my one more modern 2.9 GHz machine, it was processing about P=2.2M per second. That's about P=190G per day, or about P=1T in 5 days. So you can figure on ~25 CPU days to sieve P=5T. 4 cores could do it in < 1 week.


Gary

Lennart 2013-09-20 18:20

Reserving 100T-110T

Lennart

Lennart 2013-09-20 19:40

Reserving 110T-113T

Lennart

Lennart 2013-09-20 19:55

Reserving 113T-119T

Lennart

gd_barnes 2013-09-20 20:43

After some thinking about c10ck3r's comments, I've decided that it's OK to make reservations as small as P=1T to allow our smaller searchers to participate in the sieving. I'd still like to ask that reservations be completed within ~10 days so that we can periodically remove a contiguous set of factors from the file for future sieving efficiency.

By my calculations, a P=1T range should take ~5 days on one core of a modern machine.

Lennart 2013-09-25 01:11

1 Attachment(s)
110T-119T complete

Lennart

Lennart 2013-09-30 02:18

1 Attachment(s)
100T-110T complete

Sorry I missed those.



Lennart

gd_barnes 2013-09-30 04:58

Thanks Lennart. A new file has been posted with all factors up to P=119T removed.

KEP 2013-10-05 16:31

Reserving:

119T-121T

ETA is ~9.5 Days

KEP 2013-10-13 18:57

Range complete
 
1 Attachment(s)
p=119T to p=121T is complete.

677 factors were found

For n<=2M I found 77 factors
For 2M < n <= 5M I found 128 factors
For n>5M I found 472 factors

The total time spent on sieving is:

2,000,000,000,000/1,080,000 p/sec = 1,851,851 seconds

For n<=2M I had an average time of 24,050.012 sec/factor
For 2M < n <= 5M I had an average time of 14,467.586 sec/factor
For n>5M I had an average time of 3,923.413 sec/factor
For the entire range I had an average time of 2,735.378 sec/factor
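
The sec/factor figures above are just total sieving time divided by the factor counts. A quick sketch reproducing the arithmetic with the numbers from this post:

```python
# Reproduce KEP's sec/factor arithmetic for the P=119T-121T range.
range_size = 2_000_000_000_000    # P=2T of sieving
rate = 1_080_000                  # p/sec, KEP's reported sieve rate
total_secs = range_size / rate    # ~1,851,852 seconds of sieving

# Factor counts per n-band, as reported in the post.
factors = {"n<=2M": 77, "2M<n<=5M": 128, "n>5M": 472}
per_factor = {band: total_secs / count for band, count in factors.items()}
overall = total_secs / sum(factors.values())   # ~2735 sec/factor overall
```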

Regards

KEP

KEP 2013-10-17 18:05

Reserving 121T-129T

3T on old Quad
5T on I5 64-bit

rogue 2013-10-17 20:18

If someone wants to have fun, they could update sr2sieve to support AVX instructions...

pepi37 2013-10-17 21:04

[QUOTE=rogue;356557]If someone wants to have fun, they could update sr2sieve to support AVX instructions...[/QUOTE]

That would be great... and if it gets the same speed increase as LLR did, even better!

KEP 2013-10-18 13:34

[QUOTE=rogue;356557]If someone wants to have fun, they could update sr2sieve to support AVX instructions...[/QUOTE]

Is that something you're planning to do or is it meant as a joke :smile: ?

Lennart 2013-10-18 18:28

Reserving 129T-135T

Lennart

Lennart 2013-10-22 15:00

1 Attachment(s)
129T-135T Complete

Lennart

Lennart 2013-10-25 01:22

Reserving 135T-140T

Lennart

KEP 2013-10-28 10:07

1 Attachment(s)
121T-129T is complete!

Reserving 140T-144T

Lennart 2013-10-29 00:28

1 Attachment(s)
135T-140T Complete

Lennart

Lennart 2013-10-29 01:34

Reserving 144T-150T

Lennart

Lennart 2013-11-03 00:23

1 Attachment(s)
144T-150T complete



Reserving 150T-160T


Lennart

Lennart 2013-11-04 04:42

Reserving 160T-172T

Lennart

KEP 2013-11-06 21:50

1 Attachment(s)
140T-144T complete

Factors found is attached :)

Lennart 2013-11-09 07:34

1 Attachment(s)
150T-160T complete

Lennart

Lennart 2013-11-09 07:40

1 Attachment(s)
164T-172T complete

Lennart

Lennart 2013-11-11 06:46

1 Attachment(s)
160T-164T complete



Lennart

Lennart 2013-11-11 11:08

Reserving 172T-178T

Lennart

Lennart 2013-11-15 07:48

1 Attachment(s)
172T-178T Complete


Lennart

KEP 2013-11-16 11:33

Reserving 178T-188T ETA 10 days

Lennart 2013-11-19 09:45

Reserving 188T-202T

Lennart

Lennart 2013-11-19 15:30

Reserving 202T-210T

Lennart

Lennart 2013-11-23 20:26

1 Attachment(s)
188T-210T Complete

Lennart

KEP 2013-11-28 20:16

1 Attachment(s)
178T-188T is complete

gd_barnes 2013-11-29 05:42

Factors have been removed from the sieve file up to P=210T. :smile:

Jean Penné 2013-12-13 15:26

Congrats!
 
Congratulations on this new sieving effort, much more efficient than mine!
The 10 files I was still sieving have now reached 100T, but I won't continue this way...
Nevertheless, I will continue to make them available, for reference, on my personal page:

[url]http://jpenne.free.fr/ConjRus/[/url]

Regards,
Jean

Lennart 2013-12-14 00:03

Reserving 210T-216T


Lennart

Lennart 2013-12-31 08:55

210T-216T Complete

Sorry I forgot this one :)


Reserving 216T-222T

Lennart

Lennart 2013-12-31 20:58

1 Attachment(s)
Forgot the file

Lennart

Lennart 2014-01-05 17:45

1 Attachment(s)
216T-222T Complete

Reserving 222T-228T


Lennart

Lennart 2014-01-09 12:22

1 Attachment(s)
222T-228T Complete

Reserving 228T-234T


Lennart

Lennart 2014-01-16 22:58

1 Attachment(s)
228T-234T complete

Reserving 234T-240T

Lennart

Lennart 2014-01-21 22:15

1 Attachment(s)
234T-240T complete

Reserving 240T-246T

Lennart

Lennart 2014-01-25 17:33

1 Attachment(s)
240T-246T complete

Reserving 246T-252T

Lennart

Lennart 2014-01-29 16:02

1 Attachment(s)
246T-252T complete

Reserving 252T-258T

Lennart

Lennart 2014-02-02 14:42

1 Attachment(s)
252T-258T complete

Reserving 258T-264T

Lennart

Lennart 2014-02-06 15:19

1 Attachment(s)
258T-264T complete

328sec/factor @ 6 core

Reserving 264T-270T



Lennart

Lennart 2014-02-10 16:44

1 Attachment(s)
264T-270T complete

Reserving 270T-276T

359 sec/factor @ 6 core
Lennart

Lennart 2014-02-14 18:57

1 Attachment(s)
270T-276T complete

Reserving 276T-282T

353 sec/factor @ 6 core
Lennart

Lennart 2014-02-18 22:13

1 Attachment(s)
276T-282T complete

Reserving 282T-288T


Lennart

Lennart 2014-02-25 00:36

1 Attachment(s)
282T-288T complete

Reserving 288T-294T


Lennart

Lennart 2014-02-28 23:17

1 Attachment(s)
288T-294T complete

Reserving 294T-300T


Lennart

Lennart 2014-03-04 19:01

1 Attachment(s)
294T-300T complete

Reserving 300T-306T


Lennart

Lennart 2014-03-09 00:26

1 Attachment(s)
300T-306T complete

Reserving 306T-312T


Lennart

Lennart 2014-03-13 13:10

1 Attachment(s)
306T-312T complete

Reserving 312T-318T


Lennart

Lennart 2014-03-17 17:51

1 Attachment(s)
312T-318T complete

Reserving 318T-324T


Lennart

Lennart 2014-03-21 15:37

1 Attachment(s)
318T-324T complete

Reserving 324T-330T


Lennart

Lennart 2014-03-25 17:33

1 Attachment(s)
324T-330T complete

Reserving 330T-336T


Lennart

Lennart 2014-03-30 00:23

1 Attachment(s)
330T-336T complete

Reserving 336T-342T


Lennart

Lennart 2014-04-03 00:25

1 Attachment(s)
336T-342T complete

Reserving 342T-348T


Lennart

Lennart 2014-04-06 23:25

1 Attachment(s)
342T-348T complete

Reserving 348T-354T


Lennart

Lennart 2014-04-10 20:09

1 Attachment(s)
348T-354T complete

Reserving 354T-360T


Lennart

Lennart 2014-04-14 16:29

1 Attachment(s)
354T-360T complete

Reserving 360T-366T


Lennart

Lennart 2014-04-18 13:49

1 Attachment(s)
360T-366T complete

Reserving 366T-372T


Lennart

Lennart 2014-04-25 15:09

1 Attachment(s)
366T-372T complete

Reserving 372T-378T


Lennart

Lennart 2014-04-29 13:45

1 Attachment(s)
372T-378T complete

Reserving 378T-384T

It is about 1hr/factor now.

Lennart

gd_barnes 2014-04-29 20:42

[QUOTE=Lennart;372271]It is about 1hr/factor now.[/QUOTE]

Perhaps you can stop at P=400T and we'll attempt an optimum sieve depth calculation. I'd like to break off n=2M-5M for a PRPnet team drive. That range may be more than sufficiently sieved by 400T.

Lennart 2014-04-30 14:33

Reserving 384T-392T


Lennart

Lennart 2014-05-03 10:51

1 Attachment(s)
378T-384T complete

Reserving 392T-398T


Lennart

Lennart 2014-05-09 18:38

1 Attachment(s)
388T-392T complete


Reserving 398T-400T


Lennart

gd_barnes 2014-05-09 22:23

I assume you're still working on 384T-388T?

It will shortly be time to do some optimum sieve depth calculations.

When you get a chance, can you run two tests and let me know your test time for both? One should be a candidate at n=~3M and the other a candidate at n=~3.8M. I'd like to look at this in two different ways and between us, we can figure out if we think we've sieved far enough. (We probably have.)

Also please provide your factor removal rate for your last file of 398T-400T.

Lennart 2014-05-10 00:14

OK, I will give you the factor rate. What type of computer would you like me to use? With or without AVX?

Lennart

gd_barnes 2014-05-10 00:22

[QUOTE=Lennart;373102]Ok I will give you the factor rate. What type of computer do you like me to use ? with or without AVX ?.

Lennart[/QUOTE]

I don't know much about AVX. Regardless, here is my thought: Use the machine with the fastest siever for the sieving rate and the one that is the fastest primality tester for the testing rate, even if those are different machines.

Lennart 2014-05-10 14:03

1 Attachment(s)
384T-388T complete

p=387999869963633, 11614009 p/sec, 374 factors, 100.0% done,
sr2sieve 1.8.10 stopped: at p=388000000000000 because range is complete.
Found factors for 374 terms in 354062.767 sec. (expected about 410.49)


This is on 4 cores.

Lennart

Lennart 2014-05-12 06:55

1 Attachment(s)
398T-400T complete

sr2sieve 1.8.10 stopped: at p=400000000000000 because range is complete.
Found factors for 189 terms in 203284.589 sec. (expected about 198.38)



Lennart

Lennart 2014-05-13 14:10

1 Attachment(s)
392T-398T complete

Found factors for 577 terms in 365189.777 sec.

On 6 cores.



Lennart

gd_barnes 2014-05-13 20:38

Looks like we're complete to P=400T! Thank you very much Lennart. :-)

Can you run a couple of primality tests and let me know how long they take? One should be a candidate at n=~3M and the other a candidate at n=~3.8M. To make it quick, what I do is get the iteration rate after about a couple of minutes of testing and then extrapolate the rate out to a full test.

Lennart 2014-05-13 21:08

[QUOTE=gd_barnes;373387]Looks like we're complete to P=400T! Thank you very much Lennart. :-)

Can you run a couple of primality tests and let me know how long they take? One should be a candidate at n=~3M and the other a candidate at n=~3.8M. To make it quick, what I do is get the iteration rate after about a couple of minutes of testing and then extrapolate the rate out to a full test.[/QUOTE]

I am doing that now. 3M is about 5000 sec on the +1 side.
[2014-05-13 22:51:37 WEDT] Server: PSPfp, Candidate: 101746*2^3000264+1 Program: llr64.exe Residue: FAAA9DBA2026663F Time: 5042 seconds
Lennart

Lennart 2014-05-13 22:09

[2014-05-13 23:51:33 WEDT] Server: PSPfp, Candidate: 101746*2^3800072+1 Program: llr64.exe Residue: 6F3FBC968E0255C4 Time: 8627 seconds



Lennart

gd_barnes 2014-05-14 00:24

Lennart,

Thank you for the tests. Much to my surprise, it looks like we need quite a bit more sieving. (Possibly because your sievers are so fast.) What I did was extrapolate the factor removal rate over just the range that we want to break off and test:

At P=384T, the sieving rate was 11,614,009 p/sec and the expected number of factors for a P=4T range was 410.49. That amounts to 839 secs/expected factor for the entire n=1-16777216 range.

Since n=2M-5M is only 17.88% (3M/16.77M) of the entire n-range of the file, that means we are removing factors from the n=2M-5M portion at a rate of 839/.1788 = 4692 secs/fac. Next we take the test time of a candidate at 60% of that n-range (i.e. n=3.8M) and you got 8627 secs.

Now we take the test time and divide it by the current factor removal rate to see how much further we need to sieve. So...8627 / 4692 = 1.84. Multiplying 1.84 by 384T gives 706T.

And finally we make downward adjustments for 2 reasons:
1. If the sieve file stayed the same size, the removal rate would slow in a linear fashion. So if the removal rate was 4000 s/fac at P=400T, it would be 8000 s/fac at P=800T. But the file does not stay the same size; factors are removed. I've generally found that when we are close to 1/2x of the optimum depth, subtracting around 5% is a good estimate.
2. We will find some primes so not all of the candidates will need to be tested. Since almost all of these k's are abnormally low weight, we'll make a rough estimate of 2-3%.

So...subtracting off 7-8% of 706T leaves us in the ballpark of P=650T. Therefore I think we need to sieve to P=650T before breaking off any testing. I will change the first post to reflect where we need to sieve to.
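
The estimate above condenses into a few lines of arithmetic. All inputs are figures quoted in this thread (the sieve rate and expected factor count from Lennart's P=384T log, the n=~3.8M test time, and the rough 7-8% downward adjustment):

```python
# Sketch of the optimum-sieve-depth estimate described above.
secs = 4e12 / 11_614_009            # seconds to sieve a P=4T range at 11.6M p/sec
sec_per_factor = secs / 410.49      # ~839 sec per expected factor, whole file

frac = 3.0 / 16.77                  # n=2M-5M share of the n-range (~17.9%)
sec_per_factor_2M_5M = sec_per_factor / frac   # ~4692 sec/factor in that band

test_time = 8627                    # LLR test time at n=~3.8M, from Lennart
ratio = test_time / sec_per_factor_2M_5M       # ~1.84
raw_depth_T = ratio * 384           # extrapolated depth: ~706T

# Subtract ~7-8% for file shrinkage and primes found -> ballpark P=650T.
adjusted_T = raw_depth_T * (1 - 0.075)
```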

The reason that I requested the n=3M test is that I wanted to do this same calculation for n=0-5M as if we never tested this file at all. Interestingly the optimum sieve depth was very similar (~630T), which is what I had hoped for. (The similarity was surprising since your test times were so different.)

If your sievers are still available, continue firing away. :smile:


Gary

Lennart 2014-05-14 00:59

Reserving 400T-410T


Lennart

VBCurtis 2014-05-15 04:21

[QUOTE=gd_barnes;373408]Lennart,
At P=384T, the sieving rate was 11,614,009P/sec. and the expected number of factors for a P=4T range was 410.49. That amounts to 839 secs/expected fac. for the entire n=1-16777216 range.

Since n=2M-5M is only 17.88% (3M/16.77M) of the entire n-range of the file, that means we are removing factors from the n=2M-5M portion at a rate of 839/.1788 = 4692 secs/fac. Next we take the test time of a candidate at 60% of that n-range (i.e. n=3.8M) and you got 8627 secs.
Gary[/QUOTE]

Gary-
How is it relevant to divide factor time for the portion of the range you care about? I don't understand your methodology, and don't believe it is correct. By your logic, it would be correct to break off 3-3.5M at this time, because that's just 1/32nd of the candidates and 839*32 is >3x time per test. That's silly.

Since you will (eventually) continue the sieve from 5M-up, you want to calculate the extra time it takes to sieve 1-16M vs 5-16M, and divide that extra (marginal) time by the number of factors you'd find in 0-5M. This is the actual time per factor for the 0-5M region. As it happens, factors in 0-2M are only half as useful as 2-5M, since they only save you a double-check; you could adjust for that if you wish.

In my experience, marginal calculations lead us to sieve until the time per factor is double the time per test of the smallest candidates, and then break off a sizable (say, 0-3M in this case) chunk. But you wish to double-check your tests eventually, so each factor found saves you two tests except in 0-2M range.

gd_barnes 2014-05-15 10:14

I had a feeling you might chime in here. :smile: As you probably figured out, technically what we are doing here is sieving until the factor removal rate in our specific n-range is equal to the average testing time. I realize there is a slippery slope on a couple of things so I will explain:

First, there is no guarantee that we will double check in the future so I'm not taking that into account. Second, I would never suggest using this calculation where the break off n-range is such that nmax/nmin < 2. (We're doing n=2M-5M so it's 2.5 here). Otherwise you would "optimally" sieve to P=1K (or less, lol) for, say, n=2M to 2.01M in a file with an n-range of n=~16.77M. That is you would have sieved "enough" because there is such a teeny percentage of the candidates in the "break off" range. :confused:

I realize that the marginal rate that you have shown me and demonstrated to others in the past is technically more accurate in the [I]extreme[/I] long run, i.e. 5-10 years or more. (I think it will take us > 5 years to complete the testing of this entire file to n=16.77M.) The reason that I don't use it is that it would result in an astronomical optimum sieve depth if we were sieving a monstrous n-range. But I think we have to keep in mind that testing/sieving software speeds have almost always changed dramatically over such long periods of time. This becomes a slippery slope in the opposite direction of the above example. Here, let's say we decided to sieve n=1 to 10^9 because technically it's more efficient in the extreme long run, but we still wanted to begin testing at n=2M. We would end up sieving for years to get the marginal factor removal rate up to where it needed to be for breaking off n=2M-5M. For that matter, it might not even make sense to break off such a small n-range, which likely would result in an even longer initial sieving effort.

Lennart 2014-05-15 11:47

Just in case it is not done I have checked all k's in this file with LLR up to n<600K.

I did not find anything.

Today I start 600k-800k.

Lennart

gd_barnes 2014-05-15 22:06

Excellent. Thanks Lennart.

Lennart 2014-05-16 15:00

600k-800k done nothing found


Starting 800K-1M


Lennart

VBCurtis 2014-05-16 23:20

Gary-
The fact that breaking off small pieces leads to comical depth estimates should be the only indication you need that your method doesn't make sense. My method is solely concerned with the shortest project length for sieving plus testing all candidates with LLR. That may not be the important metric, as you indicate by worrying about software/tech changes as time marches on. If those worries play a large role, you have chosen too large a sieve to start with; but you did, so let's look at options right now.

Lennart's sieve speed should increase by 3-4% by using an n-min of 1M instead of 1. Since these double-checks will be finished this week, there's no point leaving those in the sieve! I would use an n-min of 2M, since the 1-2M range is already oversieved for double-checks.

However, the best plan would be to break off 2-3M whenever you feel like it, and continue to sieve 3-16M. When the LLR folks reach 3M, break off another 1M, etc. This assumes sieving will continue while others are LLRing; if that's not the case, I suppose the personal preference of the siever is the deciding factor. The optimal depth for 2-3M is still well higher than you are already (I estimate 2500T), but this number is so high because you built a very efficient sieve by choosing such an enormous n-range. If you had set out with a 2-5M sieve only, the optimal depth would be something like 1000T, and you'd be done already.

I don't think "optimal" matters in this case, though:
If optimal depth is 2500T, and you begin LLR at 500T instead, you can estimate the time "wasted" by LLRing instead of sieving by estimating the number of factors you would have found in the sieve and multiplying by the time difference between LLR test and marginal factor rate (usually half the average factor removal rate). You'd find factors for 5% or so of the file by going to 2500T instead of 500T, so the total time to test 2-5M might increase by 3% by starting now. You're willingly wasting 6% of sieve time already by keeping 0-2M in the sieve, so a "waste" of 3% of LLR time seems a small price to pay to get started on LLR now rather than a year from now!

tl;dr version: 'optimal' is about saving single-digit percentages of project time. Personal joy outweighs 3% time savings, so who cares about optimal?

Lennart 2014-05-18 08:13

[QUOTE=Lennart;373644]600k-800k done nothing found


Starting 800K-1M


Lennart[/QUOTE]


All test up to n<1M are done .. Nothing found


Lennart

gd_barnes 2014-05-18 08:55

It may not make much sense to you, but I've always felt that it makes a lot of sense to sieve a large n-range to the optimum depth of sieving a small (nmax/nmin > 2) range, so that the testing part of the effort doesn't take so long to get off the ground and we still get long-term sieving efficiency gains. (I'm not sure what optimum would be here for only an n=2M-5M file, but my guess is that it would be much less than the P=650T that we are doing here.) That is, why sieve n=2M-5M as it resides in an n=0-16.77M file to a greater depth than what the optimum depth would be for an n=2M-5M stand-alone file? After all, these sieve ranges are mostly arbitrary anyway. I just used the range originally chosen by Jean Penné for the base 2 even-n/odd-n k's. You'll have to admit that sieving this large n-range is much more efficient in the long run than sieving only n=2M-5M and then starting a new effort for n=5M-10M, etc., even if we're not sieving it to what the optimum would be over a 5-10 year sieving/testing effort for n<16.77M.

Breaking off the file in n=1M increments is too much hassle when you're running a project with 2000+ bases and multiple team drives.

Agreed that we could remove n=0M-1M now that Lennart has double-checked that range. Thanks again Lennart!

KEP 2014-05-18 12:55

A little off topic, however still worth mentioning:

On my SR383 search, I have been sieving n=1 to n=50M on all of the initially 52 k's remaining. The reason I ended up choosing to sieve from n=1 to n=50M, instead of from n=100K to n=200K, is as follows:

Your sieve speed for the entire range of n=1 to n=50M is reduced to about 1/10th compared to your sieve speed for n=100K to n=200K. However, your search range is also 500 times larger. So overall, even though you are sieving 10 times slower, you are still 50 times faster at your overall sieving, since you won't have to sieve 500 ranges of 100K n's.
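
KEP's back-of-envelope works out as follows; note the 1/10 speed factor is his estimate, not a measured value:

```python
# KEP's tradeoff: one big sieve file vs. many small ones.
small_range_n = 100_000           # n=100K to n=200K is 100K n's
big_range_n = 50_000_000          # n=1 to n=50M is 50M n's
range_ratio = big_range_n / small_range_n   # 500x more n covered per pass

speed_penalty = 10                # big file sieves ~10x slower (KEP's estimate)
net_gain = range_ratio / speed_penalty      # ~50x faster overall
```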

On a final note, completely off topic: I'm going to give a status update on SR383 in about 2-3 weeks, once my doublecheck of SR383 for n<=100K is complete.

Take care

Kenneth

VBCurtis 2014-05-18 17:08

[QUOTE=gd_barnes;373750] (I'm not sure what optimum would be here for only an n=2M-5M file but my guess is that it would be much less than the P=650T that we are doing here.) That is, why sieve n=2M-5M as it resides in a n=0-16.77M file to a greater depth then what the optimum depth would be for an n=2M-5M stand alone file?
[/QUOTE]

I didn't pull 1000T out of thin air; I believe that would have been optimal for sieving 2-5M on its own. By sieving a larger file to a lower depth, you are trading sieve efficiency for LLR efficiency. Which saves more time: a sieve 50% more efficient, or LLRing 2% fewer numbers? Hint: sieve time is roughly 5% of LLR time.

As it turns out, that tradeoff is nearly even, so either choice is fine- but to think the sieve efficiency is such a big deal is an illusion, one which slows the onset of finding primes.

Lennart 2014-05-21 14:06

1 Attachment(s)
Complete 400T-410T

I will take a break now, I have some DC work todo.

Lennart

TheCount 2014-06-03 11:39

Reserving 410T-415T

5 cores:
sr2sieve-x86_64-windows.exe -p 410e12 -P 411e12 -i sieve-riesel-sierp-even-odd-k-n.txt
sr2sieve-x86_64-windows.exe -p 411e12 -P 412e12 -i sieve-riesel-sierp-even-odd-k-n.txt
sr2sieve-x86_64-windows.exe -p 412e12 -P 413e12 -i sieve-riesel-sierp-even-odd-k-n.txt
sr2sieve-x86_64-windows.exe -p 413e12 -P 414e12 -i sieve-riesel-sierp-even-odd-k-n.txt
sr2sieve-x86_64-windows.exe -p 414e12 -P 415e12 -i sieve-riesel-sierp-even-odd-k-n.txt


TheCount

TheCount 2014-06-03 12:58

[QUOTE=TheCount;374929]Reserving 410T-415T[/QUOTE]
2.08M p/sec
ETA 09 Jun 08:49 (WST)

TheCount

Lennart 2014-06-03 19:23

Reserving 415T-420T

Lennart

Lennart 2014-06-07 09:14

1 Attachment(s)
415T-420T Complete

Lennart

