mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   3*2^n-1 Search (https://www.mersenneforum.org/forumdisplay.php?f=14)
-   -   What is the status of the search and another link (https://www.mersenneforum.org/showthread.php?t=503)

EDillio 2003-04-16 15:11

Paul, my email is not working too well today for some reason; everything is running OK so far :?

paulunderwood 2003-04-17 20:16

Thanks for helping out Ed :D

All blocks for 'n' below 355,000 have been taken. There are blocks available up to 400,000 for anyone to test. Anything higher is in the sieve, which should be up to divisors of 2.4 trillion by the end of the weekend.

Xyzzy 2003-04-19 21:40

I just ran a block on my 1.5GHz Celery... It took around 5 days... It was very well behaved and put a minimal memory hit on my computer...

Paul is very easy to work with... If I had a real computer and not a laptop I would seriously consider running this full time...

I try to run through at least one "block" of whatever we have added to the "other projects" forums just to make sure there aren't any weird issues to deal with...

Anyways, it was fun and educational! Thanks Paul! (YGM!)

paulunderwood 2003-04-20 09:50

Thanks for your help Mike. :)

I have sieved up to 2.4 trillion -- it was eliminating a candidate on average every 2413 seconds. I will immediately do a further 1.5 trillion on the 400,000-1,000,000 range and then review the sieve timings.

Here are some LLR timings on my 1GHz Athlon:

3*2^400011-1 Time : 2364.714 sec.
3*2^500006-1 Time : 3861.034 sec.
3*2^600023-1 Time : 5595.876 sec.
3*2^700003-1 Time : 11707.696 sec.
3*2^800023-1 Time : 13288.771 sec.
3*2^900000-1 Time : 18232.217 sec.
3*2^999988-1 Time : 23783.626 sec.

What do you consider to be the ideal cut-off time for the sieve :question:
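
One rough way to frame that cut-off question, using the timings quoted above: the usual rule of thumb is to keep sieving while NewPGen removes a candidate faster than LLR could test one. The little Python sketch below is only my own illustration of that rule -- the timings are Paul's, but treating the saving per elimination as the plain average of the sample LLR times is a simplification.

[code]
# Rough sketch of the usual sieve cut-off rule (illustration only, not
# project code): keep sieving while one NewPGen elimination is cheaper
# than one average LLR test over the remaining n-range.

llr_times = {          # n -> LLR time in seconds (Paul's 1GHz Athlon figures)
    400011: 2364.7, 500006: 3861.0, 600023: 5595.9, 700003: 11707.7,
    800023: 13288.8, 900000: 18232.2, 999988: 23783.6,
}

# A factor found by the sieve can knock out a candidate anywhere in the
# range, so the saving per elimination is roughly the average LLR time
# (a simplifying assumption).
avg_llr = sum(llr_times.values()) / len(llr_times)

def keep_sieving(secs_per_elimination):
    """True while one sieve elimination costs less than one average LLR test."""
    return secs_per_elimination < avg_llr

print("average LLR time: %.0f sec" % avg_llr)   # about 11262 sec
print(keep_sieving(2413))   # at p = 2.4 trillion -- still worth sieving
print(keep_sieving(2835))   # at p = 3.5 trillion -- still worth sieving
[/code]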

TTn 2003-04-20 12:50

Sieving ideas
 
I thought it might be nice to start a thread of sieving tips for new users.

I'll start with my amateur techniques.
First, start with an estimated time for the project at hand:

A. First I start a NewPGen file (I'm not sure of the optimum size of exponent range).
I like to search a run larger than 3000.

B. Then, after a short time, stop the sieve and test my particular computer with the largest/last number in the file (a couple of tests is nice, but not practical with larger exponents).

C. Record the number of seconds it takes to test with LLR. **Important, as computers vary enormously.**
I never sieve longer per elimination than that recorded time, and sometimes only 80% of it, depending on the size of the file.

D. Larger files can be broken up into smaller ones and re-sieved, not exceeding the recorded time (by choosing the option to sieve until the rate of k/n removal is __ seconds). I THINK this helps in theory, since larger files exclude many composites but then become cumbersome due to the bias of the recorded time. In theory one could continue breaking the file up from a larger one. We need some kind of derivative equation??? (A rough version is sketched just after this list.)

E. Fixed-n searches can be improved by excluding prime k, as they are not as likely to produce a prime.
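
(One way to write down the "derivative equation" item D asks for -- this is just a rough sketch of the standard argument, not anything official. Write S(p) for the time spent sieving to depth p, N(p) for the number of candidates left at depth p, and T for the average LLR test time. The total work if you stop sieving at depth p is W(p) = S(p) + T*N(p). Setting dW/dp = S'(p) + T*N'(p) = 0 gives S'(p) = -T*N'(p), i.e. stop sieving at the depth where the sieve time spent per candidate removed has grown to equal the LLR test time.)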


Please correct me if I am wrong anywhere here!
Please suggest any tips you have picked up along the way.....

Shane F. :) :D :D ;)

paulunderwood 2003-04-20 15:40

[quote]First start with an estimated time for the project at hand,
[/quote]

I'd say 13-14 GHz years :(

[quote]I like to search a run larger than 3000, and smaller than 100000. [/quote]

We are going up to 1,000,000 -- It takes some 1,000 times as much time to find a prime that is 10 times longer.
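
(My own back-of-the-envelope for where that factor of roughly 1,000 comes from: an LLR test at length n takes on the order of n^2 work, and by the prime number theorem the chance that a candidate of length n is prime falls off like 1/n, so ten times the length means about 10^2 * 10 = 1,000 times the work per prime found.)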

[quote]Larger files can be broken up into smaller ones, and re-sieved not exceeding recorded length[/quote]

The help in NewPGen says:

To get the best performance, it is recommended that you use a wide range of n values (at least 3,000), as then NewPGen can use an algorithm that has a speed that is proportional to the square root of the range of n values being sieved (so a range that is 4 times as wide only takes twice as long to sieve).

[quote]Fixed n searches, can be improved by excluding prime k, as they are not as likely to produce a prime.
[/quote]

I can't agree with you on this unless you give some analysis. Remember you have different prime densities with different 'k' -- sieving and LLR testing times vary.

So my question remains:
[quote]What do you consider to be the ideal cut-off time for the sieve?[/quote]

Lastly: Shane, join us ;)

TTn 2003-04-22 11:57

It has been explained to me like this:

"If p divides k, then p does not divide k*2^n-1. This means that N=k*2^n-1 is prime with probability k/phi(k) * 1/ln(N) instead of probability 1/ln(N).

For k<100 with k*2^n-1 prime, this moves the probability that k is prime from 1 in 4 to 1 in 7; for k<1000, the probability of k prime moves from 1 in 6 to 1 in 11; for k<10000, the probability moves from 1 in 8 to 1 in 16."


I understand that highly composite k, especially, eliminate those possible factors of (k*2^n-1). But maybe this is a catch-22, depending on the particular k in question?
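
For anyone who wants to see what that k/phi(k) weight actually does, here is a quick toy check of the quoted formula (my own throwaway code, nothing to do with the search software):

[code]
# Toy check of the quoted weight: N = k*2^n - 1 is prime with probability
# about (k/phi(k)) * 1/ln(N), so k with many small prime factors get a
# boost, while a large prime k gets almost none.

def phi(k):
    """Euler's totient by trial division."""
    result, m, p = k, k, 2
    while p * p <= m:
        if m % p == 0:
            while m % p == 0:
                m //= p
            result -= result // p
        p += 1
    if m > 1:
        result -= result // m
    return result

for k in (3, 7, 15, 105, 997):        # a mix of prime and composite k
    print(k, round(k / phi(k), 3))
# prints: 3 1.5, 7 1.167, 15 1.875, 105 2.188, 997 1.001
[/code]

So a prime k only gets the small p/(p-1) boost (k=3 still gets 1.5, because 3 never divides 3*2^n-1), while a k with many small prime factors does noticeably better per candidate -- though, as Paul says, the sieving and LLR times vary with k too.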


I would like to know the reason why three was chosen for k in this project.
Shouldn't we look for the shortest distance, to find the most primes per CPU cycle? Surely this is not it?

I may join with one of my computers for the hell of it though, as I have 15 GHz at my disposal now.

So you see that k=3 is not likely to be practical for such a search.
I am confident, however, that some k has a 'frequent' form (a shortest distance, if you will), and it would make for the ultimate collaborative effort to rival that of George's W.I.M.P.S!

TTn 2003-04-22 12:58

:D

smh 2003-04-22 15:46

Why break up the file?

It's [b]much[/b] more efficient to sieve the whole range at once since sieving time is proportional to the square root of the range.

Sieving a range 4 times longer will cost you only twice the time to sieve (and thus you will be removing more candidates in a given time)
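
To put rough numbers on that (my own illustration, just applying the square-root rule quoted from the NewPGen help): the 400,000-1,000,000 range is six times as wide as a single 100,000-wide chunk, so sieving it as one file costs about sqrt(6) ~ 2.4 times the one-chunk cost, while sieving six separate chunks to the same depth costs 6 times the one-chunk cost. The single big file therefore reaches any given sieve depth roughly 2.5 times faster.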

TTn 2003-04-22 22:17

I see. Maybe the file should have started bigger than the original intention. There is a cutoff point somewhere: if the sieve ends up spending ~23783 sec per elimination over the entire file, then when it grabs a composite at that rate down in the 400011-500006 part of the file it cancels [u]some[/u] of that efficiency, since an LLR test there would only take ~3000 seconds at most -- roughly 10 times less -- rather than the 1/2 ratio.

This is a great question!
:rolleyes: :oops:

paulunderwood 2003-04-23 00:31

:( thanks for joining our search, Shane.

There are 17 blocks being tested 8)

5 blocks are available for 'n' below 400,000

At 'p' equals 3.5 trillion, NewPGen is eliminating a candidate every 47:15 minutes (2835 seconds).

