-   3*2^n-1 Search
-   -   What is the status of the search and another link

garo 2003-04-08 04:05

What is the status of the search and another link
Since the Announcement message was locked I thought I'd start this new post.

LLR can also be found at
for those of you who are not Yahoo members and do not want the hassle of signing up to yet another group.

Paul, it would be a good idea if you gave us a simple explanation of what you are trying to achieve, what the steps are - e.g. what sieving etc. - and what the current status is. I know you answered several of these questions in your original post - which I moved to the Psearch forum - Will/Mike, can you move that post here? But it would be a good idea to restate all of that, preferably in the announcement thread.

paulunderwood 2003-04-08 04:30

Re: What is the status of the search and another link
We are building on work done on primes of the form k*2^n-1 by Wilfred Keller et al:

Therefore the known values of 'n' that make 3*2^n-1 prime are: 1, 2, 3, 4, 6, 7, 11, 18, 34, 38, 43, 55, 64, 76, 94, 103, 143, 206, 216, 306, 324, 391, 458, 470, 827, 1274, 3276, 4204, 5134, 7559, 12676, 14898, 18123, 18819, 25690, 26459, 41628, 51387, 71783, 80330, 85687, 88171, 97063, 123630, 155930, 164987
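The smaller entries in that list can be spot-checked in a few lines. This is only a sketch using a Miller-Rabin strong probable prime test; the project itself relies on the deterministic Lucas-Lehmer-Riesel test in LLR:

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin strong probable prime test (probabilistic; the
    project itself uses the deterministic Lucas-Lehmer-Riesel test)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

# The first entries of the known-prime list above:
known = [1, 2, 3, 4, 6, 7, 11, 18, 34, 38, 43, 55, 64, 76, 94, 103,
         143, 206, 216, 306, 324, 391, 458, 470, 827, 1274]
assert all(is_probable_prime(3 * 2 ** n - 1) for n in known)
assert not is_probable_prime(3 * 2 ** 15 - 1)   # 98303 = 197 * 499
```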

The domain for 'n' in this search is initially 191600-1000000. I started sieving with Paul Jobling's NewPGen on the 2nd of this month using a 1GHz Athlon. Eleven blocks of work (191600-300000) became available on the 5th. Of those, one was completed in a couple of days using a P4 and LLR. Three of us are currently processing five blocks. This leaves five blocks and already the sieve is approaching the point where I release some more for testing.

Unlike Mersenne primes, these numbers don't have to have a prime binary length. In binary they look like a 1, a 0, then n ones - for example 3*2^4-1 = 47 is 101111.
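That binary pattern is easy to verify: 3 is 11 in binary, multiplying by 2^n appends n zeros, and subtracting 1 borrows through those zeros:

```python
# 3*2^n - 1 in binary is "10" followed by n ones:
# 3 = 11, so 3*2^n = 11 followed by n zeros, and subtracting 1
# borrows through the trailing zeros, giving 10 followed by n ones.
for n in (2, 4, 7):
    s = bin(3 * 2 ** n - 1)[2:]
    assert s == "10" + "1" * n
    print(n, s)
```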


ET_ 2003-04-08 07:12

I have a 1 GHz Athlon available that will not be connected to the Internet for a while.

Which range(s) may I take to help you?


paulunderwood 2003-04-08 23:25

Luigi, I have mailed you a block.

We are now five people with over 10 GHz of computers and there are three blocks available for anyone to test -- 'n' less than 300,000.

I am sieving 'n' ( 300,000 - 1,000,000 ). I just checked NewPGen and it is knocking out one candidate every 4:12 minutes. The target is over 5 minutes for the release of blocks from 300,000 to 400,000 for testing. NewPGen's 'p', the current divisor being checked, is 366.2 billion.

garo 2003-04-09 01:10

Regarding sieving, have you looked at the discussions on the Seventeen or Bust forum? They have some very interesting ideas, and the efficiency of the sievers has gone up about 100 times since NewPGen. Paul Jobling, who wrote NewPGen, wrote a specialized version of it called SoBSieve, and it was much faster. Then Phil Carmody wrote NBeGone, which works for non-x86 computers as well. They also had some discussions about the algorithms behind the sieving.

[Edit: Corrections to programs attributed to different people.]

smh 2003-04-09 07:45

[quote]They have some very interesting ideas and the efficiency of the sievers has gone up about 100 times since Newpgen. Phil Carmody who wrote newpgen - I think - wrote a specialized version of newpgen called nbegone and it was much faster. Then Paul jobling also wrote sobsieve which is even faster.[/quote]

NewPGen was written by Paul Jobling, and many of the optimizations made for SoBSieve are now also in NewPGen. Also, a big part of the 'efficiency' is due to the fact that 12 k values are sieved at the same time, which is faster than sieving 12 k's separately.


If you are planning to search higher than N=1M, it's much more efficient to sieve the complete N range at once, since sieving time is proportional to the square root of the range.
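As an illustration of what a fixed-k sieve actually does, here is a toy sketch (my own function names; NewPGen and SoBSieve use much cleverer algorithms, e.g. discrete-log tricks, to get that square-root behaviour). A prime p divides 3*2^n-1 exactly when 2^n ≡ 3^(-1) (mod p), and such n recur with period equal to the order of 2 mod p, so each small prime strikes out an arithmetic progression of exponents:

```python
def small_primes(limit):
    """Primes up to limit by the classic sieve of Eratosthenes."""
    flags = bytearray([1]) * (limit + 1)
    flags[0:2] = b"\x00\x00"
    for i in range(2, int(limit ** 0.5) + 1):
        if flags[i]:
            flags[i * i :: i] = bytearray(len(flags[i * i :: i]))
    return [i for i in range(limit + 1) if flags[i]]

def sieve_321(n_min, n_max, p_max):
    """Toy fixed-k sieve for 3*2^n - 1 (illustration only).
    p divides 3*2^n - 1 exactly when 2^n = 3^(-1) (mod p); those n
    recur with period ord_p(2), so each prime strikes out an
    arithmetic progression of exponents."""
    survivors = set(range(n_min, n_max + 1))
    for p in small_primes(p_max):
        if p in (2, 3):
            continue  # 3*2^n - 1 is odd and never divisible by 3
        target = pow(3, -1, p)  # modular inverse (Python 3.8+)
        # Walk the powers of 2 mod p to find the first hit and the cycle length.
        x, e, n0, order = 2 % p, 1, None, None
        while True:
            if x == target and n0 is None:
                n0 = e
            if x == 1:
                order = e
                break
            x, e = (x * 2) % p, e + 1
        if n0 is None:
            continue  # 3^(-1) is not a power of 2 mod p: p never divides
        n = n0
        while n <= n_max:
            if n >= n_min and 3 * 2 ** n - 1 > p:  # never strike out p itself
                survivors.discard(n)
            n += order
    return sorted(survivors)

# Survivors of a small sieve still need primality tests, but every
# genuinely prime 3*2^n - 1 must survive:
res = sieve_321(1, 100, 100)
assert {1, 2, 3, 4, 6, 7, 11, 18, 34, 38, 43, 55, 64, 76, 94} <= set(res)
```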

paulunderwood 2003-04-09 18:46

When to stop sieving and start using LLR, PRP or PFGW is difficult to establish. There are many things to take into account:
* it is nice to have some non-sieve testing happening on smaller 'n'
* the time taken to test '2*n' is just over four times that for 'n'
* sieving time is proportional to the square root of the range
* judging NewPGen is hard -- its elimination timing goes up and down
* LLR/PRP/PFGW timings vary, but not exactly proportionally to size
* different computer architectures
* demand

Originally, I was going to sieve in 100,000 'n' blocks. I have decided that the work involved to complete 'n' to 1 million is enough work to organise for now.

Of course Paul Jobling's NewPGen is to be attributed when submitting primes to the top 5000 primes. Phil Carmody helped Paul with some maths to get a fixed-k sieve running faster.

After my last message NewPGen dropped to 3:20 minutes, but now reads 5:48 minutes. So by rough calculations the block 300,000-400,000 can be released for testing soon. There are two blocks left from 200,000-300,000 free for testing ( email me ). How I break up 300,000-400,000 depends on feedback from participants, but 2 GHz weeks seems about the right size for a block.

paulunderwood 2003-04-12 00:31

We now number seven. I have just taken 300,000 to 400,000 out of the sieve, leaving 42,423 candidates in there -- it had sieved up to NewPGen's p=553,420,892,818. I timed a PrP test on my Athlon 1GHz for 3*2^47000-1 at 3,600 seconds, or exactly an hour, meaning that NewPGen's new cut-off point for 'n' between 400,000 and 500,000 is 10-plus minutes. Meanwhile I have started to issue blocks of 5000 from 300,000 to 400,000; there are nineteen blocks left for anyone to help test -- please email me at to get a block.

[Edit: should be 3*2^470000-1]

smh 2003-04-12 11:25

[quote]I timed a PrP test on my Athlon 1GHz for 3*2^47000-1 at 3,600 seconds or exactly a hour meaning that NewPGen's new cut off point for 'n' between 400,000 and 500,000 is 10 plus minutes. [/quote]

I don't agree on this one. I think you should be sieving much deeper, close to 1 hour per candidate removal, especially when lower tests are still available to assign. Sieving from 400K-1M is only 10% slower than sieving from 500K-1M.

paulunderwood 2003-04-12 18:24

:surprised: :oops: Thanks for making this clearer to me, smh. I have now upped the sieving throughput by using NewPGen's service: the sieving has been split across three Athlons ( 1 TB and 2 XP's ). Each is doing a range of 600,000,000,000 (0.6 trillion). I will have to wait for the slowest to finish before merging the files. I guess this is going to take a week, but it takes the maximum divisor from 0.6 trillion to 2.4 trillion. I will then review the rates of candidate elimination.

Meanwhile we have two new members and there are seventeen blocks available for team members to test.

paulunderwood 2003-04-14 01:48

I don't want people to be put off joining the search because of the mistake I made in sieving. I have lost about 120 GHz hours in a project that is going to take maybe 120,000 GHz hours to complete. We still need your help. ;) So please write to me ( ) if you want to join in.

EDillio 2003-04-16 15:11

Paul, my email is not working too well today for some reason. Everything is running ok so far :?

paulunderwood 2003-04-17 20:16

Thanks for helping out Ed :D

All blocks for 'n' below 355,000 have been taken. There are blocks available up to 400,000 for anyone to test. Anything higher is in the sieve, which should be up to divisors of 2.4 trillion by the end of the weekend.

Xyzzy 2003-04-19 21:40

I just ran a block on my 1.5GHz Celery... It took around 5 days... It was very well behaved and put a minimal memory hit on my computer...

Paul is very easy to work with... If I had a real computer and not a laptop I would seriously consider running this full time...

I try to run through at least one "block" of whatever we have added to the "other projects" forums just to make sure there aren't any weird issues to deal with...

Anyways, it was fun and educational! Thanks Paul! (YGM!)

paulunderwood 2003-04-20 09:50

Thanks for your help Mike. :)

I have sieved up to 2.4 trillion -- it was eliminating a candidate on average every 2413 seconds. I will immediately do a further 1.5 trillion on the 400,000-1,000,000 range and then review the sieve timings.

Here are some LLR timings on my 1GHz Athlon:

3*2^400011-1 Time : 2364.714 sec.
3*2^500006-1 Time : 3861.034 sec.
3*2^600023-1 Time : 5595.876 sec.
3*2^700003-1 Time : 11707.696 sec.
3*2^800023-1 Time : 13288.771 sec.
3*2^900000-1 Time : 18232.217 sec.
3*2^999988-1 Time : 23783.626 sec.
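As a side note, the earlier rule of thumb that doubling 'n' roughly quadruples the test time can be checked against these timings; on this machine the measured growth comes out a bit steeper, roughly n^2.5 (FFT length changes make the curve jumpy), which is only a quick sketch, not a careful fit:

```python
import math

# LLR timings from the 1 GHz Athlon above (n: seconds)
timings = {
    400011: 2364.714, 500006: 3861.034, 600023: 5595.876,
    700003: 11707.696, 800023: 13288.771, 900000: 18232.217,
    999988: 23783.626,
}

ratio = timings[800023] / timings[400011]            # doubling n: ~5.6x, not 4x
alpha = math.log(ratio) / math.log(800023 / 400011)  # effective exponent, ~2.5
print(round(ratio, 2), round(alpha, 2))
```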

What do you consider is the ideal cut-off time for the sieve :question:

TTn 2003-04-20 12:50

Sieving ideas
I thought it might be nice to start a thread of sieving tips for new users.

I'll start with my amateur techniques.
First start with an estimated time for the project at hand.

A. First I start a NewPGen file ( not sure of the optimum size of exponent ).
I like to search a run larger than 3000.

B. Then, after a short time, stop the file and test my particular computer with the largest/last number in the file.( a couple of tests is nice but not practical with larger exponents)

C. Record the seconds it takes to test with LLR. **important as computers vary extremely**
I never sieve longer than the recorded length, and sometimes only 80% of the recorded length, depending on the size of the file.

D. Larger files can be broken up into smaller ones, and re-sieved not exceeding the recorded length ( by choosing the option to sieve until the rate of k/n is __ seconds ). I THINK this helps in theory, since larger files exclude many composites but then become cumbersome due to the bias of the recorded length. In theory one could continue breaking up the file from a larger one. We need some kind of derivative equation???

E. Fixed-n searches can be improved by excluding prime k, as they are not as likely to produce a prime.

Please correct me if I am wrong anywhere here!
Please suggest any tips you have picked up along the way.....

Shane F. :) :D :D ;)

paulunderwood 2003-04-20 15:40

[quote]First start with an estimated time for the project at hand,[/quote]

I'd say 13-14 GHz years :(

[quote]I like to search a run larger than 3000, and smaller than 100000. [/quote]

We are going up to 1,000,000 -- it takes some 1,000 times as much time to find a prime that is 10 times longer.

[quote]Larger files can be broken up into smaller ones, and re-sieved not exceeding recorded length[/quote]

The help in NewPGen says:

To get the best performance, it is recommended that you use a wide range of n values (at least 3,000), as then NewPGen can use an algorithm that has a speed that is proportional to the square root of the range of n values being sieved (so a range that is 4 times as wide only takes twice as long to sieve).

[quote]Fixed n searches, can be improved by excluding prime k, as they are not as likely to produce a prime.[/quote]

I can't agree with you on this unless you give some analysis. Remember you have different prime densities with different 'k' -- sieving and LLR testing times vary.

So my question remains:
[quote]What do you consider is the ideal cut-off time for the sieve ? [/quote]

Lastly: Shane, join us ;)

TTn 2003-04-22 11:57

It has been explained to me like this:

"If p divides k, then p does not divide k*2^n-1. This means that N=k*2^n-1 is prime with probability k/phi(k) * 1/ln(N) instead of probability 1/ln(N).

For k<100 with k*2^n-1 prime, this moves the probability that k is prime from 1 in 4 to 1 in 7; for k<1000, the probability of k prime moves from 1 in 6 to 1 in 11; for k<10000, the probability moves from 1 in 8 to 1 in 16."
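The k/phi(k) factor in the quoted heuristic is easy to compute (a quick sketch; phi here is Euler's totient, counted directly):

```python
from math import gcd

def phi(k):
    """Euler's totient, by direct count -- fine for small k."""
    return sum(1 for i in range(1, k + 1) if gcd(i, k) == 1)

# No prime dividing k can divide k*2^n - 1, so the heuristic prime
# probability is boosted from 1/ln(N) to (k/phi(k)) * 1/ln(N).
for k in (3, 15, 105):
    print(k, k / phi(k))   # 3 -> 1.5, 15 -> 1.875, 105 -> 2.1875
```

This is one reason a k divisible by small primes such as 3 and 5 (e.g. k=15) looks attractive under this heuristic.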

I understand that highly composite k especially eliminate possible factors of (k*2^n-1). But maybe this is a catch-22, depending on the particular k in question?

I would like to know the reason why three was chosen for k in this project.
Shouldn't we look for the shortest distance, to find the most primes per CPU cycle? Surely this is not it?

I may join with one of my computers for the hell of it though, as I have 15 GHz at my disposal now.

So you see that k=3 is not likely to be practical for such a search.
I am confident, however, that k has a frequent form ( a shortest distance, if you will ), and it would make for the ultimate collaborative effort to rival that of George's W.I.M.P.S!

TTn 2003-04-22 12:58


smh 2003-04-22 15:46

Why break up the file?

It's [b]much[/b] more efficient to sieve the whole range at once since sieving time is proportional to the square root of the range.

Sieving a range 4 times longer will cost you only twice the time to sieve (and thus you will be removing more candidates in a given time)

TTn 2003-04-22 22:17

I see; maybe the file should have started bigger than originally intended. There is a cutoff point somewhere. If you spend ~23783 sec per exponent on the entire file, then when it grabs a composite at this rate in the 400011-500006 range of this file, it cancels [u]some[/u] of that efficiency mentioned, since a test there should only take ~3000 seconds max, which is ~10 times faster, rather than the 1/2 ratio.

This is a great question!
:rolleyes: :surprised: :oops:

paulunderwood 2003-04-23 00:31

:( thanks for joining our search, Shane.

There are 17 blocks being tested 8)

5 blocks are available for 'n' below 400,000

At 'p' equals 3.5 trillion, NewPGen is eliminating a candidate every 47:15 minutes ( 2835 seconds ).

TTn 2003-04-26 11:07

If no one opposes this statement:

"If p divides k, then p does not divide k*2^n-1. This means that N=k*2^n-1 is prime with probability k/phi(k) * 1/ln(N) instead of probability 1/ln(N).

Then I would declare 15k*2^n-1 a viable search for the future, due to the evidence of the first 1001 k listed on the k*2^n-1 search. Also, I have probed k<5000, n<5000 with precise results: 15k produces the most primes! There are exceptions to the rule, but so far this is the evidence for little odd k.

Please counter this if you can! :question: :idea:

TTn 2003-04-26 12:03

I'd like to add that 3 is a great candidate to test.
I had previously doubted testing k=3, but I'm hip.

paulunderwood 2003-04-28 17:12

Yes, Shane, three is cool 8)

The sieving is up to 'p' equals 3.9 trillion. Unfortunately I had a power cut/outage and lost the rate at which my Athlon 1GHz was eliminating candidates. However, over about 160 hours it eliminated 184 candidates -- on average one every 52 minutes. I have set off the sieving computers on another 1.5 trillion.

I have had a word with Paul Jobling ( who wrote NewPGen, the sieve I am using ) and confirmed with him that I can increase sieving throughput on my slow Athlon and two relatively quick AthlonXP's by about twenty per cent. This is achieved by putting the highest range of 'p' on the slowest computer and the other two lower ranges of 'p' on the quicker computers. As soon as the two XP's have finished I can stop the slow computer and perform the merging of the files.

All blocks up to 'n' equals 420,000 have been assigned. Please email me ( ) to join in the search. I feel a prime find is imminent... ;)

TTn 2003-04-30 05:49

8) I would still like to discuss this cut-off time. :?
At the point where you are no longer pulling composites at a rate equal to the testing time of a given range, stop NewPGen, start LLR or PRP, test that range and delete it from NewPGen, and re-sieve the file.

s = Number of sieved n, so far.
t = Rate of sieved n, in seconds.
r = Number of sieved n, of a given range so far.(min-max)

(s*t)/r = NewPGen cut-off

The appropriate ranges are however the real question :question:
These probably aren't the best cut-off ranges, but what is?

/ 3*2^400011-1 Time : 2364.714 sec.
\ 3*2^500006-1 Time : 3861.034 sec. When (s*t)/r=3861 sec. cut.

3*2^600023-1 Time : 5595.876 sec.
3*2^700003-1 Time : 11707.696 sec.
3*2^800023-1 Time : 13288.771 sec.
3*2^900000-1 Time : 18232.217 sec.
3*2^999988-1 Time : 23783.626 sec.

(min)*1.618033988749...^r ?
:devil: :evil: :devil: :evil: :devil: :shock:

paulunderwood 2003-04-30 21:20

I am inclined to agree more with Sander ( smh ) than with you on the method of sieving. It seems better to keep candidates in the sieve for much longer than it first appears; so much so that it may be better to organise a bigger sieving effort than the current one :devil: .

( Firstly, let me make the assumption that candidates are uniformly random over 'n'. )

Suppose we decrease the range to a quarter of the original; the time taken decreases only to a half ( i.e. the square root -- as explained in the NewPGen help on fixed-k searches ).

As a second example, if we decrease the original range to a sixteenth of the original it will only take a quarter of the time.

In my last report about the sieving status of 'n' for 400,000 to 1,000,000 I said I had removed 184 candidates in 160 hours, which is about one every 52:17 minutes. Had I only sieved 420,000 to 1,000,000 I would have decreased the time taken to sqrt(58/60)*160, which is about 157.5 hours, but would have eliminated (58/60)*184 candidates, i.e. about 178 candidates. In the extra time gained ( 160-157.5 = 2.5 hours ) we could have tested an extra 3.8 candidates with LLR ( based on the timings given earlier ). Summarising: by leaving the candidates in the sieve we eliminate 184 candidates, but by doing the lower 2/60 by LLR and leaving the rest in the sieve we would eliminate 181.8 candidates; 2.2 fewer :shock:

On the bright side, I see sieving taking me another 4-6 weeks on my three Athlons. If anyone with experience of using NewPGen has an AthlonXP ( or better ) and wants to help me do some sieving, please contact me ( ). I am sieving 'p' in blocks that take an AthlonXP 1 week to complete -- Saturday to Saturday.

TTn 2003-05-01 09:32

I assume random placement as well.
But now I strongly disagree, due to many trials over the past few days.
There is a dual benefit: testing the first range when it is ripe, rather than rotten, frees up cycles for the main file you cut it from, which is now smaller and more edible as well.
Any given range of n ( min-max ) should not take more than twice the minimum testing time of that range. The main large file does not lose its benefits; it becomes shorter as time allows, after it has done its job on a range, so that does not differ from Sander's idea.
The large file is the way to start.

I can't see how you justify wasting much more than 4728 seconds to pull a composite out of this first range. Why allow the sieve to spend up to 23,783 seconds to pull one out? Are you suggesting sieving until 23,783 seconds per n, for the entire file at once? :surprised: :oops:

/ 3*2^400011-1 Time : 2364.714 sec. min
r :question:
\ ~ 3*2^550000-1 Time: 4728 sec.

" 9456 sec
" 18912 sec

3*2^999988-1 Time : 23783.626 sec.

paulunderwood 2003-05-01 10:02

I am sorry for not explaining when I will stop sieving. I plot the timings I gave earlier over 'n'. Then I calculate the average time by estimating where half the area falls. Then I add a little for good measure! Therefore my cut-off time will be somewhere in the region of 3:30-4:30 hours.
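For what it's worth, that half-area estimate can be reproduced from the timings posted earlier (a sketch, assuming trapezoidal interpolation between the seven data points over 400,000-1,000,000):

```python
import bisect

# LLR timings from the 1 GHz Athlon earlier in the thread (n, seconds)
ns = [400011, 500006, 600023, 700003, 800023, 900000, 999988]
ts = [2364.714, 3861.034, 5595.876, 11707.696, 13288.771, 18232.217, 23783.626]

# Cumulative trapezoid area = total testing work below each n
cum = [0.0]
for i in range(1, len(ns)):
    cum.append(cum[-1] + 0.5 * (ts[i] + ts[i - 1]) * (ns[i] - ns[i - 1]))

half = cum[-1] / 2
i = bisect.bisect_left(cum, half)        # segment holding the half-work point
frac = (half - cum[i - 1]) / (cum[i] - cum[i - 1])
n_half = ns[i - 1] + frac * (ns[i] - ns[i - 1])   # n below which half the work sits
t_half = ts[i - 1] + frac * (ts[i] - ts[i - 1])   # test time at that n
print(round(n_half), round(t_half))
```

This lands at roughly n = 825,000 and about four hours per test, consistent with the 3:30-4:30 region quoted above.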

3*2^800023-1 Time : 13288.771 sec.

TTn 2003-05-01 10:29

Check the rate of those n, being pulled between 400000-550000.
What if it was [i]consistently[/i] pulling them no faster than 4728 sec from here?

paulunderwood 2003-05-01 12:03

Suppose we are eliminating a candidate from the range 400,000-550,000 every 4728 seconds.

Method 1. We leave all the numbers in the sieve which means we remove one candidate from the whole range every 4728/4 seconds = 1182 seconds.

Method 2. We remove the range 400,000-550,000 from the sieve and continue sieving the rest. The time taken to sieve 0.75 of a candidate is sqrt(3/4)*1182 seconds = 1023 seconds. Assuming the average time for LLR on the range 400,000-550,000 is 3800 seconds, with the extra 158 seconds saved in sieving we can test 158/3800 of a candidate i.e. 0.04 of a candidate. Overall in 1182 seconds we would have eliminated only 0.79 of a candidate.
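The arithmetic in the two methods can be checked directly (using the rates assumed in the post):

```python
from math import sqrt

rate_range = 4728            # assumed: seconds per elimination within 400,000-550,000
rate_whole = rate_range / 4  # that range is 1/4 of the sieve: 1182 s per elimination

# Method 2: sieve only the remaining 3/4 of the range.
t_sieve = sqrt(3 / 4) * rate_whole   # ~1023.6 s to eliminate 0.75 of a candidate
spare = rate_whole - t_sieve         # ~158 s left over
llr_avg = 3800                       # assumed mean LLR time on 400,000-550,000
method2 = 0.75 + spare / llr_avg     # candidates eliminated per 1182 s
print(round(method2, 2))             # vs 1.0 for method 1
```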

paulunderwood 2003-05-06 07:11

Just a quick line to let you know how the sieving is progressing. All 'n' greater than 430,000 are being sieved. There are 37,318 'n' in the sieve. The greatest divisor tested is 5.4 trillion. In about 170 hours my 1 GHz Athlon eliminated 130 candidates -- about 78:28 minutes per elimination. The target for elimination on this computer is 42 candidates a week.

paulunderwood 2003-05-12 17:27

All candidate 'n' less than 445,000 have been assigned to our members for testing. There are 36,344 candidates left in the sieve, which has reached a maximum divisor of 6.9 trillion. My 1 GHz Athlon eliminated 69 candidates in about 150 hours, i.e. one every 2:10 hours. Remember, the target is about one in 4:00 hours. My three Athlons are now sieving another 1.5 trillion range of divisors.

wblipp 2003-05-14 14:56

[quote="TTn"]If no one opposes this statement:

"If p divides k, then p does not divide k*2^n-1. This means that N=k*2^n-1 is prime with probability k/phi(k) * 1/ln(N) instead of probability 1/ln(N).[/quote]

You're looking for an adjustment to the probability of prime estimated from the Prime Number Theorem as 1/ln(x). That estimate applies to consecutive integers; in consecutive integers we know that 1/2 are divisible by 2, 1/3 of the remainder are divisible by 3, 1/5 of the remainder are divisible by 5, 1/7 of the remainder are divisible by 7 etc. You correctly note that divisibility is different for k. But it's different for other numbers, too. 3*2^k-1 is never divisible by 2, 7, 17, 31 and others. 1/4 are divisible by 5 and 1/18 are divisible by 19, but half of the numbers divisible by 19 are also divisible by 5.

What you need is the "Proth-Minus" Weight of k, analogous to Proth Weight. Proth Weight is the adjustment needed for k*2^n+1. Yves Gallot's [url=]paper on weights[/url] and Jack Brennen's [url=]Java calculator[/url] are places to start learning about Proth Weight. The concepts for k*2^n-1 should be identical.

From personal experience, the best way to estimate these weights is to follow the methods of Brennen and Gallot to calculate the approximate effect of large numbers of primes. I tried calculating the exact effect of small numbers of primes, and ended up with inferior estimates that gradually converge to the Brennen and Gallot estimates as the "small number" gets bigger.

TTn 2003-05-14 18:53

I've been studying the proth weights too, and will add some info to my website about it. Though there is some validity to my 15k search.

BTW my project has just found 465*2^248940-1 is prime!

paulunderwood 2003-05-18 06:38

A 1.4 trillion range of divisors has been completed in 5 days; my 1GHz Athlon removed 56 candidates -- no change over the previous week's rate of candidate elimination :evil: . I have set up another 2 trillion range of divisors, which will take the maximum divisor tested to over 10 trillion.

There is no hurry. Let me sieve and sieve. And when I say "ok" then join in in force. Otherwise things are ticking over nicely on the participants' testing side of the search.

paulunderwood 2003-05-25 11:42

The maximum divisor in the sieve has reached 10.2 trillion on the range 455,000 to 1,000,000, with 34,940 candidates left. In 165 hours my 1 GHz Athlon eliminated 62 candidates -- about one every 2:40 hours. I have set four computers sieving a further 2.8 trillion range of divisors. I feel the sieving phase is going to be completed within the next week or two, after which I will have a chance of finding a prime by doing some testing myself.

Using a formula based on the approximation that testing time quadruples for each doubling of 'n', we can see the project's progress has reached ( assuming all tests given out have been completed ):

(455000^3-191600^3)/(1000000^3-191600^3) = 0.08778 :banana:
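That figure follows from modelling the test time as proportional to n^2 (quadrupling per doubling), so the total work up to N scales like N^3:

```python
# Progress = work done / total work, with work up to N proportional to N^3
# (test time ~ n^2, integrated over the candidates up to N).
n_start, n_done, n_goal = 191_600, 455_000, 1_000_000
progress = (n_done ** 3 - n_start ** 3) / (n_goal ** 3 - n_start ** 3)
print(f"{progress:.5f}")   # 0.08778
```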

paulunderwood 2003-06-01 12:08

The fourth machine I used for sieving was a 2.4 GHz P4 and it behaved like a 1 GHz Athlon. Fortunately, Thomas Ritschel is helping me out with a fast Athlon and I have put the P4 on LLR testing 8)

We have reached a maximum divisor of 12.8 trillion. My 1 GHz Athlon eliminated 52 candidates in 165 hours -- one every 3:10 hours.

paulunderwood 2003-06-05 12:51

My 1 GHz Athlon eliminated 23 candidates in 114 hours -- one every 4:57 hours. We have now stopped sieving with the maximum divisor at 14.7 trillion. There are 32,283 candidates left over between 490,000 and 1,000,000.

I have tried to contact Edward Dillio who has seven ranges booked out for testing but he is very difficult to get hold of. I will give him two more weeks to get in touch and if there is no communication with him I will re-assign his ranges.

paulunderwood 2003-06-22 05:03

:rolleyes: Edward Dillio's ranges have been put back in the pool of those to be tested, freeing up nine 5000-'n' blocks. Perhaps there is a prime or two in these ;)

paulunderwood 2003-06-30 19:46

Ed tells me his computers got fried by a lightning strike -- ouch! We have covered his ranges and the results are coming in, but no primes yet. Ed has taken out a new range. We are now at 553,333 +.

I suppose that the primes are going to be large :(

paulunderwood 2003-07-16 05:42

I have cut down the size of each range to 2,500 and they will be cut progressively as we test for larger prime numbers.

25,831 candidates are left available for testing -- 592,000-1,000,000

We are 18% done on this project. :(

paulunderwood 2003-08-26 00:07

This project is moving along nicely as we are about to start testing the primality of numbers with more than 200,000 decimal digits. In terms of work, this is about 1/3 of the way to a million bits, the initial goal of this project. With the sieving completely done, it only remains to test the remaining 21,433 candidates ( which can so easily be done with the wonderful client programs we use ). I'd guess there are about 8-9 GHz years of number crunching left to do, and at the current rate we will complete in about 8 months -- with more help, even sooner -- hint, hint. The kind people who have contributed to date can be found at:


paulunderwood 2003-12-22 22:03

Merry Xmas and a primeful New Year!
220,000 computers! Hoping for extra computing power for the project in the new year, especially if we see an IBDWT LLR :whistle:

paulunderwood 2004-02-05 10:59

Wow. We are now testing numbers greater than [URL=]Mersenne 32[/URL], found by Slowinski and Gage using a Cray supercomputer back in 1992 :whistle:

paulunderwood 2004-04-16 19:36

I've been prompted by email into writing a word or two about the status of the project.

Since starting the search for prime numbers of the form 3*2^n-1 just over a year ago we have found five new ones. After the initial sieving with Paul Jobling's NewPGen we have used three programs to eliminate the remaining candidates: PFGW, PRP and LLR. From the start LLR was the most popular client, being 10% faster than the others on Intel Pentium 4 chips because of George Woltman's highly optimised SSE2 assembler coding; and LLR is a deterministic test -- showing instantly whether a number is prime or not, and not requiring a further proof test with PFGW.

A few weeks ago the project was going to take another year to complete. Then Jean released an updated LLR which, for the numbers we are testing, is 400% faster. This encouraged new participants, and what was a year's project is going to be done in several weeks -- we will finish the initial goal of testing "n" up to one million in a couple of weeks' time. Why not grab a range now before they run out? You just might find a really big prime number with over a quarter of a million decimal digits.

The next step, the sieving of "n" between one and two million, is under way, with the hope that we can test to two million by the end of 2006. With even more help we can bring that date forward. As the primes get rarer, we would still expect to find at least one mega-bit prime :love:

paulunderwood 2004-06-01 16:54

Well, we have nearly tested to n=1 million. Thomas Ritschel tells me that sieving for n to 2 million has reached 17 trillion. As soon as the last few ranges are taken I'll ask Thomas to send me the first 100k so we'll always have something to LLR. :grin:

Citrix 2004-06-09 17:23

Any plan to get a network server running, like 15 k has?

:cool: :cool: :cool:

paulunderwood 2004-06-09 17:43

I would but... I don't have a server to site it on. Does it require Windoze or linux? Also I'd like to see any bugs ironed out. Besides I can cope with our current way of reservation for the time being.

ET_ 2004-06-09 17:58

[QUOTE=paulunderwood]I would but... I don't have a server to site it on. Does it require Windoze or linux? Also I'd like to see any bugs ironed out. Besides I can cope with our current way of reservation for the time being.[/QUOTE]

I can ask for a permanent IP for my own internet connection.
I actually run Windows XP Pro, but can set up a linux box if needed. Obviously my PCs are on 24/7.
I also have some spare space on my web servers to handle a web site for it.

I know it's not like having your own server connected to Internet, but if no better solution appears, I'm here.


paulunderwood 2004-06-09 18:23

Thx ET_. I will consider it. In the mean time I will look at how 15k's LLR server works.

Where is the 15k server?

Lupine1647 2004-06-10 01:07

Hey Paul,
I can host it on my system, I already got the 12121 Search's LLRNET setup on here.
However, the system won't be online until after I get back from Australia.

SlashDude 2004-06-10 17:24

[QUOTE=paulunderwood]Thx ET_. I will consider it. In the mean time I will look at how 15k's LLR server works.

Where is the 15k server?[/QUOTE]

I have the 15k llrnet server running in my basement. :whistle: (I do web hosting for some other stuff too.) Everything is on battery backups so I'm good for a few hours without power, unless I lose my connection from my ISP...

It is currently running on a Windows 2k3 server, but I have issues when more than one person tries to connect at a time. I believe the Linux version of the server handles multiple users better, but I haven't had a chance to move the server. (Real life takes over again...)

Let me know if you have any questions. Kosmaj has done some work on updating the scripts for the llrnet server to improve its reporting...


paulunderwood 2004-06-10 18:45

[QUOTE]I have the 15k llrnet server running in my basement.[/QUOTE]

LOL. What I really meant was: how do I point my computer at it :redface: I wanted to use it to see how it behaved and just maybe if I really liked it I would move over to using the llrnet server software for 321.

You know, some people reserve ranges via me using email -- could this be easily handled with llrnet?

SlashDude 2004-06-10 21:44

[QUOTE=paulunderwood]LOL. What I really meant was: how do I point my computer at it :redface: I wanted to use it to see how it behaved and just maybe if I really liked it I would move over to using the llrnet server software for 321.

You know, some people reserve ranges via me using email -- could this be easily handled with llrnet?[/QUOTE]

Here is the post with the instructions for setting up the LLRnet client to point to the 15k LLRnet server: [url][/url]

If port 7000 is blocked, we also have a server on port 443 [url][/url]

The new llr client sometimes has problems with save files (as you probably know :) ), but just to make sure, here is our fix: [url][/url]

Ok, one last thing :smile:
Kosmaj has updated some of the scripting code for the smaller n's. This really helps our project, but with the large n's, it isn't needed: [url][/url]

Hope that helps :)

paulunderwood 2004-06-12 18:10

Thanks. I will get around to looking at that later...

I have asked Xyzzy to upload the files for our reservation system, but he hasn't done it yet. :sleep: Anyone wishing to test some mega-bit numbers that have been sieved to 21 trillion can email me. :help: :help: :help:

paulunderwood 2004-06-12 19:17

Xyzzy has done his magic and the 321 [URL=]"1-2M reservation"[/URL] is now working. Good luck to all who participate or not :mellow:

paulunderwood 2004-07-06 14:40

We have nearly reserved blocks to 1,050,000. Over the next day or two Xyzzy will upload the replacement files 1,050,000-1,250,000 -- this will keep us busy for a few months :smile:

A big thanks to Thomas Ritschel for spending 1.7 GHz years of Athlon sieving to get the minimum divisor to 37.5 trillion. :bow: It is now up to you to help us do the LLR tests on the remaining candidates as it has now become more efficient to do so.

paulunderwood 2004-07-07 07:20

:sleep: Until Xyzzy uploads the new ranges to the reservation system, you can email me to get a range to test.

paulunderwood 2004-07-08 17:05

Thanks again to Xyzzy the [URL=]"reservation system"[/URL] is up again. :coffee:

Files are available for exponents up to 1.25 million, i.e. numbers of up to 1.25 million bits. These have been sieved to 37.5 trillion. A range will take a couple of days on a P4 and a week on an Athlon. Anyone finding a prime will have found a top-50 prime. Remember, LLR is much, much faster for 321's k=3 than for the forms some other projects test. Join us!

paulunderwood 2004-08-28 23:07

:coffee: xyzzy has uploaded the new files to the "reservation system" for LLR'ing, which have been sieved up to 41 trillion :flex: These will take us to 1.5 million bits (450,000 digits) :rolleyes: With a little luck there will be another prime found in this range. Good luck to all who test. :wink:

paulunderwood 2004-09-27 11:03

Just a little message to keep you updated with the search...

We are now starting to test numbers with over 400,000 digits -- any prime found will be in the top 25 of known primes. :showoff:

Thomas and I have developed some utility scripts to produce the latest sieved files from all our NewPGen del files and to check that all the sieved values are correct. The whole process takes less than a minute! If only we could check LLR results as quickly :wink:

paulunderwood 2004-11-03 11:08

Find a 321 prime now and it will be the 17th largest known prime, bigger than [URL=]GIMPS' first prime[/URL] :w00t:

Join up and help us reach 1.5 million bits by the end of the year. :help:

paulunderwood 2004-11-14 19:45


There is a new LLR2 (LLR 2.2) for Athlons and other non-SSE2 computers, available from [URL=]here[/URL]. It fixes a few minor bugs, none of which seems to affect 321search. :whistle:

Join 321search now and help us reach exponent 1.5m by the end of 2004 and maybe find a prime number. We have plenty of LLR testing to do over the next few years. A few of us are also sieving 321 numbers as big as 1.5 million digits. The next batch of numbers to be released will have been sieved to 70 trillion :showoff:

paulunderwood 2004-11-30 19:34

There is now a new llrp4 (LLRP4 3.3) available from [URL=]here[/URL] (30/11/04). I have noticed at least a 10% speed gain. Thanks very much to Jean Penné for the new program :bow:

When installing, I recommend you stop LLR, exit it, delete any files beginning with 'z', install the new LLR, and restart with that.

Exciting times are ahead: we are now starting to test numbers bigger than the largest prime found by LLR. In fact, our next prime will beat that record. It would be the 17th largest known prime. So join us now and :help: us

em99010pepe 2004-12-01 19:13


[QUOTE=paulunderwood]Exciting times are ahead; We are now starting to test numbers bigger than the largest prime found by LLR. In fact our next prime will beat these records. It would be the 17th largest prime. So join us now and :help: us[/QUOTE]

I will join, but I need to know if there is a stats page.



ET_ 2004-12-01 20:01

[QUOTE=em99010pepe]I will join but I need to know If there is a stats page.[/QUOTE]



Actually there is not even a web page :razz:

But we can work it out...


em99010pepe 2004-12-01 20:12

[QUOTE=ET_]Actually there is not even a web page :razz:

But we can work it out...[/QUOTE]


A stats page would attract more people. I think I am going to reserve a range.


paulunderwood 2004-12-01 20:16

:no: No statistics here at 321 :redface:

There is a [URL=]LLR status page[/URL] and a [URL=]reservations page[/URL]. There is also our [URL=]"bio page"[/URL] at UTM.

Welcome, and may you have beginner's luck!

em99010pepe 2004-12-01 20:27

[QUOTE=paulunderwood]:no: No statistics here at 321 :redface:

There is a [URL=]LLR status page[/URL] and the [URL=]reservations page[/URL] Also there is our [URL=]"bio page"[/URL] at UTM.

Welcome and may you have beginners luck![/QUOTE]

Thanks. First I am going to crunch a 321 range. Next I will finish my sieve range at PSP then back here with all my processing power.
I want to change a little bit.



jocelynl 2004-12-05 17:55

Hi Paul,

You can add this n as completed:

[Sat Dec 04 22:56:12 2004]
3*2^8388607-1 is not prime. Res64: B93DC760D32B61C8 Time : 21727.275 sec.
Using Irrational Base DWT : Mersenne fftlen = 458752, Used fftlen = 524288
V1 = 4 ; Computing U0...done.

I'm working on n=8388607 sieved to 7T
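The log above is output from a Lucas-Lehmer-Riesel (LLR) run. For anyone curious what the program is actually doing, here is a minimal Python sketch of Riesel's classic criterion for N = 3*2^n - 1. This is the textbook algorithm only -- LLR itself performs the same squaring loop with FFT-based multi-million-bit arithmetic, and its seed-selection details may differ from the simple search used here:

```python
def jacobi(a, n):
    # Jacobi symbol (a/n) for odd n > 0, via quadratic reciprocity
    a %= n
    result = 1
    while a:
        while a % 2 == 0:
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0

def is_321_prime(n):
    # Riesel test for N = 3*2^n - 1, n >= 2: pick a seed P with
    # jacobi(P-2, N) = 1 and jacobi(P+2, N) = -1, start from
    # u = V_3(P) = P^3 - 3P (a Lucas V-sequence value), then
    # square-and-subtract-2 a total of n-2 times; N is prime
    # exactly when the final residue is 0.
    N = 3 * (1 << n) - 1
    P = next(p for p in range(3, 100)
             if jacobi(p - 2, N) == 1 and jacobi(p + 2, N) == -1)
    u = (P * P * P - 3 * P) % N
    for _ in range(n - 2):
        u = (u * u - 2) % N
    return u == 0
```

Checking n from 2 to 50 with this reproduces the small prime exponents 2, 3, 4, 6, 7, 11, 18, 34, 38, 43 from the project's known list.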


paulunderwood 2004-12-06 20:45

21727.275 sec. is a little over six hours. Not bad for a 2 1/2 million digit number.

Numbers in the current 321 window take about an hour on a P4 -- numbers here have about 450,000 digits and have been sieved to 70 trillion.

I don't think you have much chance of finding a prime by yourself. You would be better off joining us again and testing some 321 numbers.
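As a sanity check on the sizes mentioned above: k*2^n - 1 has floor(log10(k) + n*log10(2)) + 1 decimal digits, so a couple of lines confirm both the "2 1/2 million digit" claim and the six-hour figure:

```python
import math

n = 8388607
# decimal digits of 3*2^n - 1
digits = int(math.log10(3) + n * math.log10(2)) + 1
hours = 21727.275 / 3600  # the reported timing, converted to hours

print(digits)           # 2525223 -- the "2 1/2 million digit number"
print(round(hours, 1))  # 6.0 -- "a little over six hours"
```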

jocelynl 2004-12-07 03:55

[QUOTE]21727.275 sec. is a little over six hours. Not bad for a 2 1/2 million digit number.[/QUOTE]

Actually it took 2 1/2 days; that's just the last 6 hours, from a restart.

OK, I'll give it a shot again.


paulunderwood 2005-01-12 22:52

Come on in and join us -- we need more help :rant:

We are fast approaching 1/2-million-digit tests :shock:

From my calculations we are due to discover a new prime. :bounce:

Remember, with the new LLR 3.5, Athlon users can compete with P4s. :showoff:

paulunderwood 2005-02-03 08:50

:banana: we have reached 1/2 million decimal digits :nuke:

Thanks to all who have contributed over the past couple of years to get us to this level.

Again I say join us. :help:

Our three main pages are:




Now let's find the next 321 prime :devil:

paulunderwood 2005-05-27 16:59

New FFT size
We are about to go to a new FFT size. This means the 321 numbers are going to take more time to crunch.

32-bit Athlon users still have a few files left at the smaller size :help:

It won't be long before we reach 2-million-bit numbers :flex:

We are overdue in finding our next prime. We must be close. Join us now and have a good chance of finding a very big prime :showoff:

paulunderwood 2005-07-25 22:39

We have nearly cleared testing below n=2 million -- about 10 days of crunching on one of my machines should clear the rest.

The holiday season is here and a few of us are giving our search a rest -- but not me.

As always we need help running LLR on our files. Join in for the long run. We have sieved to 145 trillion, and a prime found now will be a top-20 one. Or just do one file on your computer if it has spare cycles for a week.

paulunderwood 2005-08-14 11:17

The n=1-2M range for 321 is now done :showoff:

We still need more help with LLR-testing in the range 2-3M. The candidates have been sieved for divisors up to 145,000,000,000,000 (145 trillion).

We're at about 630,000-decimal-digit tests and increasing...

paulunderwood 2005-08-29 23:20

Well, it's been a year since our last prime: 3*2^1232255-1 (30/08/2004) Thomas Ritschel

Thomas, Ryan Jackson and I have been busying our Athlons with sieving and should get up to 200 trillion by the end of the year :shock:

We should also be testing with LLR above 700,000 digits by the end of the year...

But where is our next prime? :rant:

We have plenty of DC work for you! Join us!

Our numbers are smaller than GIMPS -- a fast P4 can crunch a 321 candidate in about 2 hours -- A GIMPS number can take months :sick: Similarly with SoB...

Our numbers are now bigger than PSearch's ( n=2 million ), bigger than PSP's current reservations, bigger than Riesel Sieve's, and others...

In fact, ours is a really neat little project that can easily be accelerated by some more help with LLR testing.

Here are our main links:



:help: :help: :help:

Our next prime must be very close.

Join us by signing up to this forum and making "reservations", or email me directly to [email][/email]

OmbooHankvald 2005-09-01 19:35

Hi Paul

I've ripped your last message and placed it here: [url][/url] in the hope of acquiring some computing power from newbies.

I hope you're not :furious: at me for stealing your message.

paulunderwood 2005-09-04 07:30

:no: I am not angry with you in the slightest. We are pleased with extra help from anyone.

I reckon by the end of 2006 we should have done so much sieving that the time taken to find a divisor will outweigh the time for a 321 LLR test at 4,000,000 bits. When this point is reached then it is all-out LLR'ing -- I'll be putting my Athlons on LLR tests. :cool:
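The stopping rule described here -- sieve until removing one more candidate by finding a factor costs as much as removing it with an LLR test -- can be written down directly. Below is a rough sketch using a Mertens-style heuristic for the factor yield; all the numbers are illustrative assumptions, not project figures:

```python
import math

def expected_factors(candidates, depth_from, depth_to):
    # Mertens-style heuristic: deepening the sieve from p1 to p2 removes
    # roughly a fraction 1 - ln(p1)/ln(p2) of the surviving candidates.
    return candidates * (1 - math.log(depth_from) / math.log(depth_to))

def keep_sieving(sieve_secs, factors_found, llr_test_secs):
    # Keep sieving while eliminating a candidate by factoring is cheaper
    # than eliminating it with one full LLR test.
    return sieve_secs / factors_found < llr_test_secs

# e.g. doubling a 236-trillion sieve depth removes only ~2% of survivors:
print(round(expected_factors(1000, 236e12, 472e12), 1))  # 20.5 per 1000
```

Once each LLR test takes only hours but a new factor takes days of sieving to find, the ratio flips and it is "all-out LLR'ing", exactly as described.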

paulunderwood 2005-10-21 20:56

We are up to 2.2-million-bit tests. Thanks to everyone who helped get us here.

I have uploaded to the [URL=]reservation page[/URL] the latest blocks (2.2-2.35M) which have been sieved with NewPGen to 175 trillion. :w00t:

The new batch is split into two sections. The first is for 32-bit Athlons (2200000-2244000) and the rest is for "SSE2" enabled computers such as Pentium4s and Athlon64s. The 32-bit Athlons last a little longer at the old FFT length.

You will notice it takes longer to do a test at the new FFT length. Hence, on a fast P4 it will take about five and a half days to crunch a file of sixty candidates.

Good luck everyone :smile:

paulunderwood 2005-10-25 00:44

I just got some timings at the new FFT length for my 3.4GHz P4 (DDR400) and things are better than I thought. It's taking just under two hours for a 321 test at 2.2 million bits. So an average file of 58 candidates is going to be done in under 5 days. :w00t:

My 1 GHz Athlon takes 28500 seconds i.e. nearly 8 hours, at the same 2.2 million bits (but at the old FFT size) :yawn:
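Those timings square with the earlier per-file estimate. Taking "just under two hours" as roughly 7000 seconds (an assumed figure, not a measured one) and 58 candidates per file:

```python
p4_secs_per_test = 7000             # assumed: "just under two hours"
file_days = 58 * p4_secs_per_test / 86400
athlon_hours = 28500 / 3600         # the quoted 1 GHz Athlon timing

print(round(file_days, 1))     # 4.7 -- a file done "in under 5 days"
print(round(athlon_hours, 1))  # 7.9 -- "nearly 8 hours" per test
```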

paulunderwood 2005-12-16 16:26

I have just released new candidates: n from 2.35M to 2.5M :w00t: That'll keep us busy for a few months.

[COLOR=Red]Note to linkers to input files: Update your links please.[/COLOR]

The new candidates have been sieved to 208.81 trillion and there will be no increase in FFT length -- so they'll crunch at the usual rate.

There are still a handful of files left from the last release good for Athlons.

We are testing numbers with over 700,000 digits :showoff: and we still have a week in which to receive an early Christmas present of a prime :wink:

For 2006, it would be cool to finish sieving and reach 900,000 digit LLR tests ...

paulunderwood 2006-01-06 02:53

We got the Christmas present we wished for :shock:

I'm hoping we and any newcomers can get the exponent "n" up to 3,000,000 by the end of 2006. By my calculations I'd say we are due to find a prime before n=3M. Exactly where that prime is, though, is hard to guess...

Sieving is nearly complete -- divisor=236,000,000,000,000 (236T) is our cut-off point. It is going to be more efficient to do LLR calculations on Athlons than it is to find new factors with NewPGen :showoff:

Join us :help:

paulunderwood 2006-02-26 21:02

I have just released the next batch for "n" between 2,500,000 and 2,700,000. This should keep us all busy for a few months. :wink:

Some blocks are just for 32-bit Athlons/PIIIs, because the FFT-length changes come at different points for 32-bit Athlons than for SSE2-enabled computers such as the P4 and Athlon64. :geek:

Also I'm just a few weeks from finishing sieving after which I'll put my 32-bit Athlons and PIII on to 321 LLR work :razz:

We are still on target for 3 million bits by the end of this year.

Join us and stand a good chance of finding a top20 prime today!

paulunderwood 2006-03-02 00:34

Joe's just booked out the last range at an FFT size, which means P4 files are going to take a few more days to crunch at the new FFT size. But hey, if your file contains a prime, the prime will have more than three quarters of a million digits :w00t:

Athlon users have plenty of files too and a good chance of finding a prime.

paulunderwood 2006-03-12 07:24

I've updated the [URL=""]statistics[/URL] page to reflect the more time-consuming effort put into the old pre-DWT LLR, back when 321 was testing "n" less than 1 million. The stats no longer have the logarithm taken, so it should be easier to judge the relative effort 321 people have made :bow:

paulunderwood 2006-03-16 01:46

I've just finished sieving the last block for 321. :banana:

Thanks to Thomas, Scott and Ryan for their help sieving over the last three years. A special thanks to Paul Jobling for a great program, NewPGen, which I have had not the slightest hiccup using during that time. :bow:

In total I guess it's been about 50 GHz years of computing :showoff:

All remaining 321 candidates -- numbers of less than one and a half million digits -- have been sieved to 236 trillion. The sieving was all done on Athlons, which leaves the Pentium 4s etc. free to do what they are best at: running LLR.

I've now got my PIII and Athlons running LLR -- it's all out 321 LLR here. I will post some timings in a few days.

We need all the help we can get. If you have a computer and a few weeks to run it, then you'll have 60-odd chances of finding a prime with over three quarters of a million digits :w00t:

All you need is:
the program LLR
a 321 input file
and to leave a message that you're doing it.

Good luck to all 321 prime hunters!

paulunderwood 2006-03-22 01:14

I will post some timings for our current testing window in a week or two, both for the P4s and Athlons. Athlons last at the "old" FFT size for a little longer than P4s. In the meantime, as a rough guide: a 2.4GHz P4 takes 4 and 1/2 hours per number and a 1GHz Athlon takes a little over 10 hours.

Join 321 :spam:

paulunderwood 2006-04-29 16:46

Happy Birthday 321
It's been three years since 321search was formed. Since then we have found 9 primes, and a 10th should hopefully be found soon; the more help, the sooner that will be. Anyway, happy birthday to the project and its members, both past and present. :smile:

paulunderwood 2006-05-28 12:57

We are testing numbers greater than 800,000 digits now -- a prime found here will be well up in the top 20 primes in the world today.

We still have several files suitable for testing on 32-bit non-sse2 computers such as AthlonXP.

On reservation of the few remaining sse2-suitable blocks, I will release the next set of 150 files. This will keep us busy until the end of the year.

The target of n=3M by the end of this year has been revised to n=2.85M pushing back n=3M to mid-2007 and eventually reaching 1 million digits in 2008. Any help to bring forward these target dates will be greatly appreciated. :help:

paulunderwood 2006-06-09 12:37

New files have been uploaded for n between 2.7M and 2.85M, all sieved to 236 trillion.

A file of candidate 321 primes will take about a week on a fast P4, or twice as long on an Athlon. A file will on average give you 58 shots at finding a prime with over 800k decimal digits. Such a prime would be in the top 20 of the largest known primes.

To get started see our [URL=""]reservations page[/URL].

paulunderwood 2006-06-21 00:50

Soon, 321's [URL=""]first prime[/URL] will drop out of the top5000 :down:

On the bright side the next 321 prime to fall will be in the top5000 for another 5 years. :wink:

Currently we have done 54.45% of the work towards a 1-million-digit test :flex:

[URL=""]Help us find the next[/URL], which will have a very long shelf life :showoff:

paulunderwood 2006-08-30 21:09

As Thomas noted, we're past 2.8 million bits. This is about 840,000 digits.

The revised target is 3 million bits by the end of 2006. Hopefully we will get a prime as well. If no prime is found the next one will be even bigger.

You can [URL=""]help us[/URL].


paulunderwood 2006-09-24 11:26

We have a few returning participants and they are most welcome back. We're all up to around n=2.8M to 2.85M, which is about 62.95% of the computing needed to get to one-million-digit tests (3,321,929 bits) :smile: It is going to be interesting to see how close we can get to 3 million bits by the end of this year. Tests will take about the same amount of time until the next FFT change at n=3121158. After that we will have a range of "n" of only about 200,000 at that FFT length to get to 1 million digits, so we should easily reach 1 million digits next year. Hopefully we will find another prime soon as well.

To join in see [URL=""]our reservation page[/URL]. :help:
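The 3,321,929-bit figure for a million digits is the plain digits-to-bits conversion for 2^n alone; the leading factor 3 in 3*2^n-1 actually nudges the true million-digit exponent down by a few units, though that makes no practical difference. A quick check:

```python
import math

LOG2 = math.log10(2)

# exponent at which 2^n alone reaches a million decimal digits
n_million = math.ceil(1_000_000 / LOG2)
print(n_million)  # 3321929 -- the figure quoted above

def digits_321(n):
    # decimal digits of 3*2^n - 1
    return int(math.log10(3) + n * LOG2) + 1

print(digits_321(3121158))  # 939563 -- size at the FFT-change exponent above
```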

paulunderwood 2006-11-24 04:59

We are now 70.91% towards one million digits. Also we have 30 files below n=3 million left in the [URL=""]reservation system[/URL]. These should all be booked out by the end of this year. We are also due to find a prime. Good luck everyone! :grin:
