mersenneforum.org > Other Stuff > Archived Projects > 3*2^n-1 Search
2003-04-26, 11:07   #23
TTn

If no one opposes this statement:

"If p divides k, then p does not divide k*2^n-1. This means that N = k*2^n-1 is prime with probability (k/phi(k)) * (1/ln N) instead of probability 1/ln N."


Then I would declare 15k*2^n-1 a viable search for the future, based on the evidence of the first 1001 k listed on the k*2^n-1 search. I have also probed k < 5000, n < 5000, with precise results: multiples of 15 produce the most primes! There are exceptions to the rule, but so far this is the evidence for small odd k.

Please counter this if you can! :idea:
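The claimed weight is easy to check numerically. A minimal Python sketch (the brute-force totient below is purely illustrative, not from any project code):

```python
from math import gcd

def phi(k):
    # Euler's totient by brute-force count -- fine for small k
    return sum(1 for i in range(1, k + 1) if gcd(i, k) == 1)

# Heuristic: primes p dividing k can never divide k*2^n - 1, so
# N = k*2^n - 1 is prime with probability ~ (k/phi(k)) * 1/ln(N)
# rather than 1/ln(N).  The weight k/phi(k) for a few odd k:
for k in (3, 5, 15, 105):
    print(k, k / phi(k))
```

Note that k = 15 carries weight 15/8 = 1.875, consistent with the observation above that multiples of 15 are unusually productive.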
 
2003-04-26, 12:03   #24
TTn

I'd like to add that 3 is a great candidate to test.
I had previously doubted testing k=3, but I'm hip.
 
2003-04-28, 17:12   #25
paulunderwood

Yes, Shane, three is cool 8)

The sieving is up to 'p' = 3.9 trillion. Unfortunately I had a power outage and lost the rate at which my 1 GHz Athlon was eliminating candidates. However, over about 160 hours it eliminated 184 candidates -- on average one every 52 minutes. I have set the sieving computers off on another 1.5 trillion.

I have had a word with Paul Jobling (who wrote NewPGen, the sieve I am using) and confirmed with him that I can increase sieving throughput on my slow Athlon and two relatively quick AthlonXPs by about twenty per cent. This is achieved by putting the highest range of 'p' on the slowest computer and the two lower ranges of 'p' on the quicker computers. As soon as the two XPs have finished I can stop the slow computer and merge the files.

All blocks up to 'n' = 420,000 have been assigned. Please email me (paulunderwood@mindless.com) to join the search. I feel a prime find is imminent... ;)
2003-04-30, 05:49   #26
TTn

8) I would still like to discuss this cut-off time. :?
At the point where the sieve is no longer pulling composites faster than the testing time of a given range: stop NewPGen, run LLR or PRP over that range, delete it from the NewPGen file, and resieve the rest.

s = number of candidate n in the sieve so far
t = sieving rate, in seconds per eliminated n
r = number of candidate n in a given range (min-max)

(s*t)/r = cut-off time for NewPGen

The appropriate ranges are, however, the real question.
These probably aren't the best cut-off ranges, but what is?

/ 3*2^400011-1 Time : 2364.714 sec.
r
\ 3*2^500006-1 Time : 3861.034 sec.  When (s*t)/r = 3861 sec., cut here.

3*2^600023-1 Time : 5595.876 sec.
3*2^700003-1 Time : 11707.696 sec.
3*2^800023-1 Time : 13288.771 sec.
3*2^900000-1 Time : 18232.217 sec.
3*2^999988-1 Time : 23783.626 sec.


(min)*1.618033988749...^r ?
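As I read the rule above: the whole sieve removes one candidate every t seconds, a fraction r/s of those removals land in the given range, so the range loses one every (s*t)/r seconds; cut the range loose once that exceeds its LLR time. A hedged Python sketch, with made-up illustrative numbers (not taken from the thread's actual sieve files):

```python
def range_elimination_interval(s, t, r):
    """Seconds between eliminations *within* a sub-range, assuming
    eliminations fall uniformly over the s candidates in the sieve."""
    return (s * t) / r

def cutoff_reached(s, t, r, llr_seconds):
    # The rule above: stop sieving the range once the sieve is slower
    # at removing one of its candidates than LLR is at testing one.
    return range_elimination_interval(s, t, r) > llr_seconds

# Illustrative: 36,000 candidates in the sieve, one elimination every
# 52*60 s, 9,000 candidates in a low range with LLR times ~3861 s.
print(cutoff_reached(36000, 52 * 60, 9000, 3861))  # True: cut this range
```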
 
2003-04-30, 21:20   #27
paulunderwood

I am inclined to agree more with Sander (smh) than with you on the method of sieving. It seems better to keep candidates in the sieve for much longer than first appears -- so much so that it may be better to organise a bigger sieving effort than the current one.

(First, let me assume that candidates are distributed uniformly at random over 'n'.)

Suppose we decrease the range to a quarter of the original; the time taken decreases only to a half (i.e. by the square root -- as explained in the NewPGen help on fixed-k searches).

As a second example, if we decrease the original range to a sixteenth, it will take only a quarter of the time.

In my last report on the sieving status for 'n' from 400,000 to 1,000,000, I said I had removed 184 candidates in 160 hours, about one every 52:17 minutes. Had I sieved only 420,000 to 1,000,000, the time taken would have decreased to sqrt(58/60)*160, about 157.5 hours, but I would have eliminated only (58/60)*184 candidates, i.e. about 178. In the 2.5 hours gained (160 - 157.5) we could have tested an extra 3.8 candidates with LLR (based on the timings given earlier). Summarising: by leaving the candidates in the sieve we eliminate 184 candidates, but by doing the lower 2/60 with LLR and leaving the rest in the sieve we would eliminate only 181.8 candidates -- 2.2 fewer.
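This arithmetic can be reproduced step by step. A small Python sketch (exact rounding differs slightly from the hand-worked 181.8 above):

```python
from math import sqrt

total_hours = 160.0
eliminated = 184
frac_kept = 58 / 60            # keep n = 420k-1M out of 400k-1M
llr_hours = 2364.714 / 3600    # LLR time near n = 400,000, in hours

kept = frac_kept * eliminated                 # ~178 sieve eliminations remain
sieve_hours = sqrt(frac_kept) * total_hours   # sqrt scaling for fixed-k sieves
spare_hours = total_hours - sieve_hours       # ~2.6 hours freed for LLR
extra_llr = spare_hours / llr_hours           # ~4 candidates LLR can clear
print(round(kept + extra_llr, 1))             # just under 184: sieving wins
```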

On the bright side, I see sieving taking me another 4-6 weeks on my three Athlons. If anyone with experience of NewPGen has an AthlonXP (or better) and would like to help with the sieving, please contact me (paulunderwood@mindless.com). I am sieving 'p' in blocks that take an AthlonXP one week to complete -- Saturday to Saturday.
2003-05-01, 09:32   #28
TTn

I assume random placement as well.
But now I strongly disagree, after many trials over the past few days.
There is a dual benefit: testing the first range when it is ripe, rather than rotten, frees up cycles for the main file you cut it from, which is now smaller and more edible as well.
Any given range of n (min-max) should not take more than twice as long as the minimum time of that range. The main large file does not lose its benefits by becoming shorter, as time allows, after it has done its job in a range -- so that does not differ from Sander's idea.
The large file is the way to start.



I can't see how you justify wasting much more than 4728 seconds to pull a composite out of this first range. Why allow the sieve to spend up to 23,783 seconds to pull one out? Are you suggesting sieving at 23,783 seconds per n for the entire file at once? :oops:

/ 3*2^400011-1 Time : 2364.714 sec. (min)
r
\ ~ 3*2^550000-1 Time : 4728 sec. (2 x min)

" 9456 sec (4 x min)
" 18912 sec (8 x min)

3*2^999988-1 Time : 23783.626 sec.
 
2003-05-01, 10:02   #29
paulunderwood

I am sorry for not explaining when I will stop sieving. I plot the timings I gave earlier against 'n', then estimate the average time by finding where half the area under the curve falls, and add a little for good measure! My cut-off time will therefore be somewhere in the region of 3:30-4:30 hours.

3*2^800023-1 Time : 13288.771 sec.
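One way to read "where half the area falls": integrate the timing curve with trapezoids and interpolate the time at the half-area point. A Python sketch using the timings quoted earlier (the interpolation scheme is my guess at the graphical method described above):

```python
# LLR timings quoted earlier in the thread: (n, seconds)
ns   = [400011, 500006, 600023, 700003, 800023, 900000, 999988]
secs = [2364.714, 3861.034, 5595.876, 11707.696, 13288.771, 18232.217, 23783.626]

# Trapezoidal area of each segment of the time-vs-n curve
areas = [(ns[i + 1] - ns[i]) * (secs[i] + secs[i + 1]) / 2 for i in range(6)]
half = sum(areas) / 2

acc = 0.0
for i, a in enumerate(areas):
    if acc + a >= half:
        frac = (half - acc) / a                        # position within segment
        t_half = secs[i] + frac * (secs[i + 1] - secs[i])
        break
    acc += a

print(round(t_half / 3600, 1))  # in hours -- lands inside the 3:30-4:30 band
```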
2003-05-01, 10:29   #30
TTn

Check the rate at which those n are being pulled between 400,000 and 550,000.
What if it were consistently pulling them no faster than one every 4728 seconds from here on?
 
2003-05-01, 12:03   #31
paulunderwood

Suppose we are eliminating a candidate from the range 400,000-550,000 every 4728 seconds.

Method 1. We leave all the numbers in the sieve which means we remove one candidate from the whole range every 4728/4 seconds = 1182 seconds.

Method 2. We remove the range 400,000-550,000 from the sieve and continue sieving the rest. The time taken to sieve 0.75 of a candidate is sqrt(3/4)*1182 seconds = 1023 seconds. Assuming the average time for LLR on the range 400,000-550,000 is 3800 seconds, with the extra 158 seconds saved in sieving we can test 158/3800 of a candidate i.e. 0.04 of a candidate. Overall in 1182 seconds we would have eliminated only 0.79 of a candidate.
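The two methods can be put side by side in one function. A sketch (the parameter names are mine, not from the thread):

```python
from math import sqrt

def candidates_per_interval(rate_sub, frac_sub, llr_time):
    """Candidates eliminated per whole-range interval, two ways.

    rate_sub -- seconds per elimination within the sub-range
    frac_sub -- fraction of the whole range the sub-range covers
    llr_time -- average LLR test time (seconds) on the sub-range
    """
    interval = rate_sub * frac_sub              # whole range: one per `interval` s
    method1 = 1.0                               # leave everything in the sieve
    remaining = 1.0 - frac_sub
    sieve_time = sqrt(remaining) * interval     # sqrt speed-up from smaller range
    method2 = remaining + (interval - sieve_time) / llr_time
    return method1, method2

m1, m2 = candidates_per_interval(rate_sub=4728, frac_sub=0.25, llr_time=3800)
print(m1, round(m2, 2))  # 1.0 0.79 -- leaving the range in the sieve wins
```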
2003-05-06, 07:11   #32
paulunderwood

Just a quick line to let you know how the sieving is progressing. All 'n' greater than 430,000 are being sieved. There are 37,318 'n' in the sieve. The greatest divisor tested is 5.4 trillion. In about 170 hours my 1 GHz Athlon eliminated 130 candidates -- about 78:28 minutes per elimination. The target for elimination on this computer is 42 candidates a week.
2003-05-12, 17:27   #33
paulunderwood

All candidate 'n' less than 445,000 have been assigned to our members for testing. There are 36,344 candidates left in the sieve, which has reached a maximum divisor of 6.9 trillion. My 1 GHz Athlon eliminated 69 candidates in about 150 hours, i.e. one every 2:10 hours. Remember the target is about one every 4:00 hours. My three Athlons are now sieving another 1.5 trillion range of divisors.


Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2023, Jelsoft Enterprises Ltd.


Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation.
A copy of the license is included in the FAQ.
