#1
"Michael Kwok"
Mar 2006
1177₁₀ Posts
I'm feeling a bit ambitious today, so I'm thinking of starting a new distributed computing project that searches for record twin primes. As of today, the largest one has an exponent of less than 180,000. So, why do I want a project to search for them?
1.) Doing things in a distributed way is a very effective use of computing power. Even if only 50 people join (less than one percent of the more popular projects out there), there'll be a LOT more computing power than what the 5 discoverers of the largest twin prime had.

2.) Nearly every prime-searching project is distributed. The Mersenne prime search, Cullen prime search, Woodall prime search, Sierpinski problem, Riesel problem, 3*2^n search, etc. are collaboratively done by large groups of people. They all have great success, so a twin prime search should yield similar results.

3.) Faster results. An exponent from the Woodall prime search or the Sierpinski problem takes days to complete, but a test in the n=200,000 range takes just a couple of minutes.

4.) A different credit system. I'm NOT criticizing any one project in particular, but I've noticed that in many prime-searching projects, only the lucky finder gets the fame/prize money/credit. For example, say you test 999*2^n+1 for n=150000-900000 but don't find any primes. Then another lucky person tests n=900000-901000 and finds a prime at n=900500. He/she gets the credit, but you don't. This doesn't happen often, but it's a possibility. In my project, the top two searchers will share the credit with the lucky finder.

5.) No unfamiliar software to download. To participate, you'll just need LLR or PRP, which many people already have.

6.) Unbeatable records??? If the twin prime conjecture is false (so there are only finitely many twin primes), you might find the largest twin prime ever, a record that cannot be beaten. The chances of this are small, but wouldn't that be something nice to have?

Last fiddled with by MooMoo2 on 2006-05-06 at 18:48
#2
"Michael Kwok"
Mar 2006
11×107 Posts
In theory, this is how it'll work:
There'll be a subforum for this project, with a "range reservations" thread. The n will be fixed at around 200,000, and you can reserve ranges of k's. If you don't find a twin prime in your range, report your results and check out another one. If anyone has better ideas, please share them :)

Last fiddled with by MooMoo2 on 2006-04-13 at 23:00
#3
I quite division it
"Chris"
Feb 2005
England
100000011101₂ Posts
If n were kept a few thousand above the level of the 5000th prime on the list, then even the non-twin primes found along the way could be submitted.
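Whether a k*2^n±1 prime qualifies for the Top-5000 list comes down to its decimal digit count. A minimal sketch of the conversion, with k=1000 and n=195000 chosen purely as examples, not as project parameters:

```python
import math

# Decimal digits of k * 2^n (the trailing +1/-1 doesn't change the count).
# k = 1000 and n = 195000 are illustrative values only.
def digits(k, n):
    return int(math.log10(k) + n * math.log10(2)) + 1

print(digits(1000, 195000))  # about 58,700 digits
```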
#4
Oct 2005
Italy
3×113 Posts
Yes, it seems like a good idea to me.

Rather than a sub-forum, in my opinion it would be better to have a web page with a table of the assigned ranges, completed ranges, and free ranges.

Are you sure that no better programs for twin primes exist? What is the method of search? Do we remove the numbers of the form k*2^n + 1 and k*2^n - 1 which have small factors, and then test ONLY the probable primes with the same k and n (which could be twins)? Is there a way to automate this kind of search (testing ONLY possible twin primes and NOT single primes)?

Last fiddled with by pacionet on 2006-04-14 at 18:48
#5
"Michael Kwok"
Mar 2006
1177₁₀ Posts
Quote:
The search method will be like this:

1.) I used the "twin search" feature in NewPGen to sieve out factors below, say, 10 trillion. For the surviving numbers, neither k*2^n+1 nor k*2^n-1 has a factor less than 10 trillion. The value of k changes, but the value of n does not.

2.) We'll put those numbers into PRP, which will search for primes of the form k*2^n+1 only.

3.) After a few days, we'll get a list of k's for which k*2^n+1 is prime. We'll then test those k's to see whether k*2^n-1 is also prime for any of them.

If you know of any more efficient methods, feel free to discuss them. Unfortunately, I have a limited knowledge of HTML, so unless someone helps out, this project may be coordinated in a subforum for a while. However, I do agree that this would be much better as a web page.
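The three steps above can be sketched in miniature. This is a toy illustration only: trial division stands in for NewPGen's sieve, a base-3 Fermat test stands in for PRP/LLR, and n and the k-range are small example values, not project parameters.

```python
# Toy illustration of the three-step twin search described above.

def small_primes(limit):
    """Sieve of Eratosthenes; stands in for NewPGen's factor sieve."""
    sieve = bytearray([1]) * limit
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(sieve[i * i :: i]))
    return [i for i in range(limit) if sieve[i]]

def is_prp(m, base=3):
    """Fermat probable-prime test (enough for a demonstration)."""
    return pow(base, m - 1, m) == 1

def twin_search(n, k_lo, k_hi, sieve_bound=100):
    primes = small_primes(sieve_bound)
    two_n = 1 << n
    twins = []
    for k in range(k_lo | 1, k_hi, 2):  # odd k only
        lo, hi = k * two_n - 1, k * two_n + 1
        # Step 1: discard k if either side has a small factor.
        if any(lo % p == 0 or hi % p == 0 for p in primes if p < lo):
            continue
        # Step 2: PRP-test the +1 side first.
        if not is_prp(hi):
            continue
        # Step 3: test the -1 side only for survivors of step 2.
        if is_prp(lo):
            twins.append(k)
    return twins

print(twin_search(5, 1, 120))  # [81, 99, 105]; e.g. 81*2^5±1 = (2591, 2593)
```

Note the economy of step 2/3 ordering: the cheap sieve removes most k's, and only the rare k with a +1-side probable prime ever gets a -1-side test.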
#6
Oct 2005
Italy
3×113 Posts
OK ... so NewPGen with that feature returns ONLY the numbers for which (same k and same n) both k*2^n + 1 and k*2^n - 1 have no small factors?
Last fiddled with by pacionet on 2006-04-14 at 20:44 |
#7
I quite division it
"Chris"
Feb 2005
England
81D₁₆ Posts
Maybe it would be better to find the primes of the -1 form first, as this is quicker. (Well, it is here.)

Isn't LLR quicker than PRP?
#8
Oct 2005
Italy
3×113 Posts
OK ... but do you have any idea which ranges have already been checked? It would be useful to avoid ranges already checked by someone else in the world ...

Anyway, if you start this search, I think I can join for a while ... what are the chances of reaching the top places on the largest-twin-primes list?

Last fiddled with by pacionet on 2006-04-14 at 23:12
#9
"Michael Kwok"
Mar 2006
10010011001₂ Posts
Quote:
http://primes.utm.edu/top20/page.php?id=1

If you're interested in joining, download NewPGen from http://primes.utm.edu/programs/NewPGen/ and use the twin search feature to sieve n=195000 from k=100,000 (I've checked the k values below that) to k=50,000,000 (or another large number near there). Click "verify results", and leave "include even values of k" unchecked. That may sound like a huge range, but after the sieving, only about 1 out of 1000 candidates will remain. Then put the remaining values into LLR or PRP, which will do the rest. If you do find a twin prime, it'll be the largest one known.

----------------------------------------------------------

Flatlander, LLR is quicker than PRP for small k values, but (on my computer) there is almost no difference for higher k values, where most of the search will be (k > 10 million).
#10
Mar 2004
575₈ Posts
Good idea.

I was also searching for prime twins and triplets using k*2^n±1. Interestingly, even for larger tuplets this form is more effective than the primorial form, even though the a priori chance is better with the latter.

I prefer testing k*2^n+1 first. With LLR, the -1 and +1 tests take about the same time, and with +1 there is also the chance of finding a factor of a Fermat number.

Sieving makes the difference. You need to sieve a large enough range at once with one fixed n (exponent). First reduce the initial number of candidates by sieving in large split arrays (number of candidates = number of bits in memory); later you can merge these presieved ranges and continue while the numbers are held in a list or hashtable (at that point only one candidate has to be looked up in the list for every divisor, so that is not the bottleneck). NewPGen provides these features, but the implementation is not yet optimal. I don't know if there are other sieving tools which are more efficient for twins.

Let's be more concrete. For a twin to appear in the top-5000 list, an exponent of 250,000 is necessary. Using a range of 100G, 55 million candidates remain after sieving to 1T, 47 million after sieving to 10T, and 35 million after sieving to 1Q (2^50). The speed of a fixed-exponent sieve should not depend on the size of the range, so sieving 33M per second on a fast computer could bring it to 1Q after one year. The last half year (or bit) removes roughly 1,800,000 candidates, or one candidate every 8-9 seconds; hence sieving to 1Q makes sense (it can be done distributed after the first few T, once the number of remaining candidates has dropped enough).

After sieving to 50 bits, the chance that a 250,000-bit number is prime is 1 in 2800, so the chance of a twin is 1 in 7,840,000. Before a twin is found, the top-5000 list might be flooded with 2800 new primes. The range of 100G contains about 4 to 5 twins (250,000-bit numbers). With 300,000-bit numbers we find about 3 twins (1 in 11 million), with 400,000-bit numbers about 1.7 twins (1 in 20 million), and with 500,000-bit numbers about 1.1 twins (1 in 31 million). If a PRP test takes 10 minutes, the 100G range is completed after 660 CPU years.

Good luck
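biwema's 1-in-2800 and 1-in-7,840,000 figures can be reproduced with a standard heuristic: among numbers with no prime factor below B, the density of primes near N is roughly e^γ·ln(B)/ln(N) (a consequence of Mertens' theorem). A quick sanity check, treating the two sides of a candidate pair as independent:

```python
import math

# Heuristic chance that a sieved candidate of ~n_bits bits is prime:
# among numbers with no prime factor below B = 2^sieve_bits, the prime
# density near N = 2^n_bits is about exp(gamma) * ln(B) / ln(N).
def prp_chance(n_bits, sieve_bits):
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    return math.exp(gamma) * (sieve_bits * math.log(2)) / (n_bits * math.log(2))

p = prp_chance(250000, 50)
print(round(1 / p))       # ~2807: about one prime per 2800 tests
print(round(1 / p ** 2))  # ~7.9 million: about one twin per 7.9M candidates
```

The independence assumption slightly overstates the twin cost (the two sides share the same k and n), but the orders of magnitude match the figures quoted above.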
#11
"Michael Kwok"
Mar 2006
1177₁₀ Posts
biwema,
I'd prefer that for the first attempt, n should be between 180,000 and 200,000. Although an n in the 250,000 range would get many primes onto the top-5000 list, the problem is that the size of the 5000th prime constantly increases, so our new primes would be wiped off quite quickly. However, I do believe in using a much higher n (giving primes of over 100,000 digits) after the first twin primes are discovered, so the primes will stay on the list longer. I strongly agree with you on the sieving depth, though.