So the LL test does not win the "race to infinity" for finding ever larger primes (compared to other methods)?

[QUOTE=jinydu]So the LL test does not win the "race to infinity" for finding ever larger primes (compared to other methods)?[/QUOTE]
LL, Proth, and generalized Fermat tests are all O(n^2 log n). The LL test has the smallest constant. Let's say that the LL test is 1.5 times as fast as the generalized Fermat test, and 4 times faster than the Proth test. Your best algorithm to find a billion-digit prime is then: 1) search for a 1 to 1.5 billion-digit Mersenne prime. If none is found: 2) start searching for billion-digit generalized Fermats too. 3) If you get to 4-billion-digit Mersenne numbers and 2.6-billion-digit generalized Fermat numbers and still haven't found a prime, then start searching for billion-digit Proth primes too. 
So if there are no Mersenne primes with fewer than 1.5 billion digits, the best thing is to search for generalized Fermat primes, because there are plenty of them: you can just square the base and halve the exponent, and the search space gets much bigger.
If we estimate about 1000 CPU-years on an average P4 for a single test, and a chance of 1 in 300 million, I doubt it is worth the electricity just for the prize. But the fascination in this lies elsewhere. 
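For scale, the expected cost implied by those two figures works out as below; both inputs are the post's rough estimates, not measurements.

```python
# Back-of-envelope expected cost, using the post's two estimates (both
# assumptions): ~1000 CPU-years per test on an average P4, and a success
# chance of 1 in 300 million per candidate.
cpu_years_per_test = 1000
expected_tests = 300_000_000   # expectation of a 1-in-300M geometric trial

expected_cpu_years = cpu_years_per_test * expected_tests
print(f"{expected_cpu_years:.1e} CPU-years per expected prime")  # 3.0e+11
```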
[QUOTE=jinydu]So the LL test does not win the "race to infinity" for finding ever larger primes (compared to other methods)?[/QUOTE]
(another reply) No, the LL test always wins, compared to other methods, for [u]same-sized numbers[/u]. But there are only three or four Mersenne numbers (not to mention that most have composite exponents) in any given range denoted by a certain number of decimal digits, whereas there are millions and millions of exactly-1,000,000,000-digit non-Mersenne primes waiting to be discovered by methods other than LL. Then, as one extends LL testing to Mersenne numbers of more than 1,000,000,000 digits, the time required gets longer, and eventually exceeds the time required for a non-LL test of a 1,000,000,000-digit non-Mersenne number. After one factors in the differing predicted densities of primes, one finds a tradeoff point at which it's more cost-effective to switch to an asymptotically slower method on smaller numbers. 
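The "three or four Mersenne numbers" claim is easy to sanity-check: 2^p has floor(p*log10(2)) + 1 digits, so only a handful of exponents p give exactly a billion digits. A quick sketch (floating-point bounds, so back-of-envelope only), together with the prime number theorem's density estimate for random numbers of that size:

```python
import math

DIGITS = 1_000_000_000
LOG10_2 = math.log10(2)

# Prime number theorem: a random number near 10^DIGITS is prime with
# probability about 1 / ln(10^DIGITS).
odds = DIGITS * math.log(10)
print(f"about 1 in {odds:.1e}")   # roughly 1 in 2.3 billion

# Mersenne candidates with exactly DIGITS digits: p must satisfy
# DIGITS - 1 <= p * log10(2) < DIGITS.
p_lo = math.ceil((DIGITS - 1) / LOG10_2)
p_hi = math.floor(DIGITS / LOG10_2)
if p_hi * LOG10_2 >= DIGITS:       # guard the open upper bound
    p_hi -= 1
candidates = p_hi - p_lo + 1
print(f"{candidates} exponents give exactly {DIGITS} digits")
```

Only three or four exponents qualify (fewer still are prime), versus an astronomically large number of non-Mersenne candidates at the same size.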
So if I ask for a prime with at least n digits in as small a time as possible, which method should I use as n approaches infinity?

[QUOTE=ET_]3,321,928,171 no factor up to 60.033 bits, k=179,309,108
Luigi[/QUOTE] Will you be taking this any further? Did you use your own program? William 
[QUOTE=wblipp]Will you be taking this any further? Did you use your own program?
William[/QUOTE] How about we have one person Advanced P-1 it while another runs 100 Advanced ECM curves? If that doesn't work, someone could trial factor to about 118 bits. We'll figure it out after that. :unsure: An LL test would take 852 years, 209 days, 1 hour, 27 minutes for that exponent! 
[QUOTE=clowns789]How about we have one person Advanced P1 it while another Advanced ECM 100 curves? If that doesn't work, someone could factor about 118 bits.[/QUOTE]
Can we really do anything except trial factor? P-1, P+1, and ECM all perform calculations modulo the billion-digit number, which I thought would be a problem. Trial factoring requires calculations modulo the trial factor, a much smaller number. Will Prime95's AdvancedFactor work on numbers this large? It didn't work on my machine. I think Luigi was using his own program. I was using Excel with the Zmath add-in. William 
[QUOTE=wblipp]Will you be taking this any further? Did you use your own program?
William[/QUOTE] I used my program. If you think it could be useful, I will go further, just tell me the bitdepth you need (just remember my program is far slower than Prime95...) :smile: Luigi 
[QUOTE=ET_]I used my program. If you think it could be useful, I will go further, [/QUOTE]
[I]Useful?[/I] That's a pretty strong standard. Operation Billion Digits is a Children's Crusade, needing fifteen to twenty years of continued Moore's Law before the LL test becomes feasible. I thought it would be fun to continue a while further, though. And the only paths I see to advance Operation Billion Digits are to advance the trial factoring of this candidate or to trial factor some other candidates. [QUOTE=ET_]just remember my program is far slower than Prime95[/QUOTE] What methods are you using to filter the k values before testing 2kp+1? We only need to test prime values of 2kp+1, so various sieving methods are possible. In my Excel spreadsheet I used a wheel of size 13# to automatically skip k values where 2kp+1 had any divisors of 13 or less, but larger wheels and/or other methods could make sense in a C program. In the spreadsheet, I noticed that occasionally the value of mod(2^p, 2kp+1) was a power of 2, and this leads to factors of other Mersenne numbers. You might want to consider checking for these possibilities and saving the results for later processing. 
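A rough sketch of the trial-factoring loop being described, with a tiny exponent (p = 11) standing in for the real 3.3-billion range. Candidate factors of M_p have the form q = 2kp + 1 with q ≡ ±1 (mod 8); a small-prime screen (standing in for the 13# wheel) discards most k before the expensive powmod, which only needs arithmetic modulo q, never modulo M_p. The power-of-2 residue check mentioned above is included as a by-product list.

```python
# Trial factoring M_p = 2^p - 1: test candidate factors q = 2kp + 1.
SMALL_PRIMES = (3, 5, 7, 11, 13)   # the 13# wheel screens these divisors

def trial_factor(p, k_start, k_end):
    factors, byproducts = [], []
    for k in range(k_start, k_end):
        q = 2 * k * p + 1
        if q % 8 not in (1, 7):                   # 2 must be a QR mod q
            continue
        if any(q % s == 0 for s in SMALL_PRIMES): # cheap sieve step
            continue
        r = pow(2, p, q)                          # arithmetic mod q only
        if r == 1:
            factors.append(q)                     # q divides 2^p - 1
        elif r & (r - 1) == 0:                    # r = 2^e, so 2^(p-e) ≡ 1 (mod q):
            byproducts.append(q)                  # q divides a smaller Mersenne number
    return factors, byproducts

# Small worked example: M_11 = 2047 = 23 * 89, both of the form 2k*11 + 1.
print(trial_factor(11, 1, 10))   # → ([23, 89], [])
```

For billion-digit exponents the same loop works unchanged, since q stays around 60-70 bits even when M_p itself has a billion digits.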
Can you modify Prime95 source code? Perhaps it could be 23.9.
