#1
"Adolf"
Nov 2013
South Africa
61 Posts
I've looked around to see if I can answer my own question, and now I've confused myself even more.

I've chosen a random exponent with no known factor and am trying to find at least one factor for it. I'm doing TF and P-1 on it, increasing the limits when no factor is found. However, I'm not sure whether the P-1 B1 bound(s) I'm using are just a waste of time.

So my question: is there a relation between the TF range and the P-1 B1 bound? For example, if TF has already been done to 2^74, is it a waste of time to do P-1 with B1 = 2,000,000, so that P-1 should instead be done with B1 = 6,000,000?

Adolf

Last fiddled with by houding on 2014-09-05 at 08:02
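To illustrate the TF side of the question, here is a minimal Python sketch on a toy exponent (p = 29) — this is just the idea, not Prime95's actual code. It relies on the fact that every factor q of 2^p - 1 has the form q = 2kp + 1, so only those candidates need testing up to the chosen bit level:

```python
# Minimal trial-factoring (TF) sketch for a Mersenne number M(p) = 2^p - 1.
# Not Prime95's implementation -- just the idea: every factor q of M(p)
# has the form q = 2*k*p + 1, so only those candidates are tested.

def trial_factor(p, max_bits):
    """Return the first factor of 2^p - 1 below 2^max_bits, or None."""
    k = 1
    while True:
        q = 2 * k * p + 1
        if q.bit_length() > max_bits:
            return None          # searched the whole bit range, no factor
        if pow(2, p, q) == 1:    # q divides 2^p - 1
            return q
        k += 1

print(trial_factor(29, 16))      # -> 233, a known factor of 2^29 - 1
```

Real TF also sieves the candidates and skips q unless q ≡ ±1 (mod 8); the sketch omits that for brevity.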
#2
Romulan Interpreter
Jun 2011
Thailand
2⁶×151 Posts
Welcome to crunching!
The two methods, P-1 and TF, look into different "spaces" for factors. Very small factors can be found by both TF and P-1, but generally, factors found by one method will not be found by the other.

TF will find a factor q if q is smaller than some limit. We specify the limit in "bits", i.e. TF to 74 bits will find a factor if it is smaller than 2^74. The biggest problem with TF is that the time goes up with the limit: if the limit doubles (i.e. the "bit level" increases by 1), then the number of candidates (numbers that could be factors) doubles, so the time to test them all doubles. (At least doubles; we could discuss here why the time more than doubles when the bit level rises by 1.) At the same time, the chance of finding a factor decreases with each bit level: only (roughly) one in 74 Mersenne numbers has a factor of 74 bits, and only (roughly) one in 80 has a factor of 80 bits. This is because prime numbers get "rarer" as their size increases. Beyond some limit it is no longer profitable to do TF, because you spend double the time for a lower chance of finding a factor.

P-1 (stage 1) will find a factor q = 2kp + 1 of some Mp if k is "power smooth" with respect to B1, i.e. if k can be written as a product of prime powers, each not exceeding B1. If, for example, the largest prime power dividing k were 53, then P-1 stage 1 with B1 = 52 would not find this factor, while B1 = 53 (and higher) would.

Now, as you know, most numbers are *not* smooth. If you write a small script to see how many numbers below 2^74 are B1-smooth with B1 = 10^6 (one million), you will not end up with many! (Think of all combinations of primes smaller than a million whose product is below 2^74 - they are "a lot", but still few compared with 2^74.) Only a very small part of those 2^74 numbers will be smooth.
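The "small script" suggested above might look like the following. Enumerating up to 2^74 with B1 = 10^6 is of course infeasible here, so this sketch uses toy values (N = 1000, B1 = 10); the point is only that smooth numbers are a thin slice of all numbers:

```python
# Count the B1-smooth numbers below a bound N (toy values: the real
# question above uses 2^74 and B1 = 10^6, far too big to enumerate).

def is_smooth(n, b1):
    """True if every prime factor of n is <= b1 (plain trial division)."""
    d = 2
    while d * d <= n:
        while n % d == 0:
            n //= d
        d += 1
    return n <= b1   # whatever is left is the largest prime factor (or 1)

N, B1 = 1000, 10
count = sum(1 for n in range(2, N + 1) if is_smooth(n, B1))
print(f"{count} of the {N - 1} numbers in [2, {N}] are {B1}-smooth")
```

Even at these tiny bounds the smooth numbers are a small minority, and the fraction shrinks rapidly as N grows relative to B1.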
Most of them are outside the "smooth set", and assuming some Mersenne factors lie under 2^74 (a lot of them do!), they have a better chance of being found by TF to 74 bits (well, bad example - there the chance is 1) than by P-1 stage 1 with B1 = 1M. The chance of P-1 finding a factor of a given size depends on the distribution of the smooth numbers among the numbers of that size. Going up with B1 increases the chance of finding a factor, but it is also more work to be done.

Have a look at mersenne.ca - somewhere there is a form to compute the chance of finding a factor by P-1, which also takes into account how much TF was done for the number (and vice versa). The formulas are also available, and they would be useful if you are interested in digging deeper into this subject.

Last fiddled with by LaurV on 2014-09-05 at 10:58
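P-1 stage 1 as described above can be sketched in a few lines of Python. This is a toy illustration (p = 29, B1 = 9, base 3; the function names and bounds are illustrative, not how Prime95 implements it): raise the base to a big exponent E built from all prime powers up to B1 (times p, since p divides q - 1 for every factor q of Mp), then take a gcd with Mp.

```python
# Minimal P-1 stage 1 sketch for a Mersenne number M(p) = 2^p - 1:
# a factor q = 2*k*p + 1 is found when k is B1-power-smooth.
# Toy values only (p = 29, B1 = 9).

from math import gcd, isqrt

def small_primes(limit):
    """Primes up to limit, by trial division (fine for toy limits)."""
    return [n for n in range(2, limit + 1)
            if all(n % d for d in range(2, isqrt(n) + 1))]

def pminus1_stage1(p, b1, base=3):
    m = (1 << p) - 1
    # Exponent E = p * product of all prime powers <= B1.  p is included
    # because every factor q of M(p) has p dividing q - 1.
    e = p
    for ell in small_primes(b1):
        pe = ell
        while pe * ell <= b1:
            pe *= ell
        e *= pe
    g = gcd(pow(base, e, m) - 1, m)
    return g if 1 < g < m else None

print(pminus1_stage1(29, 9))   # a nontrivial factor of 2^29 - 1, if found
```

For 2^29 - 1 this finds the factors 233 = 2·4·29 + 1 (k = 4 = 2^2) and 2089 = 2·36·29 + 1 (k = 36 = 2^2·3^2), whose k values are 9-power-smooth, but not 1103 = 2·19·29 + 1, whose k = 19 exceeds B1.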
#3
Account Deleted
"Tim Sorbera"
Aug 2006
San Antonio, TX USA
1000010101011₂ Posts
Quote:
Note that when doing ordinary pre-LL factoring attempts, you should (usually) only do P-1 once, so this isn't an issue. From undoc.txt (note that the default is what you want, not the setting shown there): Quote:
Last fiddled with by Mini-Geek on 2014-09-05 at 12:22 |
#4
Nov 2003
2²·5·373 Posts
Quote:
Repeatedly re-running TF/P-1 with increasing bounds is quite inefficient. Run ECM instead.