#1145
"Richard B. Woods"
Aug 2002
Wisconsin USA
17014₈ Posts
Quote:
log2(3) is just over 1.5 (2^1.5 = 2·sqrt(2) = 2.828). That gives you the exponent of the Mersenne number equal in magnitude to 3 raised to the power of the tenth primorial: approximately 10,000,000,000 -- beyond our current limits in prime95.
Quote:
Stage 1 of P-1 computes 3^(product of prime powers up to B1), which for B1=30 is 3^(2^4·3^3·5^2·7·11·13·17·19·23·29), which is 3^(tenth primorial)·3^360.

So even the default minimum B1 requires computing a number which is 3^360 times 3^(tenth primorial). 3^(tenth primorial) is itself larger than the current upper limit of Mersenne number that prime95 can handle.

The point is that even the smallest B1 value we use requires modular arithmetic, so there's no point in trying to save numbers calculated in stage 1 for use on other exponents. There is no duplication (from one exponent to another) after as few as 8-10 terms.

Last fiddled with by cheesehead on 2012-02-13 at 21:54
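The stage 1 computation described above can be sketched in a few lines. This is a minimal illustration, not prime95's actual code; the demo exponent p = 37 and the helper name `stage1_exponent` are my own choices for the example.

```python
import math

def stage1_exponent(b1):
    """Product of all maximal prime powers <= b1 (the P-1 stage 1 exponent)."""
    e = 1
    for p in range(2, b1 + 1):
        if all(p % d for d in range(2, int(p ** 0.5) + 1)):  # p is prime
            q = p
            while q * p <= b1:   # raise p to the largest power still <= b1
                q *= p
            e *= q
    return e

E = stage1_exponent(30)
# Reproduces the exponent quoted above for B1 = 30:
assert E == 2**4 * 3**3 * 5**2 * 7 * 11 * 13 * 17 * 19 * 23 * 29

# 3^E has on the order of 10^12 digits, so it can never be stored explicitly;
# stage 1 instead computes 3^E mod (2^p - 1).  Small demo with p = 37
# (factors of 2^p - 1 are of the form 2kp+1, so p is folded into the exponent):
p = 37
n = (1 << p) - 1
x = pow(3, E * p, n)         # built-in modular exponentiation stays tiny
g = math.gcd(x - 1, n)       # divisible by 223 = 2*3*37 + 1, since k = 3 is smooth
```

Note that `g` is guaranteed a nontrivial divisor here only because 2^37 - 1 happens to have a factor with very smooth k; in general the gcd is 1 and P-1 reports no factor.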
#1146
Romulan Interpreter
Jun 2011
Thailand
9611₁₀ Posts
Quote:
I told you so! Ha ha ha ha. Next time don't jump to bite the poor bcp guy... I bet every mathematician makes this mistake at least 10 times in his life!
Last fiddled with by LaurV on 2012-02-14 at 04:43 Reason: link added
#1147
Oct 2011
Maryland
2×5×29 Posts
What James said above got me thinking: is it possible to resume a P-1 run from your save data, pushing B1 or B2 out a little, without having to redo most of the work?
That would make sense of how Never Odd or Even seems to rack up his mammoth P-1 GHz-days: http://mersenne-aries.sili.net/expon...ails=383838383 Just curious.
#1148
"James Heinrich"
May 2004
ex-Northern Ontario
D4E₁₆ Posts
Quote:
Code:
Pminus1=N/A,1,2,595900037,-1,3000000,3000000,82
Pminus1=N/A,1,2,595900037,-1,3500000,3500000,82
Pminus1=N/A,1,2,595900037,-1,4000000,4000000,82
...
Pminus1=N/A,1,2,595900037,-1,6000000,6000000,82
Quote:
On a side note: doing P-1 this big with current hardware is silly.
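The quoted worktodo entries step B1 upward (with B2 = B1) in fixed increments for one exponent. A throwaway sketch of generating such a list; the field layout assumed here (assignment-ID, k, b, n, c, B1, B2, bits already trial-factored) is inferred from the quoted lines, and the helper name is my own:

```python
def pminus1_worktodo(exponent, b1_start, b1_stop, b1_step, tf_bits=82):
    """Yield prime95 worktodo Pminus1= lines with B2 = B1, stepping B1
    upward -- mirroring the entries quoted above.  Fields assumed to be:
    assignment-id, k, b, n, c, B1, B2, bits already trial-factored."""
    b1 = b1_start
    while b1 <= b1_stop:
        yield f"Pminus1=N/A,1,2,{exponent},-1,{b1},{b1},{tf_bits}"
        b1 += b1_step

lines = list(pminus1_worktodo(595900037, 3000000, 6000000, 500000))
# lines[0] matches the first quoted entry; seven entries cover 3.0M..6.0M
```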
#1149
"Richard B. Woods"
Aug 2002
Wisconsin USA
1111000001100₂ Posts
Quote:
3^(2^4·3^3·5^2·7·11·13·17·19·23·29), which is 3^(tenth primorial·360) = [3^(tenth primorial)]^360.

So even the default minimum B1 requires computing a number which is the 360th power of 3^(tenth primorial).

Quote:
My calculation wasn't correcting any mistake bcp19 had made, other than an apparent assumption that part of the stage 1 computation could be usefully saved for reuse on another exponent. I was supplying a computation that neither bcp19 nor I had (apparently) actually made yet. In fact, before I performed the computation for that posting, I had expected that the maximum B1 at which saving a partial computation could be useful would be higher than 30. So I surprised myself, too.

The mistake I made in my calculation didn't invalidate the point I intended to communicate: even the smallest B1 used in prime95 requires modular arithmetic, which makes it impractical to save a partial computation large enough to usefully speed up stage 1 for a different exponent.

- - - - - - - - - -

"Silly", along with "Minuscule", is one of our middle names here on mersenneforum.org.

Last fiddled with by cheesehead on 2012-02-14 at 17:25
#1150
Oct 2011
1010100111₂ Posts
#1151
Apr 2011
in vivo
113₈ Posts
#1152
Romulan Interpreter
Jun 2011
Thailand
258B₁₆ Posts
hrrrr... hating anything involving BSPs (board support packages for WinCE and Windows Mobile; system programmers know what I am talking about...)
edit: and yes, that is what I am doing, wasting my days and getting white hair...

Last fiddled with by LaurV on 2012-02-15 at 03:44
#1153
"Richard B. Woods"
Aug 2002
Wisconsin USA
2²·3·641 Posts
Just keep in mind all that nicely partitioned space you're producing, making the world a more-organized place. :-)
#1154
Oct 2011
7·97 Posts
I've been doing some reading, and have a few questions regarding P-1 and ECM. From their usage on the site, it appears to be a given that P-1 is more efficient than ECM, in time spent per factor found, for exponents that were not factored during TF and prior to LL.

So, while P-1 will find all factors 2kp+1 where the second-largest prime factor of k is below B1 and the largest is below B2, will ECM also find all such factors? (If I am asking this question badly: given that a factor F is 22 digits long and can be found with B1=50000 using 300 curves [the 25-digit ECM level], will F also be found using B1=250000 with 700 curves [the 30-digit level] and higher?)

At what point is it better to use ECM for factorization, and how do you determine what level of ECM to start at?
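The smoothness condition in the question can be checked directly for any known factor. A small sketch (trial division only, and ignoring prime-power subtleties in the bounds); the example factor is Cole's famous 1903 factor 193707721 of 2^67 - 1, and the helper names are my own:

```python
def prime_factors(n):
    """Prime factors of n with multiplicity, by trial division."""
    fs, d = [], 2
    while d * d <= n:
        while n % d == 0:
            fs.append(d)
            n //= d
        d += 1
    if n > 1:
        fs.append(n)
    return fs

def pminus1_would_find(q, p, b1, b2):
    """True if P-1 with bounds (b1, b2) finds the factor q = 2kp+1 of
    2^p - 1: every prime factor of k must be <= b1, except that the
    single largest may lie between b1 and b2 (picked up in stage 2)."""
    k = (q - 1) // (2 * p)
    fs = sorted(prime_factors(k))
    return fs[-1] <= b2 and all(f <= b1 for f in fs[:-1])

# Cole's factor of M67: q - 1 = 2 * 1445580 * 67, and k = 2^2 * 3^3 * 5 * 2677,
# so tiny bounds suffice -- but only if B2 reaches the largest factor of k.
q, p = 193707721, 67
assert (2**p - 1) % q == 0
assert pminus1_would_find(q, p, b1=30, b2=2700) is True
assert pminus1_would_find(q, p, b1=30, b2=2000) is False
```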
#1155
"Richard B. Woods"
Aug 2002
Wisconsin USA
7692₁₀ Posts
Quote:
By contrast, ECM's execution time has a lower proportionality to search-area size, so at sufficiently high bounds its probability of finding a factor within a given execution time beats P-1's probability by a larger and larger margin.

(By "P-1's probability" here, I'm referring to the probability that a factor actually exists in the search area. P-1's certainty of finding a factor doesn't help if no such factor exists within the search area. OTOH, ECM's probability of finding a factor is [probability that a factor exists in the search area] × [probability that one of ECM's random curves will find a factor that does exist in the search area].)

There are general guidelines for the optimum levels at which to switch from P-1 to ECM, but I don't have a link handy.

Last fiddled with by cheesehead on 2012-02-27 at 23:49
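The bracketed product above can be made concrete: curves are independent, so the chance that at least one of n curves succeeds follows the usual complement rule. The per-curve odds of roughly 1/300 at the 25-digit level are the conventional rule of thumb (a full curve set misses with probability near 1/e), not an exact figure:

```python
def ecm_overall(p_exists, p_curve, curves):
    """ECM's probability of finding a factor: [factor exists in the search
    area] * [at least one of `curves` independent curves finds it]."""
    return p_exists * (1.0 - (1.0 - p_curve) ** curves)

# A full t25 set: 300 curves at ~1/300 per-curve odds still misses with
# probability near 1/e even when a 25-digit factor does exist.
p = ecm_overall(1.0, 1.0 / 300, 300)   # ~0.63
```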