#23
Feb 2017
Nowhere
7·23·29 Posts
As I mentioned in a recent thread, I don't know much about ECM. However, I did notice that the paper "Factorization of the Tenth Fermat Number" has a discussion of the performance of ECM. Perhaps that will be helpful to keener minds than my own...
#24
Aug 2002
Buenos Aires, Argentina
2×683 Posts
Quote:
The execution time for processing a curve is approximately proportional to B1. The value of B2 depends on B1 and on the specific implementation (GMP-ECM is completely different from Prime95, so B2 is a lot greater in the former, thus requiring fewer curves). Normally the value of B2 is selected so that the second step (the one which uses B2) takes about the same time as the first step (the one which uses B1).

If you read the 25-digit column, you can use 56083 curves with B1 = 2000, 1589 curves with B1 = 11K, and so on. In order to minimize the time to find the factor, you have to minimize the product of the number of curves and B1. You can see that the minimum occurs at B1 = 50K. You will find 25-digit prime factors with other values of B1 (greater or smaller), but it will be slower.

Last fiddled with by alpertron on 2017-09-29 at 16:29
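The "minimize curves × B1" rule can be sketched in a few lines. The 2000 and 11K curve counts are the ones quoted above; the 50K and 250K counts are inferred from the 1/204 and 1/47 per-curve odds quoted later in this thread, so treat them as approximate:

```python
# Relative cost of finding a 25-digit factor at each B1:
# (curves needed) * (per-curve cost, roughly proportional to B1).
# 2000/11K entries quoted above; 50K/250K inferred from the 1/204 and
# 1/47 per-curve odds quoted later in this thread (approximate).
curves_25_digits = {2_000: 56083, 11_000: 1589, 50_000: 204, 250_000: 47}

def relative_work(b1):
    return curves_25_digits[b1] * b1

best_b1 = min(curves_25_digits, key=relative_work)
print(best_b1)  # 50000 -- the minimum, matching the post
```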
#25
"Curtis"
Feb 2005
Riverside, CA
22·1,217 Posts |
Quote:
Quote:
A similar effect happens for B1=150M vs B1=110M, etc.
#26
Bamboozled!
"πΊππ·π·π"
May 2003
Down not across
10780₁₀ Posts
At the risk of being thought condescending, I believe those are consequences of the already-acknowledged statement that a test to the Px level has a 1/e probability of missing a factor of x digits. The finding of a smaller factor is (or should be) obvious: it was missed earlier with probability 1/e. The probability of finding a larger factor is less obvious. I'll leave finding a (relatively) simple argument as an exercise for the reader.
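The 1/e figure follows from how the curve counts are chosen: if one curve finds the factor with probability p, the tables prescribe L ≈ 1/p curves, and the chance that all L curves miss is (1 − 1/L)^L, which tends to e⁻¹. A small check, using curve counts quoted in this thread:

```python
import math

# If each curve hits with probability p = 1/L, then after the prescribed
# L curves the miss probability is (1 - 1/L)**L -> 1/e ~ 0.368.
for L in (47, 204, 1589, 56083):  # curve counts quoted in this thread
    miss = (1 - 1 / L) ** L
    print(f"L={L}: miss probability {miss:.4f} (1/e = {1/math.e:.4f})")
```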
#27
Aug 2002
Buenos Aires, Argentina
2×683 Posts
Quote:
Quote:
#28
"Victor de Hollander"
Aug 2011
the Netherlands
23×3×72 Posts |
I'm going to play devil's advocate for a second, as a layman non-math guy:

Would running P-1 (extending the current B1/B2 bounds) on (almost) 'virgin' ECM targets be a viable option for the purpose of the OP? The OP's target is to find many factors with minimal effort (in GHz-days) on <10M exponents, right?

Let's take for example 9105077: https://www.mersenne.org/report_expo...ll=1&ecmhist=1
TF to 2^68
P-1 (B1=110,000, B2=2,007,500)
3 ECM curves at B1=50,000 (so almost no ECM done)

Probably incorrect/nonsense following: { I thought P-1 was much faster than ECM (10-100 times in the case of Mersenne numbers, because of the 'free P' in the calculation?). So could that work out? Do P-1 with 100x the B1 bounds of ECM before starting ECM?? This of course doesn't take into account the 3 ECM curves and the P-1 already done:

M9105077, factored to 68 bits
P-1 with B1=5,000,000 and B2=50,000,000
Probability = 7.755637%
Should take about 2.986382 GHz-days (using FFT size 480K)

This is much better in terms of probability than the P-1 that was already done:

M9105077, factored to 68 bits, with B1=110,000 and B2=2,007,500, using K*B^N+C = 1*2^9105077-1
Probability = 2.587171%
Should take about 0.085515 GHz-days (using FFT size 480K)

This compares to only about 7 ECM curves with B1=50,000 in terms of effort in GHz-days:

ECM M9105077 (curves=7, B1=50000, B2=5000000) = 2.94200439 GHz-days }

But as I said, that is probably nonsense (to stay in the RDS theme).
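Using the probability and GHz-day figures quoted above, the two P-1 runs can be compared per unit of effort. The subtraction below treats the bigger run as wholly superseding the smaller one, which is a simplification:

```python
# (probability of finding a factor, cost in GHz-days), as quoted above
small_p1 = (0.02587171, 0.085515)  # B1=110K, B2=2,007,500
big_p1   = (0.07755637, 2.986382)  # B1=5M,   B2=50M

for name, (p, cost) in [("small P-1", small_p1), ("big P-1", big_p1)]:
    print(f"{name}: {p / cost:.3f} probability per GHz-day")

# Incremental value of the larger bounds, given the small run is already
# done (simplification: assumes the big run fully contains the small one):
extra_p = big_p1[0] - small_p1[0]
extra_cost = big_p1[1] - small_p1[1]
print(f"extra probability per extra GHz-day: {extra_p / extra_cost:.4f}")
```

Note the small run is far more efficient per GHz-day, so the question is really whether the *incremental* probability of the big bounds beats spending the same GHz-days on ECM curves.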
#29
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
13·19² Posts
Quote:
#30
Romulan Interpreter
Jun 2011
Thailand
10010110111001₂ Posts
#31
Jun 2003
5082₁₀ Posts
Quote:
#32
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
4693₁₀ Posts
#33
Mar 2006
479 Posts
Quote:
Quote:
From the table rcv posted, I understand that the chance of finding a 25-digit factor with B1=50k is 1/204, and the chance of finding a 25-digit factor with B1=250k is 1/47.

The following may be a bit of a stretch, but it is my current understanding of RDS's paper. I see the equation \(1 - [1 - P(B1,B2)]^L\), where P(B1,B2) is the probability of success and L is the number of curves. If I'm reading this correctly, this gives us our chance of success after running L curves at a particular B1/B2 value.

So, if we run 1 curve at B1=50k, we get \(1 - [1 - (1/204)]^1 \approx 0.0049\), i.e. a 0.49% chance of success, or a 99.51% chance of failure. If we run 1 curve at B1=250k, we get \(1 - [1 - (1/47)]^1 \approx 0.0213\), i.e. a 2.13% chance of success, or a 97.87% chance of failure.

How do we combine these numbers? Multiplying the successes together doesn't seem right; that seems to give us less chance of success. Multiplying the failures together seems to go in the right direction, since 99.51% × 97.87% gives us 97.39%, which means our chance of success is 2.61%? Is that right?

Assuming the above is correct, then running the above like this:

20 curves at B1=50k: \(1 - [1 - (1/204)]^{20} \approx 0.0936\), i.e. a 9.36% chance of success, or a 90.64% chance of failure.
20 curves at B1=250k: \(1 - [1 - (1/47)]^{20} \approx 0.3496\), i.e. a 34.96% chance of success, or a 65.04% chance of failure.

And to combine those we would multiply 90.64% × 65.04% to get a 58.95% chance of failure, which means our chance of success is 41.05%. Does this sound right?
Similar Threads

| Thread | Thread Starter | Forum | Replies | Last Post |
| Odds of Finding a Factor | Gordon | Factoring | 18 | 2015-09-20 20:33 |
| how much ECM without finding an existing factor | dbaugh | PrimeNet | 4 | 2013-01-11 16:31 |
| p-1 testing ram vs finding a factor | crash893 | Software | 11 | 2006-07-03 21:48 |
| Probability of finding a factor | JuanTutors | Software | 20 | 2004-09-26 09:47 |
| possibility of finding a factor | there_is_no_spoon | Math | 10 | 2004-03-11 20:05 |