#1266
Bamboozled!
"πΊππ·π·π"
May 2003
Down not across
2·5,393 Posts
I have some inaccurate data.
When it was shut down, the ECMnet server had records of a complete t45 and roughly half of a t50. Bob is correct, though: one of the known unknowns is how much extra work had been done outside the client/server. It must be substantial, because Bob himself found a goodly number of factors, ranging from 34 digits in 2007 to 59 digits 6.5 years later. A fair guess is that at least a t50 has been completed.
#1267
|
Nov 2003
2²×5×373 Posts
Quote:
However, I must ask in response to this question: why does it matter exactly how much ECM has been done? Both jyb and I are plowing through them with SNFS. It doesn't matter to us how much has been done. Why does it matter to the OP?
#1268
|
Bamboozled!
"πΊππ·π·π"
May 2003
Down not across
25042₈ Posts
Quote:
The OP has to answer for himself, but I would guess that he's trying to get an estimate of an appropriate B1 for starting his own ECM work.
#1269
|
Nov 2003
1D24₁₆ Posts
It (further ECM) would be a waste of time.
#1270
|
"Carlos Pinho"
Oct 2011
Milton Keynes, UK
3·17·97 Posts
Your posts have cleared my mind. I was in doubt about where to allocate my resources: to ecmserver or to NFS@Home.
#1271
|
Aug 2005
Seattle, WA
1767₁₀ Posts
Quote:
#1272
|
Aug 2005
Seattle, WA
3·19·31 Posts
Quote:
You've said this before, notably as part of the discussion here. But you never did answer the questions I posed there, regarding why more ECM is pointless. So I'll boil it down and ask again: do you believe that the 2/9 rule of thumb is not an appropriate way of assessing when a number has received sufficient ECM to begin SNFS? If so, why, and what better metric can you suggest? As an aside, I'll note that ECM pretesting did recently find a 52-digit factor of a number with SNFS difficulty 247, thereby saving a lot of computation, as described here. And yes, I know that we're talking about probability and expected values over many composites/factors here, so one example should not guide our policy; but that one example did make an impression on me vis-à-vis the value of ECM pretesting.
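The 2/9 rule of thumb under debate can be stated in one line: run ECM to a t-level of roughly 2/9 of the SNFS difficulty, measured in decimal digits. A minimal sketch of that arithmetic (the function name and the rounding choice are my own illustration, not something from the thread):

```python
def ecm_pretest_digits(snfs_difficulty: float) -> int:
    """2/9 rule of thumb: ECM-pretest to a t-level of roughly
    2/9 of the SNFS difficulty, in decimal digits."""
    return round(snfs_difficulty * 2 / 9)

# The example from the post: for SNFS difficulty 247 the rule suggests
# pretesting to about t55, so a 52-digit factor falls inside that range.
print(ecm_pretest_digits(247))  # → 55
```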
#1273
|
Nov 2003
2²×5×373 Posts
Quote:
Quote:
This should be clear from Dickman's function. This "2/9" value should be a slowly decreasing function of N (the composite). I have never analyzed the exact nature of this function, so I cannot say how accurate it is for (say) 100, 150, 200, 250, ... digits. As the composites get larger, once one has done an "initial ECM pass" to (say) the 50-digit level, the probability that there is a factor within ECM reach gets SMALLER. There is no "general rule" that applies uniformly to composites of all sizes. Instead, use the Bayesian methods I gave in my paper. Quote:
This suggests that your understanding of statistics is inadequate. Furthermore, "saving a lot of computation" is an exaggeration. How much time was spent on ECM? How much time would SNFS have taken? Subtract. There is your actual savings. But the EXPECTED savings is much less, because such small factors will be RARE. Stop FIXATING on this 2/9 "rule".
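For context, Dickman's function ρ(u) is the asymptotic probability that a random integer n has no prime factor exceeding n^(1/u). A rough numerical sketch, solving the delay equation u·ρ'(u) = −ρ(u − 1) with simple Euler steps (my own illustration, not code from the thread, and not production-accurate):

```python
def dickman_rho(u: float, steps_per_unit: int = 1000) -> float:
    """Crude approximation of Dickman's rho(u) via forward Euler
    integration of the delay ODE  x * rho'(x) = -rho(x - 1),
    with rho(x) = 1 on [0, 1]."""
    if u <= 1:
        return 1.0
    m = steps_per_unit               # grid index of x = 1
    h = 1.0 / steps_per_unit         # step size
    n = int(round(u * steps_per_unit))
    rho = [1.0] * (n + 1)            # rho = 1 on [0, 1]
    for i in range(m, n):
        x = i * h
        # Euler step: rho(x + h) ≈ rho(x) - h * rho(x - 1) / x
        rho[i + 1] = rho[i] - h * rho[i - m] / x
    return rho[n]

# rho(2) = 1 - ln 2 ≈ 0.307, and the function falls off rapidly,
# which is the decreasing behaviour referred to in the post.
print(dickman_rho(2))
```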
#1274
|
Nov 2003
2²×5×373 Posts
Quote:
People get fixated on the ECM successes. They (perhaps) forget about all of the time lost when a factor was NOT found. SNFS, however, succeeds with certainty. If one spends time running SNFS, the time is never "lost". Suppose you spend time T with SNFS and get 3 factorizations. Suppose you spend the same time T with ECM and are able to test (say) 50 candidates to (say) t55. Unless you expect to find at least 3 factors with ECM, you have wasted that time. One needs to assess the probability of success at level t55 given the amount of effort already spent. If one failed at t50, it becomes less likely that one will succeed at t55, especially as the composites get larger. When one has already made a reasonable ECM effort (YMMV regarding 'reasonable'), it is better to succeed with certainty via SNFS than to waste further time with ECM. The exception to this guideline is of course the case where one lacks the resources to run SNFS. The alternative then becomes "run ECM or do nothing".
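The break-even argument above can be written out as a toy expected-value comparison (the function and the 4% per-candidate success probability are hypothetical numbers chosen only for illustration):

```python
def ecm_worthwhile(snfs_factorizations: int,
                   candidates: int,
                   p_success: float) -> bool:
    """Compare a fixed time budget T spent on SNFS, which completes
    `snfs_factorizations` numbers with certainty, against spending the
    same T on ECM over `candidates` numbers, each succeeding with
    probability `p_success`. ECM is the better bet only if its expected
    number of factorizations exceeds the certain SNFS count."""
    expected_ecm = candidates * p_success
    return expected_ecm > snfs_factorizations

# The post's scenario: 3 certain SNFS factorizations vs 50 candidates
# taken to t55. With, say, a 4% chance each, ECM expects only 2 factors.
print(ecm_worthwhile(3, 50, 0.04))  # → False
```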
#1275
|
Bamboozled!
"πΊππ·π·π"
May 2003
Down not across
10786₁₀ Posts
Quote:
For example, I have some systems which are quite incapable of running SNFS on the remaining HCN candidates because they do not have enough memory and/or mass storage. Although relatively slow, they have an (admittedly small) chance of finding p5x or p6x factors by ECM. Another system has a GPU which is eminently suitable for running many ECM curves in parallel but completely unsuitable for NFS sieving and the subsequent phases. Raw cycle counts are not the only thing of importance, despite CS people concentrating on them because counts are relatively easy to analyze mathematically.
#1276
|
Nov 2003
7460₁₀ Posts
Quote:
"The exception to this guideline is of course the case where one lacks the resources to run SNFS. The alternative then becomes "run ECM or do nothing". " |
Similar Threads

| Thread | Thread Starter | Forum | Replies | Last Post |
|---|---|---|---|---|
| New phi for homogeneous Cunningham numbers | wpolly | Factoring | 26 | 2016-07-29 04:34 |
| Mathematics of Cunningham Numbers (3rd ed., 2002, A.M.S.) | Xyzzy | Cunningham Tables | 42 | 2014-04-02 18:31 |
| Don't know how to work on Cunningham numbers. | jasong | GMP-ECM | 6 | 2006-06-30 08:51 |
| Doing Cunningham numbers but messed up. | jasong | Factoring | 1 | 2006-04-03 17:18 |
| Need help factoring Cunningham numbers | jasong | Factoring | 27 | 2006-03-21 02:47 |