2007-12-12, 17:58   #4
akruppa

This isn't an easy question. You'd want to compute the distribution of remaining factor sizes from an a priori distribution (e.g. probability 1/n of an n-bit factor), adjust it to take the factoring effort already spent into account, and then choose the ECM parameters so that the scalar product of that distribution with ECM's probability of finding an n-bit factor, divided by the time ECM with those parameters would take, is maximal. This is a bit messy.
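To make the idea concrete, here is a minimal Python sketch of that calculation, under loudly stated assumptions: the per-curve success probability is only the crude Dickman-rho approximation rho(u) ~ u^-u with u = ln(factor)/ln(B1), the cost of a curve is taken as simply proportional to B1, and the candidate B1 values are arbitrary. None of this is GMP-ECM's actual model; it just shows where the scalar product and the division by curve time go.

[CODE]import math

# Purely illustrative model -- every formula and number below is an assumption.

def ecm_success_prob(factor_digits, B1):
    """Very rough per-curve probability that ECM with stage-1 bound B1 finds a
    factor of the given decimal size, using the Dickman-rho approximation
    rho(u) ~ u^-u with u = ln(factor)/ln(B1).  Real ECM (stage 2, favourable
    group orders) does considerably better than this."""
    u = factor_digits * math.log(10) / math.log(B1)
    return 1.0 if u <= 1.0 else u ** -u

def expected_yield_per_time(prior, B1, curve_cost=lambda b: b):
    """Scalar product of the factor-size distribution with the per-curve
    success probability, divided by the (assumed linear in B1) curve cost."""
    p_find = sum(w * ecm_success_prob(d, B1) for d, w in prior.items())
    return p_find / curve_cost(B1)

# a-priori distribution: weight 1/d for a d-digit factor, normalised
sizes = range(20, 66, 5)
total = sum(1.0 / d for d in sizes)
prior = {d: (1.0 / d) / total for d in sizes}

candidates = [50_000, 250_000, 1_000_000, 3_000_000, 11_000_000]
for B1 in candidates:
    print(f"B1={B1:>9}: yield per unit work = {expected_yield_per_time(prior, B1):.3e}")
print("best B1 under this crude model:",
      max(candidates, key=lambda b: expected_yield_per_time(prior, b)))[/CODE]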

This is what Silverman and Wagstaff described in the "Practical Analysis" paper (which I have to admit I still haven't read as carefully as I should!). As I understood it, the scheme of doing the expected number of curves for a certain factor size, then moving on to a size 5 digits larger, is a good approximation to exactly this goal: maximising each curve's probability of finding a factor per unit time. If you want a more precise parameter choice, all I can offer is code to compute ECM's probability of success for given parameters/factor sizes, so you can model the Bayesian process.
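As a hedged illustration of that Bayesian step (not the code referred to above), the sketch below rescales each factor size's prior weight by the probability that such a factor would have survived the curves already run, using the same crude u^-u success model; the curve counts for the completed levels are made-up example values, not recommendations.

[CODE]import math

# Illustrative sketch only -- the success-probability model and the
# curve counts below are assumptions.

def ecm_success_prob(factor_digits, B1):
    """Same crude per-curve success probability as before (rho(u) ~ u^-u)."""
    u = factor_digits * math.log(10) / math.log(B1)
    return 1.0 if u <= 1.0 else u ** -u

def posterior_after_work(prior, work_done):
    """Bayesian update: multiply each factor size's prior weight by the
    probability that such a factor would have survived all curves already
    run, then renormalise.  work_done is a list of (B1, number_of_curves)."""
    post = {}
    for d, w in prior.items():
        survive = 1.0
        for B1, curves in work_done:
            survive *= (1.0 - ecm_success_prob(d, B1)) ** curves
        post[d] = w * survive
    total = sum(post.values())
    return {d: w / total for d, w in post.items()}

sizes = range(20, 66, 5)
total = sum(1.0 / d for d in sizes)
prior = {d: (1.0 / d) / total for d in sizes}

# made-up example of work already done: some curves at two B1 levels
work = [(50_000, 280), (250_000, 640)]
post = posterior_after_work(prior, work)
for d in sorted(post):
    print(f"{d}-digit factor: prior {prior[d]:.3f} -> posterior {post[d]:.3f}")[/CODE]

The resulting posterior is what you would feed back into the yield-per-time maximisation sketched earlier.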

Alex