Thread: Saving computation in ECM
2004-06-03, 15:03   #5
R.D. Silverman

Quote:
Originally Posted by geoff
I tried to work this out; I am not an expert, so there may be some point I have missed.

Say you have done the expected number of curves to bound B1=m (and saved the stage one residues), and the next higher digit size d has optimal bound B1=n. Let p, q be the probabilities of one curve finding a factor of size d with B1=m, B1=n respectively. Let r, s, t be the times to complete stage one to B1=m, stage one to B1=n, and stage two from B1=n respectively. Then it is worth continuing from the saved residue if (q-p)/(s-r+t) > q/(s+t), or equivalently if the ratio (q-p)(s+t) / q(s-r+t) is greater than 1.

As an example I tried curves on 2^971-1 with a P4 2.66GHz and Prime95 (B2=100*B1) using m=11e6, n=43e6. I got the probabilities p=1/79353, q=1/18461 by using Richard McIntosh's Maple program with factor size 50.5, and the resulting times were r=151, s=588, t=289 seconds. The ratio came to 0.927, so in this case it is better to start a fresh curve than to continue from the saved one.
Actually it is quite a bit more subtle than this. One also needs to consider
the probability that the group order is smooth up to the new limits,
conditioned on the fact that it was NOT smooth up to the old limits.
This involves some very subtle problems in analytic number theory.
I am not sure, offhand, how I would go about solving them. It is also a
Bayesian problem, because one must take into account the size distribution
of the desired factor as a prior.
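The conditioning can at least be written down. A rough sketch, assuming the crude Dickman estimate rho(u) ~ u^-u, treating the group order as a random integer the size of the sought factor, and ignoring the extra smoothness ECM group orders actually enjoy (divisibility by small primes) and stage 2 entirely — so the numbers are only illustrative:

```python
import math

def rho(u):
    """Crude Dickman-rho estimate, rho(u) ~ u^-u; order-of-magnitude only."""
    return u ** (-u) if u > 1 else 1.0

def p_smooth(digits, B1):
    """P(random integer near 10^digits is B1-smooth), Dickman approximation."""
    u = digits * math.log(10) / math.log(B1)
    return rho(u)

# ~50-digit factor; old and new stage 1 limits from the quoted example.
pm = p_smooth(50, 11e6)            # P(smooth to old limit)
pn = p_smooth(50, 43e6)            # P(smooth to new limit), pn > pm

# What the continued curve actually gets:
# P(smooth to new limit | NOT smooth to old limit) = (pn - pm) / (1 - pm).
cond = (pn - pm) / (1 - pm)
print(f"unconditional: {pn:.2e}, conditional on earlier failure: {cond:.2e}")
```

Since pm is tiny here, the conditional probability is essentially pn - pm — the failed curve can only be rescued by the part of the smoothness probability it has not already exhausted, which is the point being made above. The Bayesian layer (averaging over the unknown factor size with a prior) is not attempted here.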

My intuition [and I could be quite wrong] is that if a curve fails, it is
NOT worthwhile to save the output from Step 1 and then extend the B1
limit on the same curve. This intuition comes from the analysis of the role of
large primes in QS & NFS. Pomerance showed that if a large number N is
NOT B1-smooth, then one does not get too many additional relations by
allowing the introduction of one or two additional large primes. This can be
made exact via Dickman's function. Thus, if a curve fails to the limit B1,
then by extending B1 to a new value B1' you are in essence looking for a
'large prime factorization' of the group order. The 'large prime' variation
only affects the o(1) term in the run time.

This (very rough handwaving!) argument is what gives me my intuition.
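The handwaving admits a numerical form. A sketch of the one-large-prime probability, again assuming the crude rho(u) ~ u^-u estimate and a random group order; the integral over the large-prime range (B1, B2] is the standard semismooth approximation, evaluated by a simple midpoint sum:

```python
import math

def rho(u):
    # Crude Dickman-rho estimate; fine for an order-of-magnitude argument.
    return u ** (-u) if u > 1 else 1.0

def p_smooth(log_x, B1):
    """P(integer near e^log_x is B1-smooth), Dickman approximation."""
    return rho(log_x / math.log(B1))

def p_one_large_prime(log_x, B1, B2, steps=1000):
    """P(integer near e^log_x is B1-smooth except for ONE prime in (B1, B2]),
    approximated by integrating rho over the large-prime range:
        integral over t in (B1, B2] of rho((log x - log t)/log B1) dt/(t log t),
    with dt/(t log t) = d(log t)/log t."""
    la, lb = math.log(B1), math.log(B2)
    dl = (lb - la) / steps
    total = 0.0
    for i in range(steps):
        lt = la + (i + 0.5) * dl          # log of the large prime t
        total += rho((log_x - lt) / la) / lt * dl
    return total

log_x = 50 * math.log(10)                 # ~50-digit group order
B1 = 11e6
ps = p_smooth(log_x, B1)
p1 = p_one_large_prime(log_x, B1, 100 * B1)
print(f"smooth: {ps:.2e}, one large prime in (B1, 100*B1]: {p1:.2e}")
```

The gain is a modest multiplicative factor, not a change in the character of the run time — consistent with the claim that the large-prime variation only moves the o(1) term.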

Take it for what it is worth.

Bob