mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Factoring (https://www.mersenneforum.org/forumdisplay.php?f=19)
-   -   Now what (IV) (https://www.mersenneforum.org/showthread.php?t=11661)

bdodson 2009-04-12 15:39

[QUOTE=fivemack;167628]Another t55 would definitely be a worthwhile contribution, thanks very much for the offer.[/QUOTE]

Last of the scragglers are in; the second pass is 10,446 curves at B1=260M
(default B2). That's past 6*t50, with t55 somewhere between 5.0 and 5.7
t50s. That makes a full 2*t55 together with the first pass of 7*t50, which
(if I recall) gives an 80% chance of finding a p55, in Peter's sense of
"removing" these (uhm, so still a 1/5 risk that we're supposed to take as acceptable).

Don't think that we'll see a p52/p53. No promises about p59/p60. -bd
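As a rough model of these t-level figures (my own sketch, not from the thread): treating each ECM curve as an independent trial, the chance that a factor at the target size survives k full t-levels of work is about e^(-k), so 2*t55 leaves roughly a 13-14% miss chance, in the same ballpark as the "1/5 risk" above. The curve count below is an assumed illustration only; the real figure depends on B1, B2, and the ECM implementation.

```python
import math

def miss_probability(curves_run, curves_for_t_level):
    """Chance a factor at the target digit level was missed,
    modeling each curve as an independent trial: exp(-k) where
    k = curves_run / curves_for_t_level."""
    k = curves_run / curves_for_t_level
    return math.exp(-k)

# ASSUMED curve count for one t55 at this B1, for illustration only.
T55_CURVES = 18000

# After 2*t55 worth of work:
p_miss = miss_probability(2 * T55_CURVES, T55_CURVES)
print(f"chance a p55 slipped through: {p_miss:.2%}")  # ~13.5%
```

The same model gives the familiar 1 - 1/e ≈ 63% find probability after exactly one full t-level of curves.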

R.D. Silverman 2009-04-14 14:20

[QUOTE=bdodson;169033]Last of the scragglers are in; the second pass is 10,446 curves at B1=260M
(default B2). That's past 6*t50, with t55 somewhere between 5.0 and 5.7
t50s. That makes a full 2*t55 together with the first pass of 7*t50, which
(if I recall) gives an 80% chance of finding a p55, in Peter's sense of
"removing" these (uhm, so still a 1/5 risk that we're supposed to take as acceptable).

Don't think that we'll see a p52/p53. No promises about p59/p60. -bd[/QUOTE]

Hi Bruce,

My general opinion is that unless you want to do some specific
ECM pre-tests on an NFS candidate, that Cunningham numbers
with difficulty under 250 are probably not worth any further
ECM effort.

You may want to either help out on the Fibonacci/Lucas numbers
of low index (say n < 1500)

OR

I have a very small number of Homogeneous Cunningham numbers for you
to test, if you have the time. See

[url]http://www.chiark.greenend.org.uk/ucgi/~twomack/homcun.pl[/url]

The following site contains the partial (known) factorizations:

[url]http://www.leyland.vispa.com/numth/factorization/anbn/main.htm[/url]

The numbers we need NFS pre-tested are:

3,2,457-
3,2,499-
3,2,482+
3,2,494+
3,2,496+
3,2,499+

bdodson 2009-04-14 20:08

[QUOTE=R.D. Silverman;169202]Hi Bruce,

My general opinion is that unless you want to do some specific
ECM pre-tests on an NFS candidate, that Cunningham numbers
with difficulty under 250 are probably not worth any further
ECM effort.
[/QUOTE]

If you scroll up a bit you'll see [code]
This one is C178 with difficulty 264. [/code]

Not difficulty under 250. You might want to check the recent
Batalov + Dodson snfs factorization of 2,1618L, with a p51.
The numbers in c190-c233 of difficulty below 250 have gotten
4*t50, while the ones above 250 (not as likely for immediate
sieving?) had 3*t50. For 1618L, it was on the wrong side of
c233, in c234-c250, which has only had 2*t50. If you're routinely
sieving ones with curve counts so far below t55, you're certainly
going to hit some more p51-p54's.

I'm currently working on near-term sieving candidates with
difficulty 263 and 268. The ones above c250 have hardly had
any ecm at all. I don't regard these curves as ecm "factoring",
but rather as a precomputation for sieving, ecm "pre-testing",
and have a full schedule of such Cunningham candidates. Not
much in the way of factors, but I might hope to occasionally hit an
early p59/p60. -Bruce

FactorEyes 2009-04-14 20:15

[QUOTE=R.D. Silverman;169202]
The numbers we need NFS pre-tested are:

3,2,457-
3,2,499-
3,2,482+
3,2,494+
3,2,496+
3,2,499+[/QUOTE]

[STRIKE]I believe that 3,2,494+ has been factored completely...[/STRIKE]

I put 7550 curves at 43e6 into the remaining 5 on this list (Batalov is keeping 3,2,494+ warm for us). That's t50. This is not enough, of course. I assume that Paul Leyland's ECM server has also shown these plenty of love at 3e6 and 43e6.

I might plow through some more ECM on these 3,2 candidates a few weeks from now. If anyone has suggestions for how many curves would be optimal, I would be glad to hear them.
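One way to answer the "how many curves" question is to credit each batch of curves toward a digit level by the ratio of curves run to curves needed, then add up the fractions. A sketch of that bookkeeping (the curve counts in the table are assumed placeholders, not the official GMP-ECM figures; consult the GMP-ECM README for real values):

```python
# Curves needed for one "t-level" at a given B1: (B1, digits) -> curves.
# ASSUMED values for illustration only.
CURVES_NEEDED = {
    (43e6, 50): 7600,
    (110e6, 55): 18000,
}

def t_level_fraction(batches, digits):
    """batches: list of (B1, curves_run) pairs. Returns how many
    't<digits>' units of work the batches add up to; batches at a B1
    not in the table for this digit level contribute nothing."""
    total = 0.0
    for b1, curves in batches:
        needed = CURVES_NEEDED.get((b1, digits))
        if needed:
            total += curves / needed
    return total

# The 7550 curves at B1=43e6 from the post, credited toward t50:
work = t_level_fraction([(43e6, 7550)], 50)
print(f"{work:.2f} * t50 completed")  # ~0.99 * t50
```

The fractional credit across mixed B1 values is only approximate (a curve at a high B1 is worth more than its t50 fraction suggests for smaller factors), but it is the usual shorthand used in these threads.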

Batalov 2009-04-14 20:23

[quote=FactorEyes;169248]I believe that 3,2,494+ has been factored completely, but Paul's site is down, so I can only go by the fivemack reservations page at chiark.greenend.org.uk.

I put 7550 curves at 43e6 into the remaining 5 on this list, so that's t50. This is not enough, of course.[/quote]
3,2,494+? Not yet. It's stagnating on my disk. It is next in my queue, after 5,362+ :smile:
I am pretty much done with 11+2,199 though. These are my two liabilities.

R.D. Silverman 2009-04-15 14:37

[QUOTE=bdodson;169247]If you scroll up a bit you'll see [code]
This one is C178 with difficulty 264. [/code]

Not difficulty under 250. You might want to check the recent
Batalov + Dodson snfs factorization of 2,1618L, with a p51.
The numbers in c190-c233 of difficulty below 250 have gotten
4*t50; while the ones above 250 (not as likely for immediate
sieving?) had 3*t50. For 1618L, it was on the wrong side of
c233, in c234-c250, which has only had 2t50. If you're routinely
sieving ones with curve counts so far below t55, you're certainly
going to hit some more p51-p54's.

[/QUOTE]

I don't regard finding a factor with NFS in the 51 to 55 digit range
to be a problem on these smaller numbers. The *expected*
time to factor the smaller numbers is much less with NFS than with
ECM.

fivemack 2009-04-15 15:30

[QUOTE=R.D. Silverman;169350]I don't regard finding a factor with NFS in the 51 to 55 digit range
to be a problem on these smaller numbers. The *expected*
time to factor the smaller numbers is much less with NFS than with
ECM.[/QUOTE]

The expected time to factor any large number with ECM is meaninglessly enormous, because t(find 70-digit factor by ECM) is so large and p(70-digit factor) is quite big, so it's a matter of picking your early-abort strategy. I follow an entirely unjustified rule of 'ECM for about 25% of the time the NFS job will take': each of the twenty thousand curves would take about an hour, and I'm fairly sure the sieving will take around sixty kilohours.

Bother. I have now convinced myself that 2^877-1 is a GNFS number: 10^263-1 took ~170 megaseconds, 109!+1 looks as if it'll take ~120 megaseconds, and 2^877-1 is only a little bigger in the relevant sense than either of those.

I'd better figure out some polynomial-search parameters, or some indisputably SNFS number of less than 255 digits and difficulty around 265. Help?
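The budgeting rule above can be written out directly, using the figures quoted in the post (this is only a sketch of the self-described unjustified heuristic, not a recommendation):

```python
# fivemack's rule of thumb: spend ~25% of the projected NFS time on
# ECM before committing to the sieve. Figures are from the post.

NFS_HOURS = 60_000        # projected sieving time ("sixty kilohours")
HOURS_PER_CURVE = 1.0     # observed time per ECM curve at this B1
ECM_FRACTION = 0.25       # the 25% rule of thumb

budget_hours = NFS_HOURS * ECM_FRACTION
n_curves = int(budget_hours / HOURS_PER_CURVE)
print(f"ECM budget: {budget_hours:.0f} h -> about {n_curves} curves")
# ECM budget: 15000 h -> about 15000 curves
```

That lands in the same ballpark as the roughly twenty thousand curves mentioned in the post.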

10metreh 2009-04-15 15:33

[quote=fivemack;169360]I'd better figure out some polynomial-search parameters, or some indisputably SNFS number of less than 255 digits and difficulty around 265. Help?[/quote]

What about 2801^79-1? (a bigger one this time)

[b]fivemack:[/b] it's more than 255 digits long so would need recompiled sievers - any oddperfect-search number will be of length basically equal to its SNFS difficulty, because the roadblocks are of the form 'we know no factors of sigma(a^b)'

bsquared 2009-04-15 15:41

[quote=fivemack;169360]I'd better figure out some polynomial-search parameters, or some indisputably SNFS number of less than 255 digits and difficulty around 265. Help?[/quote]

Just hunting around through the first 5 holes and cross-checking with what is already spoken for, I see 10,268+ at 243 digits and difficulty 269...

fivemack 2009-04-15 16:00

That's the right sort of size and shape - but is anyone interested in it?

R.D. Silverman 2009-04-15 16:25

[QUOTE=fivemack;169360]The expected time to factor any large number with ECM is meaninglessly enormous, because t(find 70-digit factor by ECM) is so large and p(70-digit factor) is quite big, so it's a matter of picking your early-abort strategy. I follow an entirely unjustified rule of 'ECM for about 25% of the time the NFS job will take': each of the twenty thousand curves would take about an hour, and I'm fairly sure the sieving will take around sixty kilohours.

Bother. I have now convinced myself that 2^877-1 is a GNFS number: 10^263-1 took ~170 megaseconds, 109!+1 looks as if it'll take ~120 megaseconds, and 2^877-1 is only a little bigger in the relevant sense than either of those.

I'd better figure out some polynomial-search parameters, or some indisputably SNFS number of less than 255 digits and difficulty around 265. Help?[/QUOTE]


My joint paper with Sam Wagstaff, "A Practical Analysis of ECM",
gives an exact abort strategy for when to shift from ECM to QS/NFS.

One combines the sample data obtained from ECM failures, perhaps
performed at different B1,B2 values, with the known a priori distribution
of factors given by Dickman's function (or any other approximation to
the distribution of factors). One uses Bayes' Theorem to derive a
posterior distribution and computes the *expected value* of the posterior.
If the time to find a prime near the expected value via ECM with p = 1-1/e
exceeds the time it would take NFS, then switch. This is based upon
using the unit-linear loss function combined with minimizing the expected
cost to achieve the factorization. The unit-linear loss function simply
applies a linear cost to being wrong, under the assumption that computer
costs are a simple linear function of the CPU time spent.
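A toy sketch of that procedure (my own illustration, not the paper's: the prior and per-curve success probabilities below are crude stand-ins for Dickman's function and real ECM success rates):

```python
import math

SIZES = list(range(45, 71))  # candidate factor sizes in digits

def prior(d):
    # Crude prior on factor size, standing in for Dickman's function.
    return 1.0 / d

def p_curve_hits(d):
    # Toy model (assumption): one curve finds a d-digit factor with
    # probability decaying exponentially in d.
    return math.exp(-(d - 40) / 3.0) / 50.0

def posterior_expected_digits(curves_failed):
    # Bayes: P(d | all curves failed) is proportional to
    # prior(d) * (1 - p_hit(d))^curves_failed.
    weights = [prior(d) * (1 - p_curve_hits(d)) ** curves_failed
               for d in SIZES]
    total = sum(weights)
    return sum(d * w for d, w in zip(SIZES, weights)) / total

def should_switch_to_nfs(curves_failed, hours_per_curve, nfs_hours):
    d = posterior_expected_digits(curves_failed)
    # Expected curves for a 1 - 1/e chance at a d-digit factor.
    curves_needed = 1.0 / p_curve_hits(d)
    return curves_needed * hours_per_curve > nfs_hours

# Each failed curve shifts the posterior toward larger, harder factors:
print(posterior_expected_digits(0), posterior_expected_digits(5000))
```

The qualitative behavior is the point: failures are evidence against small factors, so the posterior expected factor size grows with the failure count, and eventually the projected ECM cost at that size crosses the NFS cost.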



Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.