mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Factoring (https://www.mersenneforum.org/forumdisplay.php?f=19)
-   -   Where is P-1, P+1 effort recorded? (https://www.mersenneforum.org/showthread.php?t=2438)

geoff 2004-05-05 03:22

Where is P-1, P+1 effort recorded?
 
Is there any record of how much P-1 and P+1 effort has been applied to the Cunningham numbers?

If not, does anyone have a guess? Is it reasonable to assume the base two numbers have all had P-1 done at least to the Prime95 limit B1=4.29e9?

xilman 2004-05-05 08:34

[QUOTE=geoff]Is there any record of how much P-1 and P+1 effort has been applied to the Cunningham numbers?

If not, does anyone have a guess? Is it reasonable to assume the base two numbers have all had P-1 done at least to the Prime95 limit B1=4.29e9?[/QUOTE]
Paul Zimmermann and/or Alex Kruppa have done a [i]lot[/i] of P-1 and P+1 on the Cunningham tables. I don't have figures to hand, but I know Alex posts here on occasion.

Paul

akruppa 2004-05-05 11:04

Most of the P-1 and all of the P+1 I've done myself was either on Fibonacci/Lucas numbers, which Blair Kelly keeps track of on his [URL=http://home.att.net/~blair.kelly/mathematics/fibonacci/]web site[/URL], or on Cunningham numbers before factoring them with NFS.

AFAIK Paul Zimmermann and Torbjorn Granlund have done a lot of P-1 and P+1 on Cunningham numbers, Torbjorn on base 10 numbers and Paul on the other bases. Paul recently told me in an email:

[QUOTE="Paul Zimmermann"]I'm testing all (regular) Cunningham composites (1173 currently) by increasing size. For P-1, I've finished a complete run at B1=1e9, and I'm doing a 2nd one at B1=4e9, currently at 151 digits.

Another machine is doing a P+1 test (3 curves each) at B1=1e9, and is currently at 190 digits.[/QUOTE]

Base 2 numbers may already have been tested to higher bounds than that with Prime95; George Woltman would probably be the one who keeps track of that effort.

Alex

R.D. Silverman 2004-05-05 13:34

[QUOTE=akruppa]Most of the P-1 and all of the P+1 I've done myself was either on Fibonacci/Lucas numbers [...] Base 2 numbers may already have been tested to higher bounds than that with Prime95; George Woltman would probably be the one who keeps track of that effort.

Alex[/QUOTE]

Back about 15 years ago I ran P-1 on all Cunningham base 2 numbers. I used a lower B1 limit (10^7) with a much higher B2 limit (10^13). I was testing out my FFT implementation of step 2 on an Alliant FX8, using 4 of its 68020 processors. I also ran the smaller unfactored Fermat numbers.

Wolf 2004-05-05 18:41

[QUOTE=geoff] Is it reasonable to assume the base two numbers have all had P-1 done at least to the Prime95 limit B1=4.29e9?[/QUOTE]

[B]Yes[/B]

According to PMINUS1.TXT (linked at the bottom of [url]http://www.mersenne.org/status.htm[/url]), all base-two numbers with exponent below 10000 (those with no known factor, anyway) have been tested to B1=4.29E9.

geoff 2004-05-07 08:14

Thanks for the information. I don't think I will do any P-1 or P+1 on the Cunningham numbers at this time; I'll just continue with ECM.

geoff 2004-05-07 08:36

Another thought about P-1 testing:

Say I were to do a very large P-1 stage one, say on M(2^19) with B1=4.29e9, which would take me about two months. If the test finished and the final GCD found all the expected known factors (those known factors P where the largest prime factor of P-1 is less than 4.29e9), would that be a good enough check that no error had occurred during the test, or is there some way that an incorrect residue could still lead to the known factors being found?
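The proposed consistency check can be sketched at toy scale. The following is a hypothetical small-number illustration (a real test would run Prime95 on a number like M(2^19)): a P-1 stage 1 on M(29) = 2^29 - 1 = 233 * 1103 * 2089, where the final GCD should recover exactly the known factors whose P-1 is smooth to the chosen bound.

```python
# Toy P-1 stage 1 on M(29) = 2^29 - 1 = 233 * 1103 * 2089; a sketch of
# the known-factor consistency check, not Prime95 code.
from math import gcd

def is_prime(q):
    return q > 1 and all(q % d for d in range(2, int(q**0.5) + 1))

def stage1(n, B1, base=3):
    """P-1 stage 1 on N = 2^n - 1: raise `base` to 2*n times every prime
    power <= B1, then look for factors via gcd(residue - 1, N)."""
    N = (1 << n) - 1
    x = pow(base, 2 * n, N)      # factors of 2^n - 1 are 1 (mod 2n) for prime n
    for q in (q for q in range(2, B1 + 1) if is_prime(q)):
        qk = q
        while qk * q <= B1:      # lift q to the largest power still <= B1
            qk *= q
        x = pow(x, qk, N)
    return gcd(x - 1, N)

# 233-1 = 2^3*29 and 2089-1 = 2^3*3^2*29 are covered by B1 = 10 (the 29
# comes from the exponent), but 1103-1 = 2*19*29 needs 19 > B1:
assert stage1(29, 10) == 233 * 2089      # 1103 stays hidden
assert stage1(29, 20) == (1 << 29) - 1   # raising B1 past 19 reveals all three
```

If any known smooth factor failed to appear in the GCD, that would signal a corrupted residue somewhere in the run.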

akruppa 2004-05-10 15:45

Yes, that would be a very powerful check to ensure that the residue did not get corrupted during the computation. In fact, it should suffice to leave one reasonably sized (and smooth enough) prime factor out of low[mp].txt and check that that one is found.

A factor [i]p[/i] is found in stage 1 if the order (mod [i]p[/i]) of the P-1 starting element has only prime factors that are included in stage 1, i.e. that are <=B1. If an error occurs somewhere during the computation, we can look at what follows as another P-1 factoring attempt with a random starting element which gets exponentiated by the stage 1 primes remaining at that point. So if the error occurs while we are processing the prime [i]q[/i], the factor will still be found if the corrupted residue happens to be a [i]k[/i]-th power, where [i]k[/i] is the product of all prime factors of [i]p[/i]-1 smaller than [i]q[/i]. The probability of a random residue being such a power is 1/[i]k[/i]. So unless the error occurs right at the start of the computation, [i]k[/i] will usually be large enough to make an "accidental" discovery of [i]p[/i] in spite of the error very unlikely.
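This 1/k estimate can be checked exhaustively at toy scale: in the cyclic group mod a prime p, the number of residues r with r^E = 1 (mod p) is exactly gcd(E, p-1), where E is the product of the prime powers remaining after the corruption. A sketch with a hypothetical small prime:

```python
# Exhaustive check of the 1/k heuristic on a hypothetical small prime
# p = 2089, with p-1 = 2^3 * 3^2 * 29: after a corruption, p is still
# found iff the random residue r satisfies r^E == 1 (mod p), and
# exactly gcd(E, p-1) of the p-1 residues do.
from math import gcd

p = 2089

def survivors(E):
    """Count residues mod p that would still reveal p after corruption."""
    return sum(1 for r in range(1, p) if pow(r, E, p) == 1)

early = 2**4 * 3**2 * 5 * 7 * 29   # error before anything was processed
late = 5 * 7                       # error after the 2- and 3-powers (29 lost too)
assert survivors(early) == gcd(early, p - 1) == p - 1  # every residue works
assert survivors(late) == gcd(late, p - 1) == 1        # only r = 1 works
```

This matches the remark above: only an error right at the start of the run leaves a realistic chance of the factor still appearing.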

Alex

philmoore 2004-05-10 23:13

Actually, I have had apparent errors doing P-1 on large, highly composite exponents, most recently on M9699690. For example, in October 2003 (this was with version 23.4.1) I ran stage 1 to a bound of B1=2000 and stage 2 to B2=200,000. (I found a number of factors at the same time which I then factored using Dario Alpern's Java applet.) But then I ran stage 1 to B1=100,000 and found the following two factors:
320731441801 = 2^3 * 3 * 5^2 * 7 * 11 * 17 * 19 * 21493 + 1
119609902307611 = 2 * 3^2 * 5 * 11 * 13 * 17^2 * 19 * 59 * 28687 + 1
My question is, why didn't these factors show up earlier when I ran stage 2 to 200,000?
Then again, I continued running stage 2 to B2=10,000,000 and turned up:
150747052048811 = 2 * 5 * 7 * 17 * 19 * 307 * 509 * 42667 + 1
So why did this factor only turn up at the end of stage 2 when it seems like it should have turned up at the end of stage 1? I've had this on the to-do list for quite a while without getting to it, but since factors have been missed in both stage 1 and stage 2, I have wondered if it could be a GCD problem instead.
This particular exponent requires a lowm.txt file of over 400 entries, so perhaps it isn't a problem that normally arises with typical (prime) exponents, but I have been wondering about it. In which case, geoff's testing idea might be a good one.

akruppa 2004-05-11 09:28

The factor
M( 19427 )C: 53089736370439
which is 1 (mod 28687) is not being found with B1=2000, B2=200000, either. I didn't find another good test case that's 1 (mod 21493).

It looks almost as if Prime95 skipped a few primes in stage 2. Which ones apparently depends on the choice of B2, and perhaps of B1. Maybe this has something to do with the new duplication-preventing code that was introduced in, afaik, v23. It pairs primes similarly to what is described in Montgomery's "Speeding" paper, but also cancels a smaller prime p if p|(mD+d) where mD-d is a prime included in stage 2. Perhaps it cancels those small primes a little too eagerly, i.e. even in cases where (mD+d) ends up not being included in stage 2 after all.

If not too many primes are skipped this way it's probably not a big deal, but it is unfortunate that smallish primes are lost, since they have a good probability of dividing the group order.

Alex

Prime95 2004-05-11 21:57

[QUOTE=akruppa]The factor
M( 19427 )C: 53089736370439
which is 1 (mod 28687) is not being found with B1=2000, B2=200000, either. I didn't find another good test case that's 1 (mod 21493).
[/quote]

Input an integer =? 53089736370438
2 * 3 * 15877 * 19427 * 28687

P-1 shouldn't find that factor with B1=2000: both 15877 and 28687 are larger than B1, and stage 2 can only account for one prime above B1.
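The check applied here can be sketched as a small helper (hypothetical code, not Prime95's; it assumes stage 1 covers prime powers up to B1 plus the exponent, and stage 2 covers one extra prime up to B2):

```python
# Given a factor p of M(n), decide whether P-1 with bounds B1/B2 should
# find it. Hypothetical sketch of the smoothness test, not Prime95 code.
def pm1_should_find(p, n, B1, B2):
    m, fac, d = p - 1, {}, 2
    while d * d <= m:            # trial-factor p-1 (fine at this size)
        while m % d == 0:
            fac[d] = fac.get(d, 0) + 1
            m //= d
        d += 1
    if m > 1:
        fac[m] = fac.get(m, 0) + 1
    for q in list(fac):          # credit the exponent n, which stage 1
        t = n                    # multiplies in for free
        while t % q == 0 and fac[q] > 0:
            fac[q] -= 1
            t //= q
    big = [q for q, e in fac.items() if e > 0 and q ** e > B1]
    # stage 1 needs every remaining prime power <= B1; stage 2 can
    # absorb at most one extra prime up to B2
    return len(big) == 0 or (len(big) == 1 and fac[big[0]] == 1 and big[0] <= B2)

# The factor above: 53089736370438 = 2*3*15877*19427*28687 leaves two
# primes beyond B1=2000 once the exponent 19427 is credited:
assert not pm1_should_find(53089736370439, 19427, 2000, 200000)
# philmoore's 320731441801, by contrast, should have been found:
assert pm1_should_find(320731441801, 9699690, 2000, 200000)
```

By this test, philmoore's missed factors really were anomalies, while the M(19427) factor was never a valid test case for those bounds.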

