2007-01-26, 18:43 | #1 |
Sep 2006
Brussels, Belgium
11001101111_{2} Posts |
E=6 in P-1 result line
When P-1 trial factoring numbers with a machine that has 3 GB of memory, I get the following result lines:
UID: S485122/Q67-W3, M37377601 completed P-1, B1=420000, B2=10815000, E=6, Wc1: C65CCB1A What is the meaning of the "E=6" part? I did get it once on another machine previously, but most of the previous P-1 tests did not include it. Jacob |
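As an aside, the fields in such a result line can be pulled apart mechanically. Here is a small Python sketch; the regex and the variable names are our own, not anything Prime95 defines:

```python
import re

# The result line quoted above, reformatted as a Python string.
line = ("UID: S485122/Q67-W3, M37377601 completed P-1, "
        "B1=420000, B2=10815000, E=6, Wc1: C65CCB1A")

# E= is optional, since older clients omitted it.
m = re.search(r"M(\d+) completed P-1, B1=(\d+), B2=(\d+)(?:, E=(\d+))?", line)
exponent, b1, b2, e = m.group(1), m.group(2), m.group(3), m.group(4)
print(exponent, b1, b2, e)  # 37377601 420000 10815000 6
```

A line without the `E=` field would simply yield `e = None` here, matching the observation that most older results lack that part.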
2007-01-26, 18:52 | #2 |
"Nancy"
Aug 2002
Alexandria
2,467 Posts |
Iirc, E is the exponent for the Brent-Suyama extension. In stage 2, P-1 tests whether any x^{f(m)}-x^{f(n)}, for certain m, n pairs, has a factor in common with N, the number to be factored. Here, x is the residue from stage 1.
There is some freedom in the choice of the function f, but for P-1 the most common choice is a simple power, f(x)=x^{2} or f(x)=x^{6} or so. Higher powers take more time, but increase the probability of finding a factor a little. Alex |
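To make the description above concrete, here is a toy Python sketch with small numbers of our own choosing; Prime95's real stage 2 chooses the (m, n) pairs far more cleverly (so that f(m)-f(n) is divisible by the stage-2 primes) and batches the gcd, but the underlying test is the same:

```python
from math import gcd

def f(t, E=6):
    # Brent-Suyama function: a simple power, here E=6 as in the result line.
    return t ** E

def stage2_brent_suyama(N, x, pairs, E=6):
    """Test whether x^f(m) - x^f(n) shares a factor with N for some pair.

    Returns the first non-trivial gcd found, else 1. A real implementation
    multiplies many differences together and takes one gcd at the end."""
    for m, n in pairs:
        diff = (pow(x, f(m, E), N) - pow(x, f(n, E), N)) % N
        g = gcd(diff, N)
        if 1 < g < N:
            return g
    return 1

# Toy demo: N is a small composite (not a Mersenne number), and x stands in
# for a stage-1 residue 3^E1 mod N with E1 covering the primes {2, 3, 5, 7}.
N = 1009 * 2003
x = pow(3, 2 * 3 * 5 * 7, N)
pairs = [(m, n) for m in range(2, 20) for n in range(1, m)]
print(stage2_brent_suyama(N, x, pairs))  # a non-trivial factor: 1009 or 2003
```

The point of a higher E is that m^E - n^E has more algebraic factors, so the exponent difference is divisible by more primes and slightly more factors get caught, at the cost of more multiplications per pair.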
2007-01-26, 20:15 | #3 |
Sep 2006
Brussels, Belgium
3^{3}×61 Posts |
If I understand well, "E=6" means the program has tested whether x^{m^6}-x^{n^6} has a factor in common with the number to be tested, for certain m, n pairs?
I tried to look in the source to find out what "E=" meant before initiating this thread, but my knowledge of C and mathematics is not what it should be :-( In the P-1 file at ftp://mersenne.org/gimps/pminus1.zip there is no mention of the "E..." results. I also saw in that file that some exponents have been P-1'ed with enormous B1 and B2 bounds and others with very small B1 bounds. Would it be worthwhile to run P-1 on the exponents that have not been tested much, after checking that no factor has been found by "classic" trial factorisation, of course? |
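For reference, stage 1 of P-1 with bound B1 just computes x = 3^E mod N, where E is the product of all prime powers up to B1, and then takes gcd(x-1, N). A minimal Python sketch of our own (not how Prime95 implements it), using the classic example 2^67-1, whose factor q = 193707721 is found because q-1 = 2^3 * 3^3 * 5 * 67 * 2677 is smooth up to 2677:

```python
from math import gcd, isqrt

def small_primes(limit):
    # Simple sieve of Eratosthenes.
    sieve = bytearray([1]) * (limit + 1)
    sieve[0] = sieve[1] = 0
    for p in range(2, isqrt(limit) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(sieve[p * p :: p]))
    return [p for p in range(limit + 1) if sieve[p]]

def pminus1_stage1(N, B1, base=3):
    """Raise base to the largest power p^k <= B1 of every prime p <= B1
    (mod N), then hope ord(base) mod some factor q divides that product."""
    x = base
    for p in small_primes(B1):
        pk = p
        while pk * p <= B1:
            pk *= p
        x = pow(x, pk, N)
    return gcd(x - 1, N)

N = 2**67 - 1
print(pminus1_stage1(N, 2700))  # 193707721
```

This also shows why low-bound runs like B1=B2=1000 are cheap: the cost is essentially one modular exponentiation per prime power below B1.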
2007-01-26, 22:03 | #4 | ||
"Nancy"
Aug 2002
Alexandria
2,467 Posts |
Alex |
2007-01-28, 04:02 | #5 |
"Richard B. Woods"
Aug 2002
Wisconsin USA
2^{2}×3×641 Posts |
While doing some independent P-1 factoring over the past several years, I've saved copies of the Pminus1.txt files and regularly scanned them, out of curiosity, for what I could deduce.
Prime95 did not have the P-1 function added to it, and made a mandatory prerequisite to L-L tests, until first-time LL assignments were in the 10xxxxxx range and DC assignments were roughly in the 4xxxxxx range. (One can deduce the latter, without looking up contemporary records, by comparing the patterns of B1/B2 accomplishment for various ranges in the Pminus1.txt file.) So almost no exponents below 4xxxxxx had P-1 done before either first or second L-L tests. (There are a few exceptions at exponents that were assigned for third/fourth/fifth LL testing after mismatching residues.) Instead, they were P-1ed independently by various persons.

One can see systematic patterns in the B1/B2 values that reflect this. For a while I saw that someone went through all the small exponents doing B1=B2=1000 for all exponents that had no previous P-1, then later B1=B2=2000 (perhaps on a bunch of old 486es!). Then the same or another person returned to raise the bounds to B1=2048, B2=204800. Others did somewhat higher bounds on small stretches of exponents. As of the most recent files update, all exponents below 5451000 have had at least some P-1 done, but above that there are still exponents with L-Ls but no P-1.

Counterintuitively, when Prime95 does P-1 as a prerequisite to L-L tests, the systems with low available memory have a higher B1 calculated by the Prime95 algorithm than do the machines with large available memory. Example from Pminus1.txt:

25093121,370000,370000
25093183,290000,6017500
25093279,295000,7522500
25093487,370000,370000
25093709,370000,370000
25093837,235000,1116250
25093877,260000,2730000
25093933,290000,5147500
25094011,370000,370000
25094123,270000,3442500

The machines with the smallest available memory (perhaps the default 8M) were assigned B1=B2=370000 and did only stage 1, not stage 2.
For machines with more memory, the Prime95 assignment algorithm calculates that the optimum combination is a somewhat smaller B1 coupled with a greater B2 for the stage 2 that is performed in those cases. My guess is that 25093837 ran on a system with just barely enough memory to justify a stage 2, 25093877 ran on one with slightly more available memory, 25094123 on one with the next larger amount, and 25093933, 25093183, and 25093279 on systems specifying increasingly larger amounts of available memory.

Another example:

11217061,333333,3333333
11217097,65000,65000
11217191,65000,65000
11217233,160000,160000
11217373,130000,1690000
11217389,125000,1343750

11217097 and 11217191 probably had their P-1 done just before the double-check, after their first-time L-Ls were performed by a version of Prime95 that did not yet have the P-1 functionality.

Prime95 always calculates B1 as a multiple of 5000 and B2 as a multiple of 1/4 * B1. So 11217061's bounds of 333333,3333333 were not assigned by Prime95 (though there may have been an earlier P-1 run with lower B1/B2 calculated by Prime95), but specified with a worktodo.ini line like "Pminus1=11217061,333333,3333333,0,0"

(P.S. P-1 isn't a "trial" factoring algorithm, either. Just say "P-1 factoring". "Trial factoring" is different.) /end picky-mode.

Last fiddled with by cheesehead on 2007-01-28 at 04:20 |
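The multiple-of-5000 observation above suggests a quick sanity check one could run over Pminus1.txt lines to flag manually specified bounds. This is a rough heuristic of our own, based only on the pattern cheesehead describes, not on anything documented by Prime95:

```python
# Classify Pminus1.txt entries: Prime95-chosen bounds should have
# B1 divisible by 5000 and B2 a multiple of B1/4 (per the post above).
def looks_prime95_chosen(b1, b2):
    return b1 % 5000 == 0 and b2 % (b1 // 4) == 0

# Sample (exponent, B1, B2) triples taken from the examples above.
entries = [
    (25093121, 370000, 370000),    # stage 1 only, B1=B2
    (11217061, 333333, 3333333),   # manually specified bounds
    (11217373, 130000, 1690000),   # stage 2 with larger B2
]
for exponent, b1, b2 in entries:
    print(exponent, looks_prime95_chosen(b1, b2))
# 25093121 True
# 11217061 False
# 11217373 True
```

Entries flagged False would then be candidates for having come from a hand-written "Pminus1=..." worktodo.ini line rather than the client's own bounds calculation.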