mersenneforum.org Run P-1 twice (or 1.5x)?

2010-12-18, 22:32   #1
Brain

Dec 2009
Peine, Germany

331 Posts

Run P-1 twice (or 1.5x)?

I sometimes get LL assignments for exponents that have only had stage 1 of P-1 run. I know that PrimeNet then chooses a higher B1 value. Wouldn't it make sense to run only stage 2 again on my PC, which offers plenty of RAM to Prime95? I thought most factors were found in stage 2... Moreover, I cannot get manual assignments or use undoc options to manually do only stage 2... Any thoughts? It's a question of probability.

Last fiddled with by Brain on 2010-12-18 at 22:36 Reason: typo
2010-12-19, 02:04   #2
Mini-Geek
Account Deleted

"Tim Sorbera"
Aug 2006
San Antonio, TX USA

17×251 Posts

To run stage 2, you have to have the end result from stage 1; I think for GIMPS numbers this is several MB. It should be possible to run stage 2 on a different computer from stage 1, but you can't run only stage 2 by itself. I think about half of the factors should be found in stage 2 (when it's run).

Last fiddled with by Mini-Geek on 2010-12-19 at 02:06
2010-12-19, 03:10   #3
markr

"Mark"
Feb 2003
Sydney

3×191 Posts

Quote:
 Originally Posted by Brain I sometimes get LL assignments for exponents that have only experienced stage 1 of P-1. I know that PrimeNet chooses a higher B1 value then. Wouldn't it make sense to run only stage 2 again on my PC which offers enough RAM to Prime95? I thought most factors were found in stage 2... Moreover, I cannot get manual assignments or use undoc options to manually do only stage 2... Any thoughts? It's a question of probability.
You can get estimates of the probabilities here: http://mersenne-aries.sili.net/prob.php

Consensus is that doing more P-1 is usually not worth it, unless it's a case like 48090437, which had been done to B1=2500, B2=5000.

2010-12-19, 10:32   #4
imwithid

Apr 2009
Venice, Chased by Jaws

3×29 Posts

I have just received my first P-1 test and it has gone to stage 2. I allocate 512MB of memory by default. Upon completion, it stated:

Code:
[Work thread Dec 19 04:22] Starting stage 1 GCD - please be patient.
[Work thread Dec 19 04:24] Stage 1 GCD complete. Time: 112.506 sec.
[Work thread Dec 19 04:24] Using 498MB of memory. Processing 19 relative primes (0 of 480 already processed).
[Work thread Dec 19 04:33] M47907107 stage 2 is 1.38% complete. Time: 548.246 sec.

Would it help or adversely affect the outcome if I bumped it up to 1024MB (1GB) of RAM? How do I know the optimal memory allocation for a given test? Are there only two stages to this process (is it like factoring but with extra work)?
2010-12-19, 11:51   #5
Mr. P-1

Jun 2003

1169₁₀ Posts

Quote:
Originally Posted by imwithid
I have just received my first P-1 test and it has gone to stage 2. I allocate 512MB of memory by default. Upon completion, it stated:

Code:
[Work thread Dec 19 04:22] Starting stage 1 GCD - please be patient.
[Work thread Dec 19 04:24] Stage 1 GCD complete. Time: 112.506 sec.
[Work thread Dec 19 04:24] Using 498MB of memory. Processing 19 relative primes (0 of 480 already processed).
[Work thread Dec 19 04:33] M47907107 stage 2 is 1.38% complete. Time: 548.246 sec.

Would it help or adversely affect the outcome if I bumped it up to 1024 (1GB) of RAM?
It will have no effect on the final result, but you will complete the computation a little more quickly.

If you had increased the memory before Stage 1 had completed, it would have recomputed the bounds, resulting in a slightly deeper P-1 test and giving you a slightly greater chance of finding a factor.

Quote:
 How do I know the optimal memory allocation for a given test? Are there only two stages to this process?
There are only two stages, and either is capable of yielding a factor. The rule of thumb for memory is "as much as you can let it have, without thrashing".

Quote:
 (is it like factoring but with extra work?)?
I'm not sure I understand the question. Trial Factorisation will quickly find a small factor, if there is one. P-1 gives a chance of finding a much larger factor. They're "like" each other in the sense that they both find factors, but different in that they use different methods and find different factors.
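To make the contrast concrete, here is a toy sketch of trial factoring a Mersenne number (purely illustrative; Prime95's real trial factoring is far more optimized). It uses the well-known fact that every factor q of M_p = 2^p − 1 has the form q = 2kp + 1 with q ≡ 1 or 7 (mod 8), which is why only a thin slice of candidates ever needs testing:

```python
# Toy trial factoring sketch for M_p = 2^p - 1 (illustration only, not
# how Prime95 implements it). Candidate factors have the special form
# q = 2*k*p + 1 with q = 1 or 7 (mod 8).
def trial_factor(p, max_k):
    M = (1 << p) - 1
    for k in range(1, max_k + 1):
        q = 2 * k * p + 1
        if q * q > M:
            return None  # past sqrt(M): no factor below the bound exists
        # q divides 2^p - 1 exactly when 2^p = 1 (mod q)
        if q % 8 in (1, 7) and pow(2, p, q) == 1:
            return q
    return None

print(trial_factor(11, 10))  # M11 = 2047 = 23 * 89, and 23 = 2*1*11 + 1
```

For M11 this finds 23 at k = 1; for a prime Mersenne number like M13 it exhausts the candidates below the square root and reports nothing.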

Last fiddled with by Mr. P-1 on 2010-12-19 at 11:51

2010-12-19, 12:42   #6
markr

"Mark"
Feb 2003
Sydney

573₁₀ Posts

Mr. P-1 beat me to it, with a better answer than mine would have been, but I'll stick my beak in anyway.
Quote:
 Originally Posted by imwithid Would it help or adversely affect the outcome if I bumped it up to 1024 (1GB) of RAM? How do I know the optimal memory allocation for a given test?
Agreed - more is better, if you can.

On the other hand, what's a "reasonable" minimum that lets prime95/mprime work "reasonably" well? Perhaps you regularly use an application that runs more slowly if it has less memory available to it, or maybe you just want to be conservative. Well, apparently 500MB is good, according to this.

Quote:
 Are there only two stages to this process (is it like factoring but with extra work?)?
Yes - two stages, & yes - they're similar in that they both have a chance of finding a factor. The math page has a good, short description of each.

Last fiddled with by markr on 2010-12-19 at 12:53

2010-12-19, 15:54   #7
Mr. P-1

Jun 2003

10010010001₂ Posts

Quote:
 Originally Posted by markr On the other hand, what's a "reasonable" minimum that lets prime95/mprime work "reasonably" well? Perhaps you regularly use an application that runs more slowly if it has less memory available to it, or maybe you just want to be conservative. Well, apparently 500MB is good...
A reasonable minimum, in my view, is any amount enough to get you a high memory plan for stage two. How much that is depends upon the FFT size, whether you have more than one worker thread, and what the other threads are doing.

You know you have a high-memory plan when it says "(xx of 480 already processed)". If the second number is anything other than 480, then you are on a restricted-memory plan, and would do well to increase the memory for the next exponent. (I don't think it would greatly help the one you're on, because you can't change plans midway.)

Beyond that, I try to arrange to do a number of relative primes per pass that either divides 480 exactly or leaves a large remainder, so the final short pass isn't mostly wasted overhead. 19 is a "miss" in this respect because 480/19 = 25 remainder 5: the 26th pass processes only 5 relative primes.

If imwithid is comfortable committing 1GB, then I suggest aiming at 40 relative primes per pass.
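The pass arithmetic above is easy to sketch (a hypothetical helper, not Prime95 code; the 480 is the count of relative primes stage 2 walks through):

```python
# Sketch of the stage 2 pass layout: 480 relative primes are processed
# rels_per_pass at a time. Returns (number of passes, size of the final
# short pass); last == 0 means every pass is full.
def pass_layout(rels_per_pass, total=480):
    full, last = divmod(total, rels_per_pass)
    passes = full + (1 if last else 0)
    return passes, last

print(pass_layout(19))  # (26, 5): 25 full passes plus a 26th doing only 5
print(pass_layout(40))  # (12, 0): 40 divides 480 exactly
```

This is why 19 is a "miss" (a 26th pass for only 5 relative primes) while 40 is a clean fit.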

2010-12-19, 16:23   #8
Mr. P-1

Jun 2003

7×167 Posts

Quote:
 Originally Posted by markr The math page has a good, short description of each.
The wiki has an excellent, albeit more technical, explanation of the P-1 method.
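As a toy illustration of the method (nothing like Prime95's FFT-based implementation; the numbers and helper names here are made up for the example): stage 1 catches a factor q when q − 1 is B1-smooth, and stage 2 extends that to q − 1 = (B1-smooth part) × (a single prime between B1 and B2):

```python
# Toy two-stage P-1 sketch (illustration only). Stage 2 here is the naive
# "standard continuation"; real implementations pair primes and batch work.
from math import gcd

def primes_up_to(n):
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(sieve[i * i :: i]))
    return [i for i in range(n + 1) if sieve[i]]

def pminus1(N, B1, B2):
    """Return a nontrivial factor of N, or None."""
    a = 3
    primes = primes_up_to(B2)
    # Stage 1: raise a to every prime power <= B1.
    for p in (q for q in primes if q <= B1):
        e = p
        while e * p <= B1:
            e *= p
        a = pow(a, e, N)
    g = gcd(a - 1, N)
    if 1 < g < N:
        return g
    # Stage 2: catch a factor q whose q-1 is B1-smooth except for
    # exactly one prime s with B1 < s <= B2.
    for s in (q for q in primes if q > B1):
        g = gcd(pow(a, s, N) - 1, N)
        if 1 < g < N:
            return g
    return None

# 299 = 13 * 23: stage 1 alone finds 13, since 13 - 1 = 12 = 2^2 * 3
# is 10-smooth (while 23 - 1 = 22 = 2 * 11 is not).
print(pminus1(299, 10, 10))       # -> 13
# 4897 = 59 * 83: 59 - 1 = 2 * 29 is 5-smooth apart from the prime 29,
# which stage 2 picks up with B2 = 30 (83 - 1 = 2 * 41 stays out of reach).
print(pminus1(59 * 83, 5, 30))    # -> 59
```

Either stage can deliver the factor, which is the point Mr. P-1 made above: the gcd at the end of stage 1, or any stage 2 step, may reveal it.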

2010-12-19, 17:37   #9

"Richard B. Woods"
Aug 2002
Wisconsin USA

1111000001100₂ Posts

Quote:
 Originally Posted by imwithid (is it like factoring but with extra work)?
Just to clear up some terminology:

There are several different methods for factoring (trying to find a factor). Many messages in Prime95 that say "factoring" really should say "trial factoring" to avoid confusion, because they're referring to only that method. In prime95's early days, trial factoring was the only factoring method built into it; those messages date from then.

Two other factoring methods, P-1 and ECM, have been added to prime95 since then. Messages refer to them as "P-1" and "ECM", without the word "factoring", but they both are factoring methods (and both have two stages).