20130513, 18:18  #1 
6809 > 6502
"""""""""""""""""""
Aug 2003
101×103 Posts
Calculating optimal P-1 memory
I searched around a while back and did not see an answer for this...
I know that the more memory that can be given to P1 stage 2, the better. What I was trying to figure or find, is there a formula or tool somewhere to: calculate how many relative primes P1 will process for a given set of parameters. On one machine I noticed that if I went from one amount of memory to another the number of primes jumped disproportionately. I would love to find out where the various break points are for various assignments. Then I could tune memory settings to be livable and productive. Also, if I knew that I could add 50 or 100 MB to the night setting and get a boost in the number of primes, that would be great. 
20130513, 18:58  #2 
Account Deleted
"Tim Sorbera"
Aug 2006
San Antonio, TX USA
4267_{10} Posts 
From http://www.mersennewiki.org/index.ph...yama_extension, it seems to me that the number of relative primes is due to the Brent-Suyama extension e used, but I don't know exactly what memory equates to what e value.

20130513, 19:51  #3 
"Kieren, ktony"
Jul 2011
3^{2}·1,061 Posts 
I have been puzzled by this, too. Until recently, I had been allowing P95 to use 27000 MB. Rather than doing 480 relative primes in a single pass, it would mostly do multiple passes over 960. A change in memory usage led me to reduce the P95 allocation to 25500 MB. Now it only does 960 relative primes occasionally, but it is still doing multiple passes at 480. It seems that somewhere in there it should find it possible to do stage 2 in a single pass.

20130515, 21:42  #4  
Jun 2003
7·167 Posts 
No, because, as the wiki page says, the number of relative primes is computed from a different parameter, denoted d, not from the e value. Yes, because the program tries to optimise the choice of parameters, including d and e, subject to the maximum memory available. For fixed d, a higher value of e requires slightly more memory, so it can happen that there is enough memory available for a higher e, or a larger d, but not both.

Specifically, what the program does is choose d to be a small primorial (30, 210 or 2310) or a multiple of one of these values smaller than the next primorial. The number of relative primes is then 8, 48 or 480, or the corresponding multiple of one of these. But I don't understand why it would ever choose a multiple (other than the next primorial up). Apparently if it can manage to do so, then there is a slight improvement in speed, but I cannot for the life of me think of a reason why this should be so.
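To illustrate the point above: the count of relative primes for a given d is just Euler's totient φ(d), so the 8 / 48 / 480 figures (and the 960 in the earlier post) fall straight out of the primorial choices. A minimal sketch (not Prime95's code, just the arithmetic):

```python
def phi(n):
    """Euler's totient via trial-division factorization (fine for small n)."""
    result = n
    p = 2
    while p * p <= n:
        if n % p == 0:
            while n % p == 0:
                n //= p
            result -= result // p
        p += 1
    if n > 1:
        result -= result // n
    return result

# phi(d) = how many residues mod d are coprime to d, i.e. the number of
# relative primes stage 2 must process for that choice of d.
for d in (30, 210, 2310, 2 * 2310):
    print(d, phi(d))
```

Note that φ(2·2310) = 960, which matches the 960-relative-prime runs described above: a multiple of the 2310 primorial rather than the next primorial up.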


20130515, 22:10  #5  
"Carl Darby"
Oct 2012
Spring Mountains, Nevada
100111011_{2} Posts 
Quote:
The next primorial gives a rather dramatic increase in the number of relative primes, so sometimes the initialization costs for the increased number of passes outweighs the other advantages of the larger primorial. Clear as mud? Last fiddled with by owftheevil on 20130515 at 22:27 Reason: Left out the base 

20130515, 23:29  #6 
"Kieren, ktony"
Jul 2011
3^{2}×1,061 Posts 
Thanks to Mr. P1 and owftheevil for the information. I get at least a vague sense of what's going on. I can see that I left out a key variable in my previous account: the number of HighMem workers. At the moment this is five, which means that any particular run never comes close to having 12 GB; they rarely exceed 6 GB. I do consistently come in at E=12, however.
Even with a total of 32 GB I can't really lock in 12 GB per worker without having to intervene once in a while to let the stage 2 runs catch up, and I'm too lazy to mess with things that much. 