mersenneforum.org > Data > Dual core anyone?

2006-09-19, 12:17   #67
James Heinrich

"James Heinrich"
May 2004
ex-Northern Ontario

110100100001₂ Posts

Quote:
 Originally Posted by Prime95 You are a rather special user. The average Joe will just set it to something reasonable (like 200MB out of 1GB) and forget it.
Could you please explain (briefly) what benefit more allocated memory gives? It says something about processing more relative primes when more memory is available. But what does it mean? I assume there's no greater chance of finding a factor, so it would have to be a speed improvement? How much better is it to process 20 relative primes compared to 10, for example? Basically, what do I get out of allocating 1500MB to P-1 vs just 500MB?

2006-09-19, 13:35   #68
Prime95
P90 years forever!

Aug 2002
Yeehaw, FL

2²×1,873 Posts

Quote:
 Originally Posted by James Heinrich Could you please explain (briefly) what benefit more allocated memory gives? It says something about processing more relative primes when more memory is available. But what does it mean? I assume there's no greater chance of finding a factor, so it would have to be a speed improvement? How much better is it to process 20 relative primes compared to 10, for example? Basically, what do I get out of allocating 1500MB to P-1 vs just 500MB?
In short, there is probably very little benefit to allocating 1500MB over 500MB.

The extra memory helps in several ways. As you noticed, it processes more relative primes at a time. If there are 480 to process and we process 20 at a time, it takes 24 passes vs. 48 passes if only 10 are processed at a time. There is a small fixed cost per pass, so you save 24 times that fixed cost.
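The pass-count arithmetic above can be sketched in a few lines (an illustrative toy, not Prime95 code; the fixed per-pass cost here is a made-up unit):

```python
import math

def num_passes(relative_primes: int, per_pass: int) -> int:
    """Number of stage 2 passes needed to cover all relative primes."""
    return math.ceil(relative_primes / per_pass)

# 480 relative primes, processing 10 vs. 20 at a time:
print(num_passes(480, 10))  # 48 passes
print(num_passes(480, 20))  # 24 passes -- half the fixed per-pass overhead
```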

Extra memory also helps by allowing prime95 to choose a 2nd pass that uses more relative primes. Prime95 processes from B1 to B2 in groups of 30 (2*3*5), 210 (2*3*5*7), or 2310 (2*3*5*7*11). These have 8, 48, or 480 relative primes, respectively. The bigger the group, the less fixed cost in counting from B1 to B2.
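The counts 8, 48, and 480 are just Euler's totient of the group sizes, i.e. how many residues mod the group size are coprime to it. A quick check (illustrative sketch using a brute-force totient, fine for numbers this small):

```python
from math import gcd

def phi(n: int) -> int:
    """Euler's totient: count of 1 <= k <= n with gcd(k, n) == 1."""
    return sum(1 for k in range(1, n + 1) if gcd(k, n) == 1)

for d in (30, 210, 2310):   # 2*3*5, 2*3*5*7, 2*3*5*7*11
    print(d, phi(d))        # -> 30 8, 210 48, 2310 480
```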

We see above that using lots of memory reduces the cost of processing from B1 to B2 a little bit. Therefore, it may pay to increase the B2 bound a little bit to increase the chance of finding a factor.

Finally, if you have lots of memory, prime95 uses Suyama's trick to also include some factors above B2 at a small cost. It is an open question whether the extra small chance of finding a factor is worth this small cost.

The above isn't really helpful. In short, you get a real big benefit when you go from a tight memory situation to a decent memory allocation. You get only a tiny gain going from a decent memory allocation to a generous one.

Last fiddled with by Prime95 on 2006-09-19 at 13:38

2006-09-19, 14:31   #69
drew

Jun 2005

2·191 Posts

Quote:
 Originally Posted by Prime95 In short, there is probably very little benefit to allocating 1500MB over 500MB.
Hi George,

Is there an easy way to determine where the knee is in the cost vs. benefit curve? I assume it depends on the size of the exponent.

I want to test things as effectively as I can, but I must say that P-1 stage 2 is the only part of Prime95 that can be intrusive at times, such as when I run another memory-intensive task. Perhaps there can be a 'smart memory' setting that will determine the most reasonable value as long as it's below the allocated memory setting.

Drew

2006-09-19, 17:44   #70
Andi47

Oct 2004
Austria

2482₁₀ Posts

Can Prime95 version 25 handle bigger B2 values than 4290M (for ECM and P-1)?
2006-09-19, 19:21   #71
Prime95
P90 years forever!

Aug 2002
Yeehaw, FL

16504₈ Posts

Quote:
 Originally Posted by Andi47 Can Prime95 version 25 handle bigger B2 values than 4290M (for ECM and P-1)?
No. Use GMP-ECM, which has a much better stage 2 algorithm.

2006-09-20, 01:32   #72
James Heinrich

"James Heinrich"
May 2004
ex-Northern Ontario

3,361 Posts

Quote:
 Originally Posted by Prime95 In short, you get a real big benefit when you go from a tight memory situation to a decent memory allocation. You get a tiny gain going from a decent memory allocation to a generous one.
As drew said, the question then becomes how to figure out what qualifies as "tight", "decent" and "generous" memory amounts for a given exponent... Could you throw out a few numbers so that we can graph tight/decent/generous vs exponent, please?

2006-09-20, 04:22   #73
Andi47

Oct 2004
Austria

2·17·73 Posts

Quote:
 Originally Posted by Prime95 No. Use GMP-ECM - which has a much better stage 2 algorithm
Is it possible to do only P-1 stage 1 with Prime95? Does it work the same way as for ECM?

2006-09-20, 13:17   #74
Prime95
P90 years forever!

Aug 2002
Yeehaw, FL

2²×1,873 Posts

Quote:
 Originally Posted by drew Is there an easy way to determine where the knee is in the cost vs. benefit curve?
No. The optimal bounds code uses trial and error, testing dozens of different B1/B2 combinations to pick the best one. You'll need to do the same thing, trying several different memory sizes to get what you think is the best balance between run-time and chance of finding a factor.

I suspect the "knee" is somewhere around 13 temporary variables. That lets prime95 run the 2*3*5 case in one pass over the 8 relative primes. So for a 2048K FFT you would need roughly 13 * 2048K * 8 bytes (the size of a double) + 10MB (a rough guess for sin/cos tables and other needed data). That is 218MB. Do some real-world tests to see if my suspicion is correct.
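George's back-of-envelope estimate works out as follows (a sketch of the arithmetic in the post, not Prime95's actual allocator; the 13 temporaries and 10MB overhead are his stated guesses):

```python
def stage2_memory_mb(fft_len: int, temporaries: int = 13,
                     overhead_mb: float = 10.0) -> float:
    """Rough stage 2 memory: temporaries * FFT length * 8 bytes/double, plus
    a fixed overhead guess for sin/cos tables and other data."""
    return temporaries * fft_len * 8 / (1024 * 1024) + overhead_mb

# A 2048K FFT: 13 * 2048K * 8 bytes = 208MB, plus ~10MB overhead.
print(stage2_memory_mb(2048 * 1024))  # 218.0
```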

2006-09-20, 13:18   #75
Prime95
P90 years forever!

Aug 2002
Yeehaw, FL

2²·1,873 Posts

Quote:
 Originally Posted by Andi47 Is it possible that I do P-1 stage 1 only with Prime95? Does it work the same way as for ECM?
Check in the factoring forum for how to use prime95 to do stage 1 and GMP-ECM to do stage 2. If it isn't easily found, ask someone to make a sticky.

2006-11-11, 20:30   #76
James Heinrich

"James Heinrich"
May 2004
ex-Northern Ontario

3,361 Posts

I've feature-requested this elsewhere before, but it may fit even better with this multi-threaded version: similar to the PauseWhileRunning setting, I'd like to see a LowMemoryUseWhileRunning setting which would limit Prime95 to only low-memory tasks, such as P-1 stage 1, LL, and TF. I find that Prime95's CPU usage is not an issue for me (idle priority seems to work very well), but the disk thrashing when loading up a memory-hogging program (Adobe*, games, etc.) is unacceptable, and I need to either stop Prime95 completely or set the min/max memory settings to 8MB or somesuch to force Prime95 not to do any P-1 stage 2.

