mersenneforum.org > Great Internet Mersenne Prime Search > PrimeNet
Old 2011-10-23, 04:39   #716
Dubslow

Quote:
Originally Posted by Mr. P-1
That's the minimum recommended, not the minimum needed to do any stage 2 at all, right?

Given that half of all exponents never get any stage 2 at all, a machine with 300MB available (the minimum needed to get a P-1 assignment) will on average do a better P-1 than if this task were left to the LL-testing machine.
Processing only 8 relative primes is not very many, so I wouldn't expect 300MB to do a very good Stage 2, if it can do one at all. OTOH, I can't say what the average LL machine is like, but guessing at what something like curtisc might have, I don't think the average college workstation has much memory, so you're probably right.
Old 2011-10-23, 10:48   #717
James Heinrich
 

If the machine had insufficient memory to do any stage 2 at all, it would (using the M54952927 example from above) start with bounds where B1=B2, scaled to a lower overall effort than if stage 2 were being done, maintaining the balance of no stage 2 => lower factor probability => worth spending less effort:
Quote:
Optimal P-1 factoring of M54952927 using up to 100MB of memory.
Assuming no factors below 2^71 and 2 primality tests saved if a factor is found.
Optimal bounds are B1=735000, B2=735000
Chance of finding a factor is an estimated 2.42%
effort = 2.26GHz-days

Last fiddled with by James Heinrich on 2011-10-23 at 10:58
Old 2011-10-23, 14:46   #718
Mr. P-1
 

Quote:
Originally Posted by James Heinrich
If the machine had insufficient memory to do any stage 2 at all, it would (using the M54952927 example from above) start with bounds where B1=B2, scaled to a lower overall effort than if stage 2 were being done, maintaining the balance of no stage 2 => lower factor probability => worth spending less effort:
Here's what I get for an exponent TFed to 68 at various available memory levels:

Quote:
[Worker #1 Oct 23 15:05] Optimal P-1 factoring of M46221811 using up to 1050MB of memory.
[Worker #1 Oct 23 15:05] Assuming no factors below 2^68 and 2 primality tests saved if a factor is found.
[Worker #1 Oct 23 15:05] Optimal bounds are B1=555000, B2=14013750
[Worker #1 Oct 23 15:05] Chance of finding a factor is an estimated 6.34%

[Worker #1 Oct 23 15:08] Optimal P-1 factoring of M46221811 using up to 800MB of memory.
[Worker #1 Oct 23 15:08] Assuming no factors below 2^68 and 2 primality tests saved if a factor is found.
[Worker #1 Oct 23 15:08] Optimal bounds are B1=555000, B2=13597500
[Worker #1 Oct 23 15:08] Chance of finding a factor is an estimated 6.3%

[Worker #1 Oct 23 15:09] Optimal P-1 factoring of M46221811 using up to 500MB of memory.
[Worker #1 Oct 23 15:09] Assuming no factors below 2^68 and 2 primality tests saved if a factor is found.
[Worker #1 Oct 23 15:09] Optimal bounds are B1=545000, B2=12398750
[Worker #1 Oct 23 15:09] Chance of finding a factor is an estimated 6.18%

[Worker #1 Oct 23 15:11] Optimal P-1 factoring of M46221811 using up to 300MB of memory.
[Worker #1 Oct 23 15:11] Assuming no factors below 2^68 and 2 primality tests saved if a factor is found.
[Worker #1 Oct 23 15:11] Optimal bounds are B1=530000, B2=10070000
[Worker #1 Oct 23 15:11] Chance of finding a factor is an estimated 5.93%

[Worker #1 Oct 23 15:11] Optimal P-1 factoring of M46221811 using up to 200MB of memory.
[Worker #1 Oct 23 15:11] Assuming no factors below 2^68 and 2 primality tests saved if a factor is found.
[Worker #1 Oct 23 15:11] Optimal bounds are B1=510000, B2=6885000
[Worker #1 Oct 23 15:11] Chance of finding a factor is an estimated 5.5%
[Worker #1 Oct 23 15:12] Using Core2 type-3 FFT length 2560K, Pass1=640, Pass2=4K
[Worker #1 Oct 23 15:12] M46221811 stage 1 is 94.09% complete.

[Worker #1 Oct 23 15:12] Optimal P-1 factoring of M46221811 using up to 150MB of memory.
[Worker #1 Oct 23 15:12] Assuming no factors below 2^68 and 2 primality tests saved if a factor is found.
[Worker #1 Oct 23 15:12] Optimal bounds are B1=500000, B2=4750000
[Worker #1 Oct 23 15:12] Chance of finding a factor is an estimated 5.11%

[Worker #1 Oct 23 15:13] Optimal P-1 factoring of M46221811 using up to 100MB of memory.
[Worker #1 Oct 23 15:13] Assuming no factors below 2^68 and 2 primality tests saved if a factor is found.
[Worker #1 Oct 23 15:13] Optimal bounds are B1=460000, B2=2185000
[Worker #1 Oct 23 15:13] Chance of finding a factor is an estimated 4.31%

[Worker #1 Oct 23 15:18] Optimal P-1 factoring of M46221811 using up to 90MB of memory.
[Worker #1 Oct 23 15:18] Assuming no factors below 2^68 and 2 primality tests saved if a factor is found.
[Worker #1 Oct 23 15:18] Optimal bounds are B1=795000, B2=795000
[Worker #1 Oct 23 15:18] Chance of finding a factor is an estimated 3.38%
In fact stage 2 is possible on this exponent with a minimum of 92MB.

About a year ago, I tried the experiment of seeing how many relative primes were processed in each pass of stage 2 on minimal memory settings. The answer was 2 out of 8 total. The total number of passes, therefore, is 4, not the 24 it would take if there were 48 relative primes in total, or the horrendous 240 to do 480 relative primes.
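
For illustration, the number of passes is just the plan's total relative primes divided by how many can be processed per pass; here's a minimal sketch (Python; the function name is mine, not anything from mprime):

Code:
from math import ceil

def stage2_passes(total_rel_primes, rel_primes_per_pass):
    # Each pass handles a fixed number of relative primes, so the pass
    # count is just the rounded-up quotient.
    return ceil(total_rel_primes / rel_primes_per_pass)

for plan_size in (8, 48, 480):
    print(plan_size, "relative primes ->", stage2_passes(plan_size, 2), "passes")
# 8 -> 4, 48 -> 24, 480 -> 240, matching the figures above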

This experiment was done on a previous version of mprime. I'll try to catch this exponent just before the end of its stage 1, take a copy of the save file, then complete stage 1 with 92M available in order to repeat the experiment.

Last fiddled with by Mr. P-1 on 2011-10-23 at 14:47
Old 2011-10-23, 15:35   #719
Mr. P-1
 

Quote:
Originally Posted by Dubslow
Processing only 8 relative primes is not very many, so I wouldn't expect 300MB to do a very good Stage 2, if it can do one at all.
As you can see from my previous post, a 300M P-1 on a 46M exponent isn't too bad, and it wouldn't be that much worse on a 55M exponent. The Stage 2 code has a variety of plans to choose from, and some of these are optimised for memory-restricted scenarios. More is better, but less isn't necessarily terrible.

Quote:
OTOH, I can't say what the average LL machine is like, but guessing at what something like curtisc might have, I don't think the average college workstation has much memory, so you're probably right.
It's not the amount of memory the workstation has that matters, but the amount the client is configured to use. I would imagine that for most institutions that allow the client to be installed, minimizing the impact on performance is a priority.

We don't have to guess, however. We can see for ourselves.
Old 2011-10-23, 16:09   #720
garo
 

Thanks for those numbers, Mr. P-1. I would say that even 200MB would give you a decent P-1. After 300MB the marginal gains really do shrink.
Old 2011-10-23, 16:34   #721
fivemack

This is interesting data; it would be even more interesting if we could see the 'effort=' lines that James posted for the different memory settings that Mr. P-1 posted.

Code:
100M: 2.42% from 2.26 GHz-days = one factor per 93.39 GHz-days
300M: 4.19% from 2.76 GHz-days = one factor per 65.87 GHz-days
10000M: 4.74% from 3.92 GHz-days = one factor per 82.70 GHz-days
so it would be interesting to see what the percentage and effort marks were for (say) 200M, 400M, 600M.
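
For reference, the "one factor per N GHz-days" figures above are just effort divided by the estimated success probability; a minimal sketch (Python, using the probabilities and efforts quoted above):

Code:
# "one factor per N GHz-days" = effort / probability of finding a factor
cases = {"100M": (0.0242, 2.26), "300M": (0.0419, 2.76), "10000M": (0.0474, 3.92)}
for mem, (prob, effort) in cases.items():
    print(f"{mem}: one factor per {effort / prob:.2f} GHz-days")
# prints 93.39, 65.87 and 82.70 GHz-days respectively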

Is it possible to do ECM on these large exponents?
Old 2011-10-23, 17:41   #722
Mr. P-1
 

Quote:
Originally Posted by Mr. P-1
Given that half of all exponents never get any stage 2 at all...
That statement used to be true. Now that the LL wavefront has reached the point where P-1 assignments first started, I'm not sure it still is.

What certainly is still true is that, of those assignments going to LL machines without having been pre-P-1ed, no more than about half are getting any stage 2.
Old 2011-10-23, 17:51   #723
Mr. P-1
 

Quote:
Originally Posted by fivemack
This is interesting data; it would be even more interesting if we could see the 'effort=' lines that James posted for the different memory settings that Mr. P-1 posted.

Code:
100M: 2.42% from 2.26 GHz-days = one factor per 93.39 GHz-days
300M: 4.19% from 2.76 GHz-days = one factor per 65.87 GHz-days
10000M: 4.74% from 3.92 GHz-days = one factor per 82.70 GHz-days
The relevant metric here, surely, is not "factors per GHz-Days" but "expected GHz-Days saved by running this P-1".
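
To make that concrete, here is a minimal sketch of the metric (Python). No LL-test cost is quoted in this thread, so rather than assume one, the example just computes the break-even LL cost for James's 100MB figures:

Code:
# expected GHz-days saved = P(factor) * tests_saved * LL_cost - P-1 effort
def expected_saving(p_factor, p1_effort, ll_cost_ghz_days, tests_saved=2):
    return p_factor * tests_saved * ll_cost_ghz_days - p1_effort

# James's 100MB example (2.42% chance for 2.26 GHz-days of effort) breaks even
# when 0.0242 * 2 * LL_cost = 2.26, i.e. an LL cost of about 46.7 GHz-days.
print(2.26 / (0.0242 * 2))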

Quote:
Is it possible to do ECM on these large exponents?
I don't see why not. My understanding is that the reason we don't do ECM (or P+1) isn't that we can't, but that it's not cost-efficient.
Old 2011-10-23, 18:22   #724
Mr. P-1
 

Quote:
Originally Posted by garo
Thanks for those numbers, Mr. P-1. I would say that even 200MB would give you a decent P-1. After 300MB the marginal gains really do shrink.
The percentage success rate is not the whole story. You also need to take into account the running time. For example, I would guess that B1=460000, B2=2185000 is cheaper to run, even with just 100MB, than B1=B2=795000.
Old 2011-10-23, 21:29   #725
Mr. P-1
 

Quote:
Originally Posted by Mr. P-1
This experiment was done on a previous version of mprime. I'll try to catch this exponent just before the end of its stage 1, take a copy of the save file, then complete stage 1 with 92M available in order to repeat the experiment.
This is interesting: with as little as 92MB available, mprime chooses bounds with B2 > B1. However, when it comes to actually doing stage 2, it declares "other threads are using lots of memory now" and moves on to the next assignment.

I can't get it to start stage 2 with any less than 112MB. With 112MB, it uses 92MB to process 1 relative prime out of 48. However, with 113MB available, it uses 112MB to process 2 relative primes out of 48. It looks to me as though there are two, possibly three, separate bugs here.

Bug 1: Presumably the intention is that stage 2 will only be run when there is sufficient memory to process 2 relative primes; however, there appears to be an off-by-one error in handling the case where the available memory is exactly enough to process 2 relative primes. It starts stage 2, but then only processes 1 relative prime.

Bug 2: When calculating optimal bounds and deciding whether or not it can do stage 2 at all, it assumes it can if it has enough memory to process only one relative prime, not two. This is a significant bug. Anyone allowing exactly 100MB will, for exponents of this size, accumulate unfinished P-1 save files without ever completing them.
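
To make the mismatch concrete, here is a purely hypothetical sketch of the two checks as described (Python). This is not prime95/mprime source; the memory model, names and thresholds are invented for illustration only:

Code:
# Hypothetical illustration only -- NOT prime95 code.
def rel_primes_that_fit(avail_mb, overhead_mb, mb_per_rel_prime):
    # invented memory model: a fixed overhead plus a cost per relative prime
    return (avail_mb - overhead_mb) // mb_per_rel_prime

def bounds_allow_stage2(avail_mb, overhead_mb, mb_per_rel_prime):
    # Bug 2 as described: the bounds calculation is satisfied if even ONE
    # relative prime fits ...
    return rel_primes_that_fit(avail_mb, overhead_mb, mb_per_rel_prime) >= 1

def stage2_will_start(avail_mb, overhead_mb, mb_per_rel_prime):
    # ... while starting stage 2 requires room for at least TWO, so memory
    # settings in between produce B2 > B1 bounds that never actually run.
    return rel_primes_that_fit(avail_mb, overhead_mb, mb_per_rel_prime) >= 2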

Possible Bug 3: In earlier versions, I'm sure I recall it choosing plans with just 8 relative primes in total. Shouldn't it have chosen such a plan here?

There are two other minor "output" bugs. When restarting stage 2, it reports that it is "using B1=560000 from the save file" rather than the optimal bounds it calculates. 560000 is the B1 bound it computed based on the generous memory allocation at the very start of its stage 1 calculation. However, stage 1 was finished with a much lower memory allocation, and consequently a much lower optimal B1, yet it never told me during stage 1 that it was using B1 from the save file.

Finally, the message "other threads are using lots of memory now" is confusing when there are no other threads running.

Linux, Prime95, v26.5, build 5. I will PM George to draw his attention to this post.

Last fiddled with by Mr. P-1 on 2011-10-23 at 21:39
Old 2011-10-23, 22:16   #726
bcp19
 

Quote:
Originally Posted by Mr. P-1
Bug 1: Presumably the intention is that stage 2 will only be run when there is sufficient memory to process 2 relative primes; however, there appears to be an off-by-one error in handling the case where the available memory is exactly enough to process 2 relative primes. It starts stage 2, but then only processes 1 relative prime.
I thought P-1 was only supposed to use 90% of available memory, which would kinda explain why at 112MB available it uses 92M, but doesn't explain the use of 112MB at 113MB available...