#232
"Richard B. Woods"
Aug 2002
Wisconsin USA
170148 Posts |
The server has usually considered any P-1 run enough to satisfy the "Has it been P-1ed?" question. At least once there was a cleaning-up of cases with ultra-low B1, such as [30,something], so that they were re-designated as not having had P-1 done.

Also, they probably didn't realize that reporting unsuccessful runs with such low limits would, given the server's behavior, prevent someone else from being assigned P-1 to the optimum limits calculated by Prime95 or mprime, thus reducing GIMPS throughput.

Last fiddled with by cheesehead on 2009-07-15 at 00:30
#233
garo
Aug 2002
Termonfeckin, IE
As Kevin said in his deleted post (yes, the chosen few can read deleted posts): "I wouldn't worry about other people's B1 and B2 limits and trust Prime95 to do its thing as long as you have at least 300MB available for P-1."

I once wrote a long monograph on P-1 for Seventeen or Bust, but the basic principles are applicable in GIMPS too: http://www.sslug.dk/~grove/sbfactor/choosing_bounds.html

Prime95 has a sophisticated algorithm to compute the P-1 bounds that looks at the exponent size, the number of tests saved if a factor is found, and the available memory. The last factor has the least influence, as long as it is above a minimum.

Note also that once P-1 has been done to "lower than optimal" limits, it is typically not worth redoing it to optimal limits: the additional chance of finding a factor is not enough to justify redoing the P-1. However, absurdly small limits such as B1=2000, B2=20000 do make a retest justifiable.

If you are interested in redoing P-1 for exponents where you think the bounds were not sufficient, I would look at the average bounds of surrounding exponents and then pick any exponent whose bounds were, say, 1/10 of the average. You can also use the excellent calculator at http://mersenne-aries.sili.net/prob.php to find the probability of finding a factor.

Last fiddled with by garo on 2009-07-16 at 09:57. Reason: Link to the mersennaries calculator
#234
"Kyle"
Feb 2005
Somewhere near M52..
3×5×61 Posts |
Last fiddled with by garo on 2009-07-16 at 10:19
#235
"Brian"
Jul 2007
The Netherlands
7×467 Posts |
I have limited the memory use by mprime to 250M because the machine is only switched on when I am using it, and I want the bulk of my 1 gigabyte for my own use. But if I increased the memory allowance for mprime slightly, would this make the server give me the much-needed P-1 work? Or is my machine unsuitable for it anyway?

Last fiddled with by Brian-E on 2009-07-15 at 15:50
#236
Account Deleted
"Tim Sorbera"
Aug 2006
San Antonio, TX USA
From Prime95's readme:

Code:
4) Factor in the information below about minimum, reasonable, and
   desirable memory amounts for some sample exponents. If you choose a
   value below the minimum, that is OK. The program will simply skip
   stage 2 of P-1 factoring.

   Exponent  Minimum  Reasonable  Desirable
   --------  -------  ----------  ---------
   20000000     40MB        80MB      120MB
   33000000     65MB       125MB      185MB
   50000000     85MB       170MB      250MB

Last fiddled with by Mini-Geek on 2009-07-15 at 16:18
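For exponents between the sample rows, a rough estimate can be read off by linear interpolation. This is only a sketch over the quoted excerpt, not anything Prime95 itself does:

```python
# Rough linear interpolation over the readme's sample table to estimate
# memory needs for an exponent between the listed rows. Sketch only;
# Prime95's real bound selection is more sophisticated.

TABLE = [  # (exponent, minimum_MB, reasonable_MB, desirable_MB)
    (20_000_000, 40, 80, 120),
    (33_000_000, 65, 125, 185),
    (50_000_000, 85, 170, 250),
]

def estimate_memory(exponent):
    """Interpolate (minimum, reasonable, desirable) MB for an exponent,
    clamping to the table's end rows outside its range."""
    if exponent <= TABLE[0][0]:
        return TABLE[0][1:]
    if exponent >= TABLE[-1][0]:
        return TABLE[-1][1:]
    for (e0, *lo), (e1, *hi) in zip(TABLE, TABLE[1:]):
        if e0 <= exponent <= e1:
            t = (exponent - e0) / (e1 - e0)
            return tuple(round(a + t * (b - a)) for a, b in zip(lo, hi))

print(estimate_memory(40_000_000))   # → (73, 144, 212)
```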
#237
|
"Brian"
Jul 2007
The Netherlands
63058 Posts |
Yes, I originally set it to 250M (two years ago) on the basis of that file. But I read in this very recent posting from garo that 300M may be required.

Also, if P-1 is never given out when you select "do what makes sense", I must ask why. I deliberately chose that preference because I want to do whatever the project most needs. If I should really change it to a preference for P-1 then I will do so, but I find that strange. Thanks for your reply.
#238
Account Deleted
"Tim Sorbera"
Aug 2006
San Antonio, TX USA
That seems quite odd to me, too. Maybe I'm mistaken; I'll look it up at PrimeNet.

http://www.mersenne.org/thresholds/

Code:
8MB:   Prime95: 3.57%  page: M50766601, factored to 68 bits, with B1=895,000 and B2=895,000
       Probability = 4.11268%  Should take about 2.75 GHz-days
250MB: Prime95: 5.98%  page: M50766601, factored to 68 bits, with B1=575,000 and B2=9,343,750
       Probability = 6.13057%  Should take about 3.24 GHz-days
300MB: Prime95: 6.19%  page: M50766601, factored to 68 bits, with B1=590,000 and B2=11,062,500
       Probability = 6.36555%  Should take about 3.57 GHz-days

Last fiddled with by Mini-Geek on 2009-07-15 at 16:58
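The figures above make the diminishing returns easy to quantify: the added probability of finding a factor per added GHz-day of work. A minimal sketch using only the quoted numbers:

```python
# Marginal value of extra P-1 memory, using the mersenne-aries page
# estimates quoted above for M50766601: extra probability (percentage
# points) of finding a factor per extra GHz-day of effort.

settings = [  # (memory_MB, probability_percent, cost_GHz_days)
    (8,   4.11268, 2.75),
    (250, 6.13057, 3.24),
    (300, 6.36555, 3.57),
]

def gain_per_ghz_day(a, b):
    """Percentage points of probability gained per extra GHz-day."""
    return (b[1] - a[1]) / (b[2] - a[2])

for a, b in zip(settings, settings[1:]):
    print(f"{a[0]}MB -> {b[0]}MB: "
          f"{gain_per_ghz_day(a, b):.2f} %-points per GHz-day")
```

The jump from 8MB to 250MB (enabling a real stage 2) is worth far more per GHz-day than the further step from 250MB to 300MB.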
#239
|
P90 years forever!
Aug 2002
Yeehaw, FL
7,537 Posts |
IMO, if you enjoy finding factors, do P-1 on small exponents. Choose B2 as roughly 20*B1 so that it spends an equal amount of time in stage 1 and stage 2.

I'd say that P-1 and double-checking are both short-handed. TF definitely has too many CPUs. "Do what makes the most sense" allows me to change the server's rules for handing out assignments, which I may do someday. At present, I'm inclined to let first-time LL testers do the P-1 testing that those dedicated solely to P-1 don't get to. Yeah, the LL testers may not have enough memory to run stage 2, so we won't find quite as many factors.

Another choice would be to divert "do what makes the most sense" machines with lots of memory to P-1 half-time or full-time -- and I think P-1 would still fall behind the LL testers. I'd probably define "lots of memory" as 400 or 500 MB/core.
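The B2 rule of thumb above is simple arithmetic; as a trivial illustration (the B1 value here is just an example, not a recommendation):

```python
# Rule of thumb from the post above: pick B2 ~ 20 * B1 so that stage 1
# and stage 2 take comparable time. The B1 value is illustrative only.

def suggest_bounds(b1):
    """Return (B1, B2) with B2 = 20 * B1."""
    return b1, 20 * b1

b1, b2 = suggest_bounds(500_000)
print(f"B1={b1:,}, B2={b2:,}")   # → B1=500,000, B2=10,000,000
```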
#240
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
At first blush it appears that P-1 is keeping up with LL ... but it is NOT.

From April 27 to July 8 I simply counted ALL LL and P-1 attempts:

Code:
             P1        LL
27-Apr   56,573   792,600
 8-Jul   97,712   815,864
Diff.    41,139    23,264

P1 - LL  17,875

There are at least a couple of people doing P1-S (small exponents) that account for, in my estimation, at least 27,000 attempts. I am counting those people whose points per attempt is significantly below the expected: there are over 27,000 attempts from people averaging less than 0.25 points per attempt, and one person alone has 20,568.

And yes, some of the LL are in the low-end clean-up range too (25-33M), but only a couple of thousand in that same time period. Some of these 27,000 might be P-1 to very low B1/B2, but the analysis suggests that, by far, the majority are P-1 on small exponents.

SO ... THE PROJECT STILL NEEDS MORE P-1 IN THE CURRENT FIRST-TIME LL RANGE.

Last fiddled with by petrw1 on 2009-07-15 at 17:46
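The counting argument above is easy to reproduce. The two snapshot totals are the ones quoted in the post; the per-user stats are hypothetical, included only to show the points-per-attempt filter:

```python
# Sketch of the analysis above: difference two snapshots of attempt
# totals, then flag users whose points-per-attempt is below a cutoff
# (0.25, as in the post) and are therefore likely doing P-1 on small
# exponents. Snapshot totals are from the post; user data is made up.

snap_apr = {"P1": 56_573, "LL": 792_600}
snap_jul = {"P1": 97_712, "LL": 815_864}

diff = {k: snap_jul[k] - snap_apr[k] for k in snap_apr}
print(diff)                       # → {'P1': 41139, 'LL': 23264}
print(diff["P1"] - diff["LL"])    # → 17875

# Hypothetical per-user stats: (attempts, points).
users = {"userA": (20_568, 3_100), "userB": (1_200, 5_400)}
suspect = [u for u, (att, pts) in users.items() if pts / att < 0.25]
print(suspect)                    # → ['userA']
```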
#241
|
"Mark"
Feb 2003
Sydney
23D16 Posts |
#242
"Kyle"
Feb 2005
Somewhere near M52..
3×5×61 Posts |