[QUOTE=c10ck3r;288587]is there a GPU program for P-1, or is it still relegated to CPUs with loads of RAM?[/quote]Currently limited to CPUs. A GPU-based P-1 program is very desirable, but none exists yet. Perhaps this will be the year one shows up (e.g. [i]flashjh[/i] mentioned an interest in working on the problem a few posts up).
[QUOTE=c10ck3r;288587]On a related note, is it possible to get P-1 with B1 less than 75000?[/QUOTE]A P-1 can be done with more-or-less any bounds, but that doesn't mean it's a good idea. P-1 assignments in the current range (around 50M) will have bounds on the order of B1=500,000; B2=10,000,000, which is a far cry from B1=75,000. Why that particular number for B1? It would have been selected as a typical B1 when working on assignments in the 5M range.
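For anyone wondering what stage 1 of P-1 actually computes, here's a rough sketch in Python (textbook version, for illustration only; the function and helper names are mine, and Prime95's real implementation is enormously more optimized). It finds a factor q of M = 2^p - 1 whenever q-1 is B1-smooth; since every factor of a Mersenne number has the form q = 2kp + 1, the factor 2p of q-1 comes for free:

```python
# A minimal sketch of P-1 stage 1 on a Mersenne number M = 2^p - 1.
# Textbook algorithm for illustration only -- Prime95's implementation
# is vastly more optimized.
from math import gcd

def primes_up_to(n):
    # naive trial-division prime list; fine for tiny B1
    return [m for m in range(2, n + 1)
            if all(m % d for d in range(2, int(m ** 0.5) + 1))]

def pminus1_stage1(p, B1):
    M = (1 << p) - 1
    a = pow(3, 2 * p, M)           # bake in the guaranteed 2p | q-1
    for q in primes_up_to(B1):
        e = q
        while e * q <= B1:         # largest power of q not exceeding B1
            e *= q
        a = pow(a, e, M)
    g = gcd(a - 1, M)
    return g if 1 < g < M else None

# M29 = 233 * 1103 * 2089. Since 233 - 1 = 2^3 * 29 is extremely
# smooth, even a tiny B1 = 5 exposes that factor:
print(pminus1_stage1(29, 5))       # → 233
```

The choice of B1 is exactly the trade-off being discussed: a larger B1 means a longer stage-1 exponentiation but covers factors q with less-smooth q-1.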
Chalsall,
I was wondering, seeing as we are starting to 'knock on the door' of DC exponents to take to ^70: is it possible to set up a check box for 'default' limits, so that any ^69 exponents are picked up as ^69 and the ^70 ones as ^70?
[QUOTE=KyleAskine;288581]So much for trying to fight for #3.
One of my PC's (w/ HD5870) was crashed the entire weekend, so I lost around half of my throughput since the last time I updated. :no:[/QUOTE] Did you lose your results? If you need data recovery, maybe I can help.
Bad Rationale
[QUOTE=James Heinrich;288588] P-1 assignments in the current range (around 50M) will have bounds in the order of roughly B1=500,000; B2=10,000,000 which is a far cry from B1=75,000. Why that particular number for B1? That would've been selected as a typical B1 when working on assignments in the 5M range.[/QUOTE]
I was looking to do short-term P-1 assignments in, say, the 1XXM exponent range, with my... less-than-desirable memory (2GB installed). Next question: would running B1=B2 get rid of stage 2, thus not being so RAM-intensive? If so, what would the ideal B1 be for the 50M range or so?
B1=B2 means no stage 2, yes. On the other hand, depending on what OS/programs you're running, if you had only one P-1 worker and gave it something like 1200-1500MB, that would be more than enough to do a decent P-1 (much better than B1=B2, even with an increased B1).
It is my personal opinion that B1=B2 runs are not worth it. If GIMPS somehow ever pulls ahead of the LL wave with P-1 (yeah, right), the next thing I'm doing is going back and redoing all of those.
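For reference, the syntax here is recalled from memory, so double-check it against Prime95's own documentation (undoc.txt/whatsnew.txt) before use: a manual P-1 assignment in worktodo.txt takes the form Pminus1=k,b,n,c,B1,B2, so forcing B1=B2 (no stage 2) on a 50M exponent would look roughly like:

```text
Pminus1=1,2,50000000,-1,500000,500000
```

while the stage-2 memory cap lives in local.txt as a line like `Memory=1500` (in MB), which is the knob being discussed above.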
[QUOTE=c10ck3r;288596]I was looking to do short-term P-1 assignments in, say, the 1XXM exponent range, with my...less than desirable memory (2GB installed).[/quote]If I may suggest: don't. If you do it with poor bounds, there's a reasonable chance that someone will either a) waste time running L-L on an exponent that should already have had a factor found; or b) waste time re-running the P-1 with proper bounds. Running P-1 with minimal memory is one thing at the current wavefront, where it will inevitably happen; running it years ahead of the wavefront on knowingly suboptimal hardware is unwise.
Running [URL=http://mersenne-aries.sili.net/prob.php?exponent=100000000&guess_saved_tests=2]P-1 on M100,000,000[/URL] should require a minimum of around 900MB of RAM allocated to Prime95 (though it would prefer 2GB-20GB).[quote]Next question: would running B1=B2 get rid of stage 2, thus not being so RAM-intensive?[/quote]Yes. It will also drop the factor probability from ~5% to ~3% for the same runtime effort (meaning your P-1 factoring is only 60% as efficient as letting it run stage 2).[quote]If so, what would the ideal B1 be for the 50M range or so?[/quote]Whatever Prime95 picks with as much RAM as you can let it have. (Seriously, it's a complex iterative calculation to balance probability vs. effort breakevens.) Even 500MB is acceptable for a P-1 in the 50M range, and highly preferable to a B1=B2 run. For example, 50M with 500MB allocated gets you [URL=http://mersenne-aries.sili.net/prob.php?exponent=50000000&guess_saved_tests=2&factorbits=72]4.21% at 3.37GHz-days[/URL], but 50M with B1=B2 gets you either [URL=http://mersenne-aries.sili.net/prob.php?exponent=50000000&work=3.371327&factorbits=72&b1only=1]2.63% at 3.37GHz-days[/URL] (same effort, lower probability) or [URL=http://mersenne-aries.sili.net/prob.php?exponent=50000000&prob=4.212662&factorbits=72&b1only=1]4.21% at 13.80GHz-days[/URL] (same probability, much higher effort).
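To make concrete what stage 2 buys you: stage 2 catches a factor q whose q-1 is B1-smooth apart from a single prime s with B1 < s <= B2, which is why skipping it (B1=B2) costs so much probability. Here's a self-contained toy sketch in Python (textbook algorithm with names of my own choosing; Prime95's real stage 2 uses far more memory- and time-efficient techniques such as prime pairing, which is exactly where the RAM goes):

```python
# Toy P-1 with a naive stage 2 on M = 2^p - 1, illustration only.
# Stage 2 catches a factor q whose q-1 is B1-smooth except for ONE
# prime s in (B1, B2].
from math import gcd

def primes_in(lo, hi):
    return [n for n in range(max(lo, 2), hi + 1)
            if all(n % d for d in range(2, int(n ** 0.5) + 1))]

def pminus1(p, B1, B2):
    M = (1 << p) - 1
    # --- stage 1: exponentiate by all prime powers <= B1 (and 2p) ---
    a = pow(3, 2 * p, M)
    for q in primes_in(2, B1):
        e = q
        while e * q <= B1:
            e *= q
        a = pow(a, e, M)
    g = gcd(a - 1, M)
    if 1 < g < M:
        return g, "stage 1"
    # --- stage 2: accumulate (a^s - 1) over primes s in (B1, B2],
    #     then take a single gcd at the end ---
    acc = 1
    for s in primes_in(B1 + 1, B2):
        acc = acc * (pow(a, s, M) - 1) % M
    g = gcd(acc, M)
    return (g, "stage 2") if 1 < g < M else (None, None)

# M29 = 233 * 1103 * 2089. With B1=3, stage 1 finds nothing (e.g.
# 233 - 1 = 2^3 * 29 needs a 2^3 the tiny stage-1 exponent lacks), but
# 1103 - 1 = 2 * 19 * 29 is B1-smooth except for the single prime 19,
# so stage 2 with B2=20 sweeps it up:
print(pminus1(29, 3, 20))
```

The same structure explains the numbers above: for the same runtime, letting stage 2 run covers that whole extra class of factors, which is where the ~5% vs. ~3% difference comes from.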
[QUOTE=KyleAskine;288581]So much for trying to fight for #3.
One of my PC's (w/ HD5870) was crashed the entire weekend, so I lost around half of my throughput since the last time I updated. :no:[/QUOTE] I offer sympathy for unexpected occurrences. Just remember, This Too Shall Pass, in the long run. This race (if you want to call it that) is far from over.
[QUOTE=James Heinrich;288601]
For example: 50M with 500MB allocated gets you [URL="http://mersenne-aries.sili.net/prob.php?exponent=50000000&guess_saved_tests=2&factorbits=72"]4.21% at 3.37GHz-days[/URL]. but 50M with B1=B2 gets you [URL="http://mersenne-aries.sili.net/prob.php?exponent=50000000&work=3.371327&factorbits=72&b1only=1"]2.63% at 3.37GHz-days[/URL] (same effort, lower probability) or 50M with B1=B2 gets you [URL="http://mersenne-aries.sili.net/prob.php?exponent=50000000&prob=4.212662&factorbits=72&b1only=1"]4.21% at 13.80GHz-days[/URL] (same probability, much higher effort)[/QUOTE] I don't understand the last one? |
[QUOTE=flashjh;288606]I don't understand the last one?[/QUOTE]
I don't understand your question? |
[QUOTE=Dubslow;288608]I don't understand your question?[/QUOTE]
I figured it out... I wasn't looking at the links. B1=B2 with two different results, but the links explain it.
[QUOTE=oswald;288591]Did you lose your results? If you need data recovery, maybe I can help.[/QUOTE]
Oh no, it just hard-locked (as Linux tends to do when the video driver explodes). All hardware is fine. I got a touch too aggressive with my O/C, and I didn't watch it long enough to see if it was stable.