I've been thinking about doing P-1 on a large exponent sometime. Does anyone know what the current Prime95's upper limit is?
|
Interesting; I will consider running a high P-1 one of these days, perhaps. It seems as though we (as a project) are lacking regular P-1 though, so maybe I will just continue onward in the 53M or wherever else I get assigned. My new hex-core machine has two cores doing P-1, my laptop has one core on P-1, and my old P3 1.4GHz is doing P-1 with one CPU because it has ECC RAM. They are all doing default sizes; the hex-core takes roughly 2 days to finish one, my laptop takes a week or two, and my old machine takes about a month.
|
[QUOTE=lindee;240532]I've been thinking about doing P-1 on a large exponent sometime. Does anyone know what the current Prime95's upper limit is?[/QUOTE]
Upper limit of the exponent? George should know. |
Also, at that upper limit, be aware you'll need a bucketful of RAM and a lot of patience to process a single assignment. Last data I saw said that the upper limit for 32M FFTs is 596000000. According to [url=http://mersenne-aries.sili.net/prob.php?guess=n&exponent=596000000]my tools' calculations[/url]:[quote]M596000000, factored to 80 bits, with B1=5,505,000 and B2=137,625,000
Probability = 5.39852% Should take about 508.08 GHz-days[/quote]Assuming a modern desktop can give roughly 4GHz-days/day, it should take 4-5 months. Having 8+GB of RAM assigned would probably be desirable. |
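To make the runtime estimate in the post above concrete, here is a quick back-of-the-envelope check in Python. The 508.08 GHz-day figure is the quoted calculator output, and 4 GHz-days/day is the post's assumed desktop throughput; both are taken from the text, not measured.

```python
# Rough sanity check of the P-1 runtime estimate quoted above.
effort_ghz_days = 508.08              # from the calculator output
throughput_ghz_days_per_day = 4.0     # assumed modern-desktop rate

days = effort_ghz_days / throughput_ghz_days_per_day
months = days / 30.44                 # average days per calendar month

print(f"~{days:.0f} days, about {months:.1f} months")  # ~127 days, about 4.2 months
```

That lands squarely in the "4-5 months" range claimed in the post.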
[QUOTE=James Heinrich;240626]Last data I saw said that the upper limit for 32M FFTs is 596000000.[/quote]
That's right, the current maximum exponent that PrimeNet can handle is 596 million (or, strictly speaking, max{p | p is prime and p < 596e6}). Note that such an exponent should first be trial factored to 79 bits, then P-1'ed, and then trial factored to 80 bits. You'd get just shy of 411 GHz-days for carrying the exponent from 64 bits to 80 bits. The P-1 would be another 508 GHz-days. But if you got lucky, you'd be saving 2 LL tests, each of which would cost 16,440 GHz-days, or a total of 90 [B]GHz-years[/B] for both! Maybe the GPU folk can chime in - how long would a test of this size take on the newest, fastest GPUs? |
[QUOTE=NBtarheel_33;240660]That's right, the current maximum exponent that PrimeNet can handle is 596 million (or, strictly speaking, max{p | p is prime and p < 596e6}).
Note that such an exponent should first be trial factored to 79 bits, then P-1'ed, and then trial factored to 80 bits. You'd get just shy of 411 GHz-days for carrying the exponent from 64 bits to 80 bits. The P-1 would be another 508 GHz-days. But if you got lucky, you'd be saving 2 LL tests, each of which would cost 16,440 GHz-days, or a total of 90 [B]GHz-years[/B] for both! Maybe the GPU folk can chime in - how long would a test of this size take on the newest, fastest GPUs?[/QUOTE] Working with msft program on a GTX275, exponent 124M: I'm getting 39 msec/iter I read that performance on GTX480 is 2.2x higher. The downsize is that you have to choose a base-2 FFT, so try and take an exponent just before the limit of FFT cutoff. Luigi |
Thanks, the calculator was one of the things I was looking for.
The TF for an exponent close to 596000000 shouldn't take too long, maybe a little over a week at most. I have a GTX 460 and I have been getting about 60 GHz-days per day when factoring to higher bit levels. M960000011 from 78 to 79 took about 25 hours. I've never done any other work types on GPU. The P-1 sounds doable since I have some good machines available. I'll probably do some smaller assignments first. |
[QUOTE=KingKurly;240519]There is one person who has 320.235 GHz-days but only two assignments completed, and there are two people who have 283.642 GHz-days and still only two assignments completed. How large are these numbers that they're P-1ing that they are getting *SO* much credit per assignment?
< snip > Any thoughts, ideas, opinions, insight? Thanks in advance.[/QUOTE]Perhaps a bug in P-1 crediting that I encountered might still be present? (I don't recall a specific notice of its fix.) I did some P-1 in stages, such as B1=100000, then B1=200000, then B1=300000, ... all on the same exponent, each time starting from where the previous run had stopped. When I reported all those results, I was credited for each one as though I had restarted each run from 0, instead of from where the preceding run had stopped. If there were N such equal successive intervals run on a number, then my given credit was approximately proportional to N*(N+1)/2 instead of just N for that exponent. It put me in the Top Producers for a while. Perhaps those assignments were performed, and reported, in a similar manner, on not-so-large exponents? If there were 10 intervals, the buggy credit would be about 5 times correct credit. For 50 intervals, buggy credit would be about 25 times correct credit. |
I just re-P-1'd a dozen exponents in the 44-45M range that had very low B1/B2 and found 1 factor:
[CODE]Exponent  TF  B1    B2
45300037  68  1024  16384
45300103  68  1024  16384
45300169  68  1024  16384
45300209  68  1024  16384 --- 11350339302653390456633
45300341  68  1024  16384
45300361  68  1024  16384
45300373  68  1024  16384
45300413  68  1024  16384
44000009  69  2048  32768
44000111  69  2048  32768
44000113  69  2048  32768
44000197  68  2048  32768[/CODE] |
[QUOTE=petrw1;247000]I just re-P-1'd a dozen exponents in the 44-45M range that had very low B1/B2[/QUOTE]How do you query PrimeNet to find exponents that have had P-1 on certain bounds, like this?
|
[URL]http://www.mersenne.org/report_factoring_effort/[/URL]
|