mersenneforum.org > Data > Let's Optimize P-1 for low exponents. TL;DR in post #1. More in posts 60 and 61.

2022-01-10, 23:54   #23
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006
Saskatchewan, Canada

3·11·157 Posts

Possible B1 values

If I choose 1M as the desired B1 for exponents at 20M and use George's suggested rule "if the exponent is halved, increase B1 by a factor of 2.2", I get the following table. Column 3 (B1-Neat) rounds B1 to the nearest 100K.

Code:
Exponent      B1             B1-Neat
78125         548,758,735    548,800,000
156250        249,435,789    249,400,000
312500        113,379,904    113,400,000
500000         66,427,649     66,400,000
625000         51,536,320     51,500,000
750000         41,883,644     41,900,000
1000000        30,194,386     30,200,000
1250000        23,425,600     23,400,000
1500000        19,038,020     19,000,000
2000000        13,724,721     13,700,000
2500000        10,648,000     10,600,000
3000000         8,653,645      8,700,000
4000000         6,238,510      6,200,000
5000000         4,840,000      4,800,000
6000000         3,933,475      3,900,000
7000000         3,300,838      3,300,000
8000000         2,835,686      2,800,000
9000000         2,480,116      2,500,000
10000000        2,200,000      2,200,000
11000000        1,973,960      2,000,000
12000000        1,787,943      1,800,000
13000000        1,632,344      1,600,000
14000000        1,500,381      1,500,000
15000000        1,387,133      1,400,000
16000000        1,288,948      1,300,000
17000000        1,203,057      1,200,000
18000000        1,127,325      1,100,000
19000000        1,060,082      1,100,000
20000000        1,000,000      1,000,000
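For reference, the "multiply B1 by 2.2 per halving of the exponent" rule can be sketched in a few lines of Python. This is a hypothetical helper, not part of any GIMPS software; the 1M-at-20M anchor and the 2.2 factor are taken from the post above:

```python
import math

def b1_for_exponent(p, ref_p=20_000_000, ref_b1=1_000_000, factor=2.2):
    """Scale B1 from a reference point: each halving of the
    exponent multiplies B1 by `factor` (2.2, per George's rule)."""
    halvings = math.log2(ref_p / p)
    return ref_b1 * factor ** halvings

def b1_neat(b1):
    """Round to the nearest 100K, as in the table's third column."""
    return round(b1 / 100_000) * 100_000

print(round(b1_for_exponent(10_000_000)))  # one halving from 20M
print(b1_neat(b1_for_exponent(78_125)))    # eight halvings from 20M
```

Running it against the table reproduces the listed values (e.g. 2.2M at 10M, 548.8M at 78125).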
2022-01-11, 00:28   #24
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006
Saskatchewan, Canada

3·11·157 Posts

George is much more qualified for this point but I'll start. Using this ... which might NOT be accurate for v30.8, and based on my PC's various RAM, I get these specs:

My PC with 24.5GB RAM, for P-1 with B1=1M/B2=328M, gets a 5.74% success rate for 17.66 GHz-days.
My PC with 12GB RAM (about half) needs B1=1.3M/B2=260M for 14.45 GHz-days to get the same success rate.
My PC with 6.5GB RAM (about half again) needs B1=1.8M/B2=200M for 11.82 GHz-days to get the same success rate.

The actual GHz-days rewarded are about 15% higher than the above. Does this seem somewhat reasonable?
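A quick arithmetic comparison of the three configurations quoted above, in GHz-days of effort per expected factor. This is just my own arithmetic on the quoted numbers (and it ignores the ~15% credit bonus and wall-clock time), not anything from Prime95 itself:

```python
# Cost per expected factor for the three RAM configurations quoted
# above (same 5.74% success rate in each case).
configs = [
    # (RAM, B1, B2, success rate, GHz-days of effort)
    ("24.5GB", 1_000_000, 328_000_000, 0.0574, 17.66),
    ("12GB",   1_300_000, 260_000_000, 0.0574, 14.45),
    ("6.5GB",  1_800_000, 200_000_000, 0.0574, 11.82),
]
for ram, b1, b2, rate, ghz_days in configs:
    print(f"{ram}: {ghz_days / rate:.0f} GHz-days per expected factor")
```

By this crude measure the lower-RAM settings cost fewer GHz-days per expected factor, at the price of a higher B1 and lower B2.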
2022-01-11, 00:36   #25
chalsall
If I May

"Chris Halsall"
Sep 2002

7×1,493 Posts

Quote:
 Originally Posted by petrw1 George is much more qualified for this point but I'll start. [SNIP] Does this seem somewhat reasonable?
Please forgive me for this James, but...

There was a time a while ago when we were working "blind". When the GPUs turned TF'ing upside-down. James managed to inform the discussion as to what was "sane". This was based on empirical data and peer-reviewed analysis.

Just putting that out there...

2022-01-11, 00:45   #26
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006

1010000111101₂ Posts

Quote:
 Originally Posted by chalsall Please forgive me for this James, but... There was a time a while ago when we were working "blind". When the GPUs turned TF'ing upside-down. James managed to inform the discussion as to what was "sane". This was based on empirical data and peer-reviewed analysis. Just putting that out there...
Sorry, I mentioned George as the man behind 30.8; I didn't mean to exclude or diminish the collective genius of others.

Empirical data and peer reviews greatly appreciated.

2022-01-11, 01:07   #27
VBCurtis

"Curtis"
Feb 2005
Riverside, CA

5,279 Posts

My view: Don't bother re-doing P-1 unless you're adding a zero to the already-done B1 bound.

But since a couple of folks seem to want to do another project after under-20k is done: how about, right now, doing bigger P-1 than necessary to clear the ranges left? Those who are doing near the minimum B1 to clear the number of factors needed are, in a sense, fouling those exponents for future P-1. Better to do it once, do it right: go deep enough on P-1 now that nobody would "ever" want to re-do it.

I've been helping Masser on the hard-to-finish ranges he chooses. In 8.6M, I used B1 around 8M; now in 17.7M I'm doing 4M/4G for bounds. It's not the most efficient path to finding factors, but in the case where some of you keep doing P-1 after the project finishes, it's hard to set an optimal set of bounds now, since there is a "next project" of some sort coming.

A simpler "next project" is under-200 for each 0.01M range. That'll require interestingly-large P-1 bounds in some ranges, and just a few factors in others. As with the current project, that variety will attract a wider set of "we like useless projects with clear goals" users than setting some arbitrary bounds for P-1 and TF work. The downside is that some ranges are nigh impossible... for now.

People with 64GB+ machines might consider doing some big-bound P-1 near the first-time wavefront; that would actually help the overall mersenne-searching project.

Last fiddled with by VBCurtis on 2022-01-11 at 01:07
2022-01-11, 01:07   #28
chalsall
If I May

"Chris Halsall"
Sep 2002

7×1,493 Posts

Quote:
 Originally Posted by petrw1 I didn't mean to exclude or diminish the collective genius of others.
I don't think any slight could have possibly been interpreted... We're not Snowflakes around these-here-parts. Possibly Snowbirds, in some cases...

My post was more along the lines of "I know this is an ask... But..."

2022-01-11, 01:22   #29
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006

3·11·157 Posts

Quote:
 Originally Posted by VBCurtis My view: Don't bother re-doing P-1 unless you're adding a zero to the already-done B1 bound.
I think that would be the net result; we would only redo those that are significantly lower than the "chosen" bounds.

Quote:
 But since a couple of folks seem to want to do another project after under-20k is done: How about, right now, doing bigger P-1 than necessary to clear the ranges left? Those who are doing near the minimum B1 to clear the number of factors needed are, in a sense, fouling those exponents from future P-1. Better to do it once, do it right- go deep enough on P-1 now that nobody would "ever" want to re-do it. I've been helping Masser on the hard-to-finish ranges he chooses. In 8.6M, I used B1 around 8M; now in 17.7M I'm doing 4M/4G for bounds. It's not the most efficient path to finding factors, but in the case where some of you keep doing P-1 after the project finishes it's hard to set an optimal set of bounds now since there is a "next project" of some sort coming.
I agree. That is my intent, going forward, for the ranges I clear. There are others as well using bounds much higher than necessary for under-2000, which will benefit this next project... if it takes off.

Quote:
 A simpler "next project" is under-200 for each 0.01M range. That'll require interestingly-large P-1 bounds in some ranges, and just a few factors in others. As with the current project, that variety will attract a wider set of "we like useless projects with clear goals" users than setting some arbitrary bounds for P-1 and TF work. The downside is that some ranges are nigh impossible... for now.
This has been suggested twice before. The rub is your last statement, "nigh impossible". Being a little OCD, I get annoyed by impossible projects.

Quote:
 People with 64GB+ machines might consider doing some big-bound P-1 near the first-time wavefront- that would actually help the overall mersenne-searching project.
Fair point.

2022-01-11, 03:08   #30
VBCurtis

"Curtis"
Feb 2005
Riverside, CA

5,279 Posts

Quote:
 Originally Posted by petrw1 This has been suggested twice before. The rub is your last statement "nigh impossible". Being a little OCD I get annoyed by impossible projects.
This forum is full of impossible projects: most of CRUS, for instance. I maintain that a concrete goal with a variety of ways to move toward that goal is a worthy project, even if the actual finish line is far away; contrasted with "let's do more P-1 because the new software is snazzy," a task that, while finite in length, is rather arbitrary.

Finding factors for 20k is fun, just like finding primes for CRUS is fun. The fact that the finish line is rather over the horizon shouldn't be a deal killer; see Riesel-base-3 as an example in that other project. :)

2022-01-11, 03:48   #31
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006

3·11·157 Posts

Quote:
 Originally Posted by VBCurtis This forum is full of impossible projects- most of CRUS, for instance. I maintain that a concrete goal with a variety of ways to move toward that goal is a worthy project, even if the actual finish line is far away; contrasted with "let's do more P-1 because the new software is snazzy," a task that while finite in length is rather arbitrary. Finding factors for 20k is fun, just like finding primes for CRUS is fun. The fact that the finish line is rather over the horizon shouldn't be a deal killer; see Riesel-base-3 as an example in that other project. :)
Well, nothing has been decided yet; certainly the best project is the one that attracts the most interest.

Or it could be a hybrid, as in: focus the deep P-1 first on ranges with over 199 to go.

2022-01-11, 04:40   #32
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006

3·11·157 Posts

Quote:
 Originally Posted by VBCurtis The fact that the finish line is rather over the horizon shouldn't be a deal killer;
This is another consideration: potentially a very distant horizon.

When I analyzed the under 2000 project I could see that with very few exceptions the upper limit was about 60M.
I'm not sure under 200 will have an upper limit; quite possibly there will be some ranges right to 999M.

Right now mersenne.ca does not break down ranges over 100M under 0.1M (i.e. the under 2000 ranges).
But looking at the 99.9M ranges for over 200, there are 21 ranges to go out of 100, and several have more than 20 factors required... that could be 10 TF levels or more.

That said, if the reason for another project like this is to take advantage of v30.8's P-1 power, we would need to set an upper limit: the point where v30.8 loses its luster (unless you give it a LLLLOTTTT of RAM).
As well, as the exponents get larger, TF becomes more efficient and P-1 less efficient, even with v30.8.
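As a rough sanity check on the "more than 20 factors ... could be 10 TF levels" estimate: a common GIMPS heuristic says M_p has a factor in [2^b, 2^(b+1)) with probability about 1/b. That is folklore, and it ignores factors already removed by earlier TF and P-1, so it overestimates for well-worked ranges; the helper and the example numbers below are mine, not from the thread:

```python
def expected_tf_factors(num_candidates, start_bits, levels):
    """Expected factors from trial factoring `num_candidates`
    unfactored exponents upward from 2^start_bits through `levels`
    bit levels, using the rough 1/b chance-per-bit-level heuristic."""
    return sum(num_candidates / b for b in range(start_bits, start_bits + levels))

# e.g. 250 unfactored exponents, TF from 2^76 through 2^85 (10 levels):
print(expected_tf_factors(250, 76, 10))
```

So ten deep TF levels over a few hundred candidates can plausibly yield a few dozen factors by this heuristic, but with conditioning on prior work the real yield per level shrinks, which is why each extra level is roughly a 1-in-100 shot per exponent out there.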

2022-01-11, 05:24   #33
VBCurtis

"Curtis"
Feb 2005
Riverside, CA

149F₁₆ Posts

Quote:
 Originally Posted by petrw1 Or it could be a hybrid, as in: focus the deep P-1 first on ranges with over 199 to go.
I like this idea best! I'm not sure 20 factors needs 10 TF levels: P-1 with 9% additional probability does the same, and once that's done there would still be a 1/100 or 1/120 shot for each TF level. For larger exponents, where 9% would be enormous... we put those off a little, mix in more TF, or expect some ECM to get 'em.

I think anything under 20M can be slain with the big P-1 gun, and George has hinted that we may have similar B2 advancements for P+1 and ECM in the future; that would be just the ticket to finish off the tough ranges I'm mistaken about.

Also, LaurV likes it. The taste police have spoken!
