mersenneforum.org > Data Let's Optimize P-1 for low exponents. TL;DR in post #1. More in posts 60 and 61.

2022-01-17, 18:52   #56
firejuggler

"Vincent"
Apr 2010
Over the rainbow

2×3×11×43 Posts

Quote:
 Originally Posted by firejuggler
 I can give you timings for my working range (3 cores/1 worker, 10 GB of mem). 8.5M/1.56M: 448K/512K : 1550 sec/1000 sec

Amend that to 1200/800 seconds; my system was busy.

2022-01-17, 19:44   #57
firejuggler

"Vincent"
Apr 2010
Over the rainbow

2×3×11×43 Posts

4.2M/3M: 224K/240K : 1378/850 sec, 26.8341 GHzD
8.5M/1.56M: 448K/512K : 1200/800 sec, 15.0849 GHzD
17M/800K: 896K/1M : 1200/866 sec, 8.4054 GHzD

Which feels about the same?

Last fiddled with by firejuggler on 2022-01-17 at 20:23 Reason: Adding the 17M stats
2022-01-17, 21:36   #58
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006

3×11×157 Posts

Quote:
 Originally Posted by firejuggler
 4.2M/3M: 224K/240K : 1378/850 sec, 26.8341 GHzD
 8.5M/1.56M: 448K/512K : 1200/800 sec, 15.0849 GHzD
 17M/800K: 896K/1M : 1200/866 sec, 8.4054 GHzD
 Which feels about the same?
Looks like 2x (1.95x) fits your PC pretty well.
We are probably in the ballpark with 2x ... or 2.2x ... or 1.9x.

2022-01-18, 04:32   #59
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006
Saskatchewan, Canada

3·11·157 Posts

Signing off until April

So, doing some rough calculations using this (which admittedly is not accurate for v30.8 ... but still usable):

Looking at exponents where the current B1 is even equal to the recommended B1, because the new B2 is SOOOO much higher I'm seeing odds of finding a factor close to 10% higher.

I realize now, though, that I cannot look only at current B1 vs. new B1; I also have to look at the current B2 to be sure it is NOT from a 30.8 run (in other words, many thousands of times larger than a typical old B2, rather than 20x or 30x).

So we would still start with the exponents where the current B1 is the smallest ratio of the new B1 and work down past 10x and even past 5x ... wherever the current B2 is not VERY, VERY big.

In any case this could be my last post with any cyphering before I return in late March.

======================

For those who want to give it a try, the best I can suggest is using post #23 for suggested B1 values ... it uses this formula, referred to in post #39:

Code:
2.2^LOG(20,000,000/exponent,2)*1,000,000

... and post #46 for B1 adjustment where available RAM is somewhat lower or higher than 16GB. I'm leaning towards the top table, which equalizes the factor success percent.

Thanks, and I'll check back in April ... at which time I'll clean up the recommendations and hopefully start myself.

Wayne
2022-03-13, 21:10   #60
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006
Saskatchewan, Canada

1010000111101₂ Posts

As we prepare to start this project, here are my suggestions

I openly welcome any opinions or counterpoints. (Note: the next post discusses related work opportunities.)

If you haven't read, or prefer not to read, all the above posts, in a nutshell: we want to more aggressively P-1 lower exponents using the impressive speed of version 30.8 ... and factor lots more exponents. You need to be on that version to adequately participate here.

I suggest we start at the lower end (< 10M?) and work our way higher as long as it is still productive and interesting.

If you are interested, you need to "reserve" a range of exponents and manually generate the appropriate Pminus1 assignments ... unless someone smarter than me can automate it.

Start with the exponents in your range that currently have the lowest bounds. But since the B2 that will be used is SOOO much higher than in previous versions, it will be beneficial to reprocess most exponents in most ranges.

I'll generate a list of the most fruitful ranges later in March when I'm home. In the meantime you're on your own ... you can do it!

Chris is working on a supporting GPU72 chart to help, as he did with Under 2000.

==============

Proposed minimum B1 values are in the table below, based on about 16GB of RAM allocated. Or calculate for your exponent range using:

Code:
2.2^LOG(20,000,000/exponent,2)*1,000,000

If your RAM is somewhat different, it is suggested that B1 be adjusted with this function:

Code:
sqrt(16 / your GB RAM) * proposed B1

This adjustment is more important if you have less than 16GB.
Code:
Exponent   B1           B1-Neat
78125      548,758,735  548,800,000
156250     249,435,789  249,400,000
312500     113,379,904  113,400,000
500000     66,427,649   66,400,000
625000     51,536,320   51,500,000
750000     41,883,644   41,900,000
1000000    30,194,386   30,200,000
1250000    23,425,600   23,400,000
1500000    19,038,020   19,000,000
2000000    13,724,721   13,700,000
2500000    10,648,000   10,600,000
3000000    8,653,645    8,700,000
4000000    6,238,510    6,200,000
5000000    4,840,000    4,800,000
6000000    3,933,475    3,900,000
7000000    3,300,838    3,300,000
8000000    2,835,686    2,800,000
9000000    2,480,116    2,500,000
10000000   2,200,000    2,200,000
11000000   1,973,960    2,000,000
12000000   1,787,943    1,800,000
13000000   1,632,344    1,600,000
14000000   1,500,381    1,500,000
15000000   1,387,133    1,400,000
16000000   1,288,948    1,300,000
17000000   1,203,057    1,200,000
18000000   1,127,325    1,100,000
19000000   1,060,082    1,100,000
20000000   1,000,000    1,000,000

Last fiddled with by petrw1 on 2022-03-13 at 21:32 Reason: Chris chart
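For anyone who would rather script the bounds than read them off the table, here is a small Python sketch of the two formulas in the post above. It reads the Excel-style LOG(x,2) as log base 2 of 20,000,000 divided by the exponent; the `neat` helper is just illustrative rounding to reproduce the B1-Neat column, not anything mprime itself uses.

```python
import math

def proposed_b1(exponent, ram_gb=16):
    """Proposed minimum B1: 2.2^log2(20,000,000 / exponent) * 1,000,000,
    scaled by sqrt(16 / RAM) when the allocated RAM differs from 16GB."""
    b1 = 2.2 ** math.log2(20_000_000 / exponent) * 1_000_000
    return b1 * math.sqrt(16 / ram_gb)

def neat(b1):
    # Illustrative rounding to the nearest 100,000, matching the B1-Neat column
    return round(b1 / 100_000) * 100_000
```

For example, `neat(proposed_b1(10_000_000))` reproduces the table's 2,200,000, and dropping the allocated RAM from 16GB to 4GB doubles the suggested B1.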
2022-03-13, 21:30   #61
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006
Saskatchewan, Canada

3×11×157 Posts

Related work opportunities

1. Some of you want to try to get all 10K ranges under 200 factors remaining as a follow-up to the current Under 2000 project. This is not contrary to the discussion in the previous post; it only means you may need to choose even higher B1 values for those 10K ranges of interest and/or TF more.

2. Some have asked whether we will only process unfactored exponents, or whether we should also use version 30.8 to further factor currently factored exponents. I have no issue with doing so, though I'm not sure how to generate the lists of exponents already factored.

3. So how can GPUs contribute? I'm not sure how they can help with deep P-1, at least until GPUOwl or other GPU P-1 software has been retrofitted with 30.8 functionality. A couple of thoughts:
   a. Tidy up the TF for all the lower ranges, bringing all exponents to the same appropriate TF level. mikr has been systematically TF'ing all lower exponents to 71 bits.
   b. Help those in point 1 above get their ranges under 200 factors.
   c. Mainstream leading-edge TF for PRP work.
2022-03-14, 19:37   #62
nordi

Dec 2016

2×59 Posts

Quote:
 Originally Posted by petrw1 2. Some have asked if we will only process unfactored exponents or if we should also use version 30.8 to further factor currently factored exponents. I have no issue with doing so though I'm not sure how to generate the lists of exponents already factored.
Such a list can be generated with the https://www.mersenne.ca/morefactors.php page.

I'm currently working on the 12.4M range for already factored exponents, because they will soon be PRP-checked by the folks doing PRP-C work. Every factor that I find now (instead of later) means one less PRP check is needed. If someone wants to take the 12.5M range, I'd be happy to share. ;-)

2022-03-21, 14:05   #64
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006
Saskatchewan, Canada

3·11·157 Posts

@lisanderke Looks good to me. Thanks for the explanation and the help.
2022-03-23, 01:27   #65
DrobinsonPE

Aug 2020

137 Posts

Just because I like the number 42, I would like to claim the 4.2M range. It looks like there are currently 1992 unfactored exponents. I will be using an i3-9100 with 16GB of RAM, so it will take me a while to complete.

According to the table, the stage 1 B1-Neat is 6,200,000, and mprime 30.8 can choose the stage 2 bound. For now I will skip any exponents that already have a stage 1 bound above 6,200,000.

I will start by running a test with the assignment below to make sure it works and see how long it takes, and then start generating Pminus1 assignments for the rest of the range.

Code:
Pminus1=N/A,1,2,4200109,-1,6200000,0,72

Let me know if someone else is already working here or if there is a better place to start.
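Generating the rest of the range's assignments can be scripted. A minimal Python sketch along these lines (the trial-division primality test and the range slice are only illustrative; a real run would start from the range's unfactored exponents as listed at mersenne.ca rather than all primes):

```python
def is_prime(n):
    """Simple trial division; fine for one small slice of exponents."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def pminus1_line(p, b1, tf_bits=72):
    """Worktodo entry in the format used in the post above:
    Pminus1=AID,k,b,n,c,B1,B2,how-far-factored.
    B2=0 lets mprime 30.8 choose the stage 2 bound from available RAM."""
    return f"Pminus1=N/A,1,2,{p},-1,{b1},0,{tf_bits}"

# Sketch: assignments for every prime exponent in a slice of the 4.2M range.
lines = [pminus1_line(p, 6_200_000)
         for p in range(4_200_000, 4_200_200) if is_prime(p)]
```

The resulting lines can be appended to worktodo.txt (or a worktodo.add file) for mprime to pick up.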
2022-03-23, 03:41   #66
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006

3·11·157 Posts

Quote:
 Originally Posted by DrobinsonPE
 Just because I like the number 42, I would like to claim the 4.2M range. It looks like there are currently 1992 unfactored exponents. I will be using an i3-9100 with 16GB of RAM, so it will take me a while to complete. According to the table, the stage 1 B1-Neat is 6,200,000, and mprime 30.8 can choose the stage 2 bound. For now I will skip any exponents that already have a stage 1 bound above 6,200,000.
Thanks, enjoy, keep us posted.
I have no problem with your proposed strategy.

Just some food for thought for you or others...
When I initially started analyzing these low exponents for candidates, I too thought I should skip any where the current B1 is more than half of the proposed B1.

Then I realized that since the new B2 is so much higher than the current B2, even rerunning exponents with the same B1 may have a 3%-4% success rate.

So an alternative strategy might be to look for exponents where the current B2 is less than some percentage of the new B2. You may need to run a test to determine your new B2, since it is very RAM-sensitive.

According to the prob.php function at mersenne.ca, a 10x increase in B2 adds about 2% to the success rate.
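That triage could be sketched roughly as follows in Python. This is only an illustration of the filtering idea, not the prob.php calculation itself; the exponents, current B2 values, and the expected new B2 below are all hypothetical placeholders.

```python
def worth_rerunning(current_b2, expected_new_b2, min_ratio=10):
    """Keep an exponent when a 30.8 rerun would raise B2 by at least min_ratio
    (per the heuristic above, each ~10x in B2 is worth roughly +2% success)."""
    return expected_new_b2 >= min_ratio * current_b2

# (exponent, current B2 from its last P-1 run) -- made-up illustrative values
candidates = [(4_200_109, 150_000_000), (4_200_251, 900_000_000_000)]
expected_new_b2 = 1_000_000_000_000  # run one test assignment to find yours
todo = [p for p, b2 in candidates if worth_rerunning(b2, expected_new_b2)]
```

Here `todo` keeps only the first exponent; the second already has a 30.8-scale B2 and is skipped.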

Thanks
