2020-08-05, 17:44  #1  
Random Account
"Norman D. Powell"
Aug 2009
Indiana, USA.
1917_{10} Posts 
Pick Your Poisson: P-1 Bounds
My old friend, P-1 bounds, is tasking me again. I have an exponent in the area of 99,8xx,xxx which I am currently running. Below are the apparent default bounds based on the worktodo line:
Quote:


2020-08-07, 08:19  #2  
Just call me Henry
"David"
Sep 2007
Cambridge (GMT/BST)
3·1,951 Posts 
Quote:


2020-08-07, 10:25  #3  
"Mihai Preda"
Apr 2015
3·11·41 Posts 
Quote:
Based on my intuition (which draws on a few articles, among them https://www.researchgate.net/publica..._probabilities, which has a handy table on page 12), there is a lot of benefit from a very large B2. Either use the default GpuOwl bounds of 1M/30M, or lower B1 to a value of your choice between 500K and 1M, but keep B2 at 30M. Recently PRP got "cheaper" (so P-1 got relatively "more expensive"), so the P-1 bounds need not be increased if efficiency is the goal. That's why I suggested "lower B1" instead of "increase B2". (The P-1 probability calculators we have now may be a bit off.)

Last fiddled with by preda on 2020-08-07 at 10:41 
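A rough way to see why lowering B1 (rather than raising B2) reins in cost is a back-of-envelope work model. This is my own sketch, not GpuOwl's actual accounting: stage 1 exponentiates by roughly lcm(1..B1), which has about 1.44·B1 bits (so ~1.44·B1 squarings), while stage 2 does on the order of one multiply per prime in (B1, B2], ignoring prime-pairing optimizations:

```python
import math

def pm1_work(B1, B2):
    """Crude P-1 work model (illustrative only).

    Stage 1: the exponent E = lcm(1..B1) has about 1.44 * B1 bits,
    so roughly 1.44 * B1 modular squarings.
    Stage 2: roughly one multiply per prime in (B1, B2], with the
    prime counts estimated by the x / ln(x) approximation.
    """
    stage1 = 1.44 * B1
    stage2 = B2 / math.log(B2) - B1 / math.log(B1)
    return stage1, stage2

# Halving B1 while keeping B2 = 30M removes stage-1 work but
# barely changes stage 2:
for b1 in (1_000_000, 500_000):
    s1, s2 = pm1_work(b1, 30_000_000)
    print(f"B1={b1}: stage1 ~{s1:,.0f}, stage2 ~{s2:,.0f} multiplies")
```

Under this toy model, B1 = 1M contributes about 1.44M squarings against roughly 1.7M stage-2 multiplies, so the two stages are comparable in cost and B1 is the cheaper knob to turn down.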

2020-08-07, 14:51  #4  
Random Account
"Norman D. Powell"
Aug 2009
Indiana, USA.
3^{3}×71 Posts 
Quote:
I looked at George's "Math" page on mersenne.org, specifically the P-1 section. It is not written in a way I can understand. Still, I tried. Somebody here told me a couple of years ago that multiple factors can be found between 0 (zero) and B1, but only one above B1. Would lowering B1 not decrease the probability? I suppose what I have been looking for is a relationship between bound sizes and powers of two. It seems like there would be one somewhere. About a year ago, I found a 39-digit factor in the process of running a really small exponent using a B1 of 1,000,000.

By the way, I decided to give my antique Core2 Duo a shot at a PRP-CF by running it with Prime95. Only 11 hours each. I was expecting much more. 
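The "many factors below B1, at most one prime between B1 and B2" behavior follows directly from how the two P-1 stages work, and a toy implementation makes it concrete. Below is my own illustrative sketch with deliberately tiny, hand-picked numbers (nothing like GIMPS scale): stage 1 finds a factor f whenever f-1 is entirely B1-smooth, however many small primes that takes; stage 2 additionally allows exactly one prime factor between B1 and B2.

```python
from math import gcd

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(sieve[i * i :: i]))
    return [i for i, is_p in enumerate(sieve) if is_p]

def pm1(N, B1, B2):
    """Toy P-1: return a nontrivial factor of N, or None."""
    # Stage 1: y = 3^E mod N, with E the product of q^k over primes q
    # where q^k <= B1.  By Fermat's little theorem this catches every
    # factor f whose f-1 is entirely B1-smooth -- any number of them.
    y = 3
    for q in primes_up_to(B1):
        qk = q
        while qk * q <= B1:
            qk *= q
        y = pow(y, qk, N)
    g = gcd(y - 1, N)
    if 1 < g < N:
        return g
    # Stage 2: try each single prime s in (B1, B2] on top of stage 1.
    # This catches f where f-1 is B1-smooth except for ONE prime <= B2.
    acc = 1
    for s in primes_up_to(B2):
        if s > B1:
            acc = acc * (pow(y, s, N) - 1) % N
    g = gcd(acc, N)
    return g if 1 < g < N else None

# 421 - 1 = 2^2 * 3 * 5 * 7 is 7-smooth: stage 1 finds it.
print(pm1(421 * 107, B1=7, B2=60))   # -> 421
# 107 - 1 = 2 * 53: exactly one prime beyond B1=7, so stage 2 finds it.
print(pm1(107 * 109, B1=7, B2=60))   # -> 107
```

So lowering B1 does shrink the set of factors stage 1 can reach, which is why the trade-off against a large B2 needs a cost model rather than intuition alone.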

2020-08-07, 23:06  #5 
"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest
5,011 Posts 
Use big enough bounds that PrimeNet will retire the need for P-1 on the exponent.
Otherwise, unless you find a factor, it's wasted cycles, because someone else will have to duplicate the work by running to big enough bounds to retire the P-1 need for the same exponent. The PrimeNet bounds on mersenne.ca are sufficient.

Last fiddled with by kriesel on 2020-08-07 at 23:07 
2020-08-08, 15:51  #6  
Random Account
"Norman D. Powell"
Aug 2009
Indiana, USA.
1917_{10} Posts 
Quote:
I am sure you remember M1277. In 2017, someone ran a P-1 on it with B1 at 5 trillion and B2 at 400 trillion. I suppose whoever ran it did not mind the wait. 

2020-08-10, 02:56  #7 
"Mihai Preda"
Apr 2015
3·11·41 Posts 
Of course, "optimal bounds" depend on the "factored up to" bits. A first approximation for 100M exponents may be:
Code:
factored-to    B1     B2    chance
     77       1.2M   40M     4.2%
     78       1M     35M     3.6%
     79       1M     30M     3.2%
     80       0.9M   25M     2.7%
So, if you use 1M/30M (the default) you should be "just fine" for any bit level :)

PS: don't use my previously recommended values of 500K/30M; that B1 is too low relative to optimal.

Last fiddled with by preda on 2020-08-10 at 03:01 
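For anyone scripting their worktodo generation, the table above drops straight into a small lookup. This is my own helper (the names are made up), falling back to the 1M/30M default recommended above:

```python
# (B1, B2, approx. chance of a factor) per trial-factored bit level,
# transcribed from the table above for ~100M exponents.
OPTIMAL_PM1 = {
    77: (1_200_000, 40_000_000, 0.042),
    78: (1_000_000, 35_000_000, 0.036),
    79: (1_000_000, 30_000_000, 0.032),
    80: (  900_000, 25_000_000, 0.027),
}

def pm1_bounds(factored_to):
    """Return (B1, B2) for a bit level, defaulting to 1M/30M."""
    B1, B2, _chance = OPTIMAL_PM1.get(factored_to,
                                      (1_000_000, 30_000_000, None))
    return B1, B2

print(pm1_bounds(77))  # -> (1200000, 40000000)
print(pm1_bounds(81))  # -> (1000000, 30000000), the default
```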
2020-08-14, 16:58  #8  
Random Account
"Norman D. Powell"
Aug 2009
Indiana, USA.
3^{3}×71 Posts 
It appears first-time P-1 tests from PrimeNet have passed 100,000,000 in magnitude. So, I have decided to run a test to see what the time cost would be with gpuOwl:
Quote:
Quote:


2020-08-17, 11:49  #9 
"Mihai Preda"
Apr 2015
3×11×41 Posts 
There is a new P-1 calculator in the pm1/ folder in GpuOwl. It comes as a Python script ("pm1.py") and a C++ executable ("pm1"). The C++ version is very simple and only outputs the computed probability of a factor given the exponent, factored-to bits, B1, and B2. The Python script has an effort model (which can be tweaked in the source if desired) and, based on that, can propose some sort of "good" bounds. Example:
Code:
~/gpuowl/pm1$ ./pm1 100000000 77 1000000 30000000
3.91% (first-stage 1.53%, second-stage 2.38%)
Code:
~/gpuowl/pm1$ ./pm1.py 100000000 77
Min: B1=  260K, B2=  7000K: p=2.38% (0.80% + 1.58%), work=0.96% (0.45% + 0.52%)
Mid: B1=  600K, B2= 20000K: p=3.38% (1.22% + 2.16%), work=2.41% (1.04% + 1.40%)
Big: B1= 1000K, B2= 34000K: p=4.01% (1.53% + 2.47%), work=3.99% (1.73% + 2.31%)
Last fiddled with by preda on 2020-08-17 at 11:54 
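For the curious, calculators like these rest on smoothness probabilities, conventionally expressed with Dickman's rho function: rho(u) is the probability that a random integer n has no prime factor above n^(1/u). What follows is my own sketch of that one ingredient, not pm1.py's actual model; rho satisfies the delay equation u·rho'(u) = -rho(u-1), which a simple Euler scheme can integrate:

```python
import math

def dickman_rho(u, h=1e-3):
    """Estimate Dickman's rho(u) by stepping the delay ODE
    rho'(t) = -rho(t - 1) / t on a grid of spacing h.
    Exact values: rho(u) = 1 for u <= 1; rho(2) = 1 - ln 2 ~ 0.3069."""
    if u <= 1:
        return 1.0
    per_unit = round(1 / h)        # grid points per unit of u
    n = round(u / h)
    rho = [1.0] * (n + 1)          # rho(i * h), valid as-is for t <= 1
    for i in range(per_unit + 1, n + 1):
        t = i * h
        rho[i] = rho[i - 1] - h * rho[i - per_unit] / t
    return rho[n]

# Illustration: chance that a factor q right at the 77-bit boundary of a
# 100M exponent falls to stage 1 alone, i.e. that (q-1)/(2p) is B1-smooth
# (factors of M_p have the form 2kp+1).  Real calculators integrate over
# the whole distribution of factor sizes, so this single-point value is
# higher than the ~1.5% stage-1 total reported above.
p, bits, B1 = 100_000_000, 77, 1_000_000
u = (bits * math.log(2) - math.log(2 * p)) / math.log(B1)
print(f"u = {u:.2f}, rho(u) = {dickman_rho(u):.3f}")
```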
2020-08-17, 13:52  #10 
Random Account
"Norman D. Powell"
Aug 2009
Indiana, USA.
3^{3}·71 Posts 

2020-08-17, 15:38  #11 
"TF79LL86GIMPS96gpu17"
Mar 2017
US midwest
5,011 Posts 
I have not incorporated any of the tools folder in the Windows builds I've posted. Until recently it was all Python or shell script, no C. I haven't yet identified a Python compilation method that both (a) succeeds and (b) produces code small enough to fit in a forum attachment. (But I have eliminated several candidate approaches.)
Last fiddled with by kriesel on 2020-08-17 at 15:39 