I'm working on 19.4M, but I'm a little worried.
I've completed 175 P-1 runs at B1=800K, B2=780M, which is roughly a [URL="https://www.mersenne.ca/exponent/19400573"]7.2% chance of factor[/URL]. Accounting for the original ~2.5% chance of a factor, this is a 4.8% chance per attempt, but I've only found 1 factor, which is roughly 2-in-1,000 bad luck.
[CODE]>>> scipy.stats.binom(175, (0.072 - 0.025) / (1 - 0.025)).cdf(1)
0.0017342175512011905
>>> scipy.stats.binom(175, (0.072 - 0.025) / (1 - 0.025)).ppf(0.01)
3.0
>>> scipy.stats.binom(175, (0.072 - 0.025) / (1 - 0.025)).ppf(0.99)
16.0[/CODE]
Should I stay the course and hope that my luck improves? Mathematically I should still expect ~8.4 factors if I've just been unlucky. Or should I preemptively increase my B1/B2 because it's possible some P-1 / TF work was underreported?
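For anyone who wants to reproduce the luck check without scipy, here is a stdlib-only sketch of the same computation; the 7.2% and 2.5% figures are taken from the post above, and the conditional-probability step is my reading of how they were combined:

```python
import math

p_total, p_prior = 0.072, 0.025              # figures from the post
p_new = (p_total - p_prior) / (1 - p_prior)  # ~4.82% chance per attempt
n = 175                                      # completed P-1 attempts

def binom_pmf(k: int, n: int, p: float) -> float:
    """Probability of exactly k successes in n Bernoulli(p) trials."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

mean = n * p_new
p_at_most_one = binom_pmf(0, n, p_new) + binom_pmf(1, n, p_new)

print(f"expected factors: {mean:.1f}")           # ~8.4
print(f"P(<= 1 factor)  : {p_at_most_one:.4f}")  # ~0.0017, i.e. ~2-in-1000
```

This reproduces the `cdf(1)` value from the scipy session above and shows where the "~8.4 expected factors" comes from.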
[QUOTE=axn;602135]Thoughts?[/QUOTE]
Makes total sense. I have to split that report into two separate scripts and then work from there. But... my dev cycles are effectively zero for the next couple of days...

[QUOTE=axn;602135]EDIT: While we're asking for stuff, dark mode?[/QUOTE] Ah... man... You've just put an idea in my head which probably won't leave until I implement it... :)
[QUOTE=SethTro;602246]I'm working on 19.4M, but I'm a little worried.
I've completed 175 P-1 runs at B1=800K, B2=780M, which is roughly a [URL="https://www.mersenne.ca/exponent/19400573"]7.2% chance of factor[/URL]. Accounting for the original ~2.5% chance of a factor, this is a 4.8% chance per attempt, but I've only found 1 factor, which is roughly 2-in-1,000 bad luck.
[CODE]>>> scipy.stats.binom(175, (0.072 - 0.025) / (1 - 0.025)).cdf(1)
0.0017342175512011905
>>> scipy.stats.binom(175, (0.072 - 0.025) / (1 - 0.025)).ppf(0.01)
3.0
>>> scipy.stats.binom(175, (0.072 - 0.025) / (1 - 0.025)).ppf(0.99)
16.0[/CODE]
Should I stay the course and hope that my luck improves? Mathematically I should still expect ~8.4 factors if I've just been unlucky. Or should I preemptively increase my B1/B2 because it's possible some P-1 / TF work was underreported?[/QUOTE] Patience, grasshopper. LOL. Been there.

My fortune teller tells me you're going to find 8 in the next 175 attempts.
[QUOTE=SethTro;602246]I'm working on 19.4M, but I'm a little worried.
I've completed 175 P-1 runs at B1=800K, B2=780M, which is roughly a [URL="https://www.mersenne.ca/exponent/19400573"]7.2% chance of factor[/URL]. Accounting for the original ~2.5% chance of a factor, this is a 4.8% chance per attempt, but I've only found 1 factor.[/QUOTE] My common sense knows better, but... @George, is it just a coincidence, or is there any chance at all that 800K is a v30.8 Achilles heel?

Seth here is having bad luck, and firejuggler in the "Found a factor" thread missed at least 1 factor, also using B1=800K.
[QUOTE=petrw1;602260]My common sense knows better but ... @George is it just a coincidence or is there any chance at all that 800K is a v30.8 Achilles Heel?
Seth here is having bad luck and firejuggler in the "Found a factor" thread missed at least 1 factor also using B1=800K.[/QUOTE] It's always possible. Please do a run with roundoff error checking turned on. If there are roundoffs above, say, 0.43, then it is possible other runs are getting fatal roundoffs above 0.5.

This new stage 2 code uses polynomial multiplications, which affect roundoffs in new ways. There could easily be a bug in my accounting for this. BTW, since roundoff is affected by polynomial size, you have to test the specific FFT length (800K) and the specific polynomial length (dictated by the amount of memory specified).

Next, I'd do a P-1 run to see if it finds a known factor.
I would be willing to start P-1 on 15.2M and work up, if no one has claimed it yet. The "Let's Optimize P1 for low exponents" thread suggests B1Neat = 1,400,000; if that is good, I can start creating Pminus1 assignments.

Pminus1=1,2,15200047,-1,1400000,0,74

I believe this is a correct assignment for the first number.
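Since a whole 0.1M range means many such lines, here is a small sketch that generates Pminus1 worktodo entries for every prime exponent in a range. The field layout (k, b, n, c, B1, B2, how-far-factored) simply mirrors the example line above, with B2=0 letting the program choose its own stage 2 bound; this is my reading of the format, not an authoritative spec:

```python
def is_prime(n: int) -> bool:
    # Simple trial division; fast enough for exponents around 15.2M.
    if n < 2:
        return False
    for d in range(2, int(n**0.5) + 1):
        if n % d == 0:
            return False
    return True

def pminus1_lines(lo: int, hi: int, b1: int = 1_400_000,
                  b2: int = 0, tf_bits: int = 74):
    """Yield Pminus1 worktodo entries for prime exponents in [lo, hi).

    Field layout assumed from the example above: k,b,n,c,B1,B2,tf_bits.
    Mersenne numbers are 2^n - 1, hence k=1, b=2, c=-1.
    """
    for p in range(lo, hi):
        if is_prime(p):  # Mersenne exponents must themselves be prime
            yield f"Pminus1=1,2,{p},-1,{b1},{b2},{tf_bits}"

# Example: first few assignments at the bottom of the 15.2M range.
for line in list(pminus1_lines(15_200_000, 15_200_100))[:3]:
    print(line)
```

Note that a real run would also want to skip exponents that already have a known factor; this sketch only handles the primality side.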
[QUOTE=DrobinsonPE;602273]Looking at the Let's Optimize P1 for low exponents thread it suggests B1Neat = 1,400,000[/QUOTE]
How much RAM are you allocating for stage 2? 
Maybe this is useful: I'm working on the range 14.1M, which is similar, and I'm using the bounds B1 = 2M, B2 = 1G. I reserved 20 GB of RAM for Prime95. I found more than 40 factors in the last month.

[QUOTE=axn;602278]How much RAM are you allocating for stage 2?[/QUOTE]
I have a few different computers I could use. The one I was thinking of using has 16GB, so 13-14GB could be allocated for stage 2. Currently only one computer has 32GB, so I could use that one and allocate 24-28GB for stage 2.

Overnight I ran a test with the assignment I created above, just to see what size stage 2 mprime would pick and how long it would take. Here is the result with 12.2GB of RAM. The stage 2 is a little smaller than alpertron is using.

[CODE][Work thread Mar 21 23:36] P-1 on M15200047 with B1=1400000, B2=TBD
[Work thread Mar 21 23:36] Using FMA3 FFT length 800K, Pass1=640, Pass2=1280, clm=1, 4 threads
[Work thread Mar 22 00:25] M15200047 stage 1 complete. 4040778 transforms. Total time: 2953.477 sec.
[Work thread Mar 22 00:25] Inversion of stage 1 result complete. 5 transforms, 1 modular inverse. Time: 4.317 sec.
[Work thread Mar 22 00:25] Switching to FMA3 FFT length 896K, Pass1=896, Pass2=1K, clm=1, 4 threads
[Work thread Mar 22 00:25] With trial factoring done to 2^74, optimal B2 is 641*B1 = 897400000.
[Work thread Mar 22 00:25] If no prior P-1, chance of a new factor is 7.15%
[Work thread Mar 22 00:25] Estimated stage 2 vs. stage 1 runtime ratio: 1.032
[Work thread Mar 22 00:25] Using 12492MB of memory. D: 3570, 384x1405 polynomial multiplication.
[Work thread Mar 22 00:26] Stage 2 init complete. 10747 transforms. Time: 29.207 sec.
[Work thread Mar 22 01:14] M15200047 stage 2 complete. 1832535 transforms. Total time: 2871.515 sec.
[Work thread Mar 22 01:14] Stage 2 GCD complete. Time: 2.813 sec.
[Work thread Mar 22 01:14] M15200047 completed P-1, B1=1400000, B2=898037070, Wi8: BCD819E8[/CODE]

Looks like Xyzzy wants to do 15.2M, so I will go play around in the "Let's Optimize P1 for low exponents" thread. This effort is near the end and I do not want to get in the way with my learning experiment; the other one is just starting.
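Out of curiosity, the log above gives enough to estimate throughput. A rough back-of-envelope sketch, where the timings and the 7.15% figure are copied straight from the log and everything else is simple arithmetic:

```python
stage1_s = 2953.477   # stage 1 total time from the log, in seconds
stage2_s = 2871.515   # stage 2 total time from the log
p_factor = 0.0715     # "chance of a new factor" reported in the log

total_min = (stage1_s + stage2_s) / 60       # wall time per exponent
per_factor_h = total_min / p_factor / 60     # expected wall time per factor

print(f"~{total_min:.0f} min per exponent")
print(f"~{per_factor_h:.0f} h expected per factor")
```

So on this machine each exponent costs about an hour and a half, and a new factor should turn up roughly once a day of continuous running, assuming the 7.15% estimate holds.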
Please reserve 15.2M for us.
:mike: 
[QUOTE=DrobinsonPE;602290]I have a few different computers I could use. The one I was thinking of using has 16GB, so 13-14GB could be allocated for stage 2. Currently only one computer has 32GB, so I could use that one and allocate 24-28GB for stage 2.

Overnight I ran a test with the assignment I created above, just to see what size stage 2 mprime would pick and how long it would take. Here is the result with 12.2GB of RAM. The stage 2 is a little smaller than alpertron is using.

[CODE][Work thread Mar 21 23:36] P-1 on M15200047 with B1=1400000, B2=TBD
[Work thread Mar 21 23:36] Using FMA3 FFT length 800K, Pass1=640, Pass2=1280, clm=1, 4 threads
[Work thread Mar 22 00:25] M15200047 stage 1 complete. 4040778 transforms. Total time: 2953.477 sec.
[Work thread Mar 22 00:25] Inversion of stage 1 result complete. 5 transforms, 1 modular inverse. Time: 4.317 sec.
[Work thread Mar 22 00:25] Switching to FMA3 FFT length 896K, Pass1=896, Pass2=1K, clm=1, 4 threads
[Work thread Mar 22 00:25] With trial factoring done to 2^74, optimal B2 is 641*B1 = 897400000.
[Work thread Mar 22 00:25] If no prior P-1, chance of a new factor is 7.15%
[Work thread Mar 22 00:25] Estimated stage 2 vs. stage 1 runtime ratio: 1.032
[Work thread Mar 22 00:25] Using 12492MB of memory. D: 3570, 384x1405 polynomial multiplication.
[Work thread Mar 22 00:26] Stage 2 init complete. 10747 transforms. Time: 29.207 sec.
[Work thread Mar 22 01:14] M15200047 stage 2 complete. 1832535 transforms. Total time: 2871.515 sec.
[Work thread Mar 22 01:14] Stage 2 GCD complete. Time: 2.813 sec.
[Work thread Mar 22 01:14] M15200047 completed P-1, B1=1400000, B2=898037070, Wi8: BCD819E8[/CODE]

Looks like Xyzzy wants to do 15.2M, so I will go play around in the "Let's Optimize P1 for low exponents" thread. This effort is near the end and I do not want to get in the way with my learning experiment; the other one is just starting.[/QUOTE] 1. I think your suggested bounds and RAM allocation are reasonable, whether you finish this project or move on to the next.
2. You could also take 19.4M. I was going to do it but I won't get to it before it's needed here. OOPS, forgot SethTro has it.
3. Thanks for whatever help you provide.