Run P−1 before PRP tests
The current timing for a PRP test is about 2 hours. Since most Mersenne numbers with small factors have had very little P−1 done, I would like Prime95 to be changed so it performs P−1 with bounds B1=500000, B2=15e6 before attempting PRP. That way more factors can be found, and no time is lost running PRP on numbers that can easily be factored.

Is it worthwhile?
How much time will the P−1 test take, and how does it compare to the time taken by the PRP-cofactor test? The answer depends on the exponent range, since the time required for PRP-cofactor testing rises more sharply with the exponent than the time required for P−1 testing does. And what percentage of exponents will have new factors discovered by P−1? For the ranges we are currently doing, P−1 will probably find factors for only a very small number of exponents, and it is simple and quick to rerun PRP for the new cofactors of those few exponents. Making prior P−1 mandatory for every exponent would instead enormously slow down the PRP testing work.

Currently nearly all the new factors of small exponents are being found by user TJAOI with a "by k" methodology. Most of the rest are being found by ECM, and a few by TF. Essentially none are being found by P−1, but hardly anyone (maybe no one) is even attempting these. That's not surprising, since no PrimeNet credit is given for unsuccessful P−1 testing of exponents that already have at least one known factor.

I myself did some no-credit P−1 testing of small exponents, most of it about a year ago. For example, I tested the 400k range with B1=5e6, B2=40e6 and found only six new factors (for [M]M416497[/M], [M]M432349[/M], [M]M436739[/M], [M]M450019[/M], [M]M450383[/M], [M]M499033[/M]). I also did the ranges below 400k and found considerably more factors there, but the 400k range itself was mostly unfruitful, so I didn't continue past 500k. Part of the reason, I think, is that ECM had already found many of the factors that P−1 would have found, and since then even more ECM and TJAOI discoveries have been made.

Before bundling this P−1 testing into PRP-cofactor testing as a mandatory prerequisite, it would be essential to do a proof of concept: choose a range, P−1 test it, and report how many new factors were found. Due to the greater time complexity of PRP-cofactor testing vs. P−1 testing, there will be some point where it makes sense to do P−1 testing first, but we are probably not near that point yet. (The proof-of-concept testing suggested above might prove otherwise.) However, at some point in the past, nearly all exponents started getting routine P−1 testing before LL testing. So there is at best only a limited range of exponents (bounded both below and above) where more P−1 testing might make sense, and it's not immediately obvious whether that range is empty.
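To make the scaling argument above concrete: a PRP test of M_p takes roughly p modular squarings, while P−1 stage 1 takes roughly 1.44·B1 squarings of the same size (since lcm(1..B1) has about B1/ln 2 bits), so with B1 fixed the PRP/P−1 cost ratio grows roughly linearly in p. A back-of-the-envelope sketch, with my own function names, ignoring stage 2 and FFT-size effects:

```python
def prp_iterations(p):
    """A PRP test of M_p needs about p modular squarings."""
    return p

def pminus1_stage1_iterations(B1):
    """P-1 stage 1 exponentiates by lcm(1..B1), which has about
    B1/ln(2) ~ 1.44*B1 bits, hence about that many squarings."""
    return 1.44 * B1

def cost_ratio(p, B1):
    """How many P-1 stage-1 runs fit in the time of one PRP test.
    Both work modulo 2^p - 1, so the per-iteration cost cancels."""
    return prp_iterations(p) / pminus1_stage1_iterations(B1)

# With B1 = 500000 fixed, the ratio grows linearly with the exponent:
for p in (2_600_000, 26_000_000, 260_000_000):
    print(p, round(cost_ratio(p, 500_000), 2))
```

Under this simplification, P−1-first pays off roughly when (probability of finding a factor) × (PRP cost avoided) exceeds the P−1 cost, which is why the break-even point shifts as exponents grow.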
I have run the P−1 algorithm with the bounds cited above on exponents below 2.3M with known factors, and the computer found several hundred new factors.
Also, the amount of ECM done on exponents above 2.3M is very low, especially for Mersenne numbers with known factors. 
I accept the fact that you know WAYYY more about P−1 than I do, so feel free to elaborate on the following as you see fit.
What I have done is create a table with various B1/B2 bounds (yours is first) for P−1 in the range you suggest. This specific table uses exponent 2,600,001. The columns are:

B1/B2: various choices of B1 and B2 bounds for the new P−1.
Pct: the odds of finding a factor, strictly based on this: [url]http://www.mersenne.ca/prob.php[/url]
GD: the GHz-days to do the P−1, based on the same URL.

Then four pairs of columns, one per assumed success rate of the existing P−1 run (most exponents in the 2.6M range have existing P−1 runs with an estimated success rate of about 3.25% - 4%):

3.25 (etc.): the number at the top of each pair of columns is the odds of the existing P−1 run finding a factor.
PerFac: the calculated number of P−1 runs with the new bounds required to find a factor, taking into account the odds from the prior P−1 run.
GD/F: the calculated GHz-days required to find a new factor based on the above parms. I suggest we want to minimize this number.

For example, from the first row: statistically, 68.97 P−1 runs would be required to find a factor, based on the net percentage of 4.70 − 3.25 (new − existing P−1). Then, at 0.13 GHz-days per new P−1 run, it would take 8.97 total GHz-days of P−1 work to find the new factor.

NOTE: What this table does NOT take into account is the ECM work already done on these exponents. More than I could bite off.

[CODE]
                             3.25           3.50           3.75           4.00
B1      B2        Pct  GD    PerFac GD/F    PerFac GD/F    PerFac GD/F    PerFac GD/F
500000  15000000  4.70 0.13  68.97   8.97   83.33  10.83   105.26 13.68   142.86 18.57
1000000 20000000  5.45 0.21  45.45   9.32   51.28  10.51   58.82  12.06   68.97  14.14
2000000 40000000  6.62 0.42  29.67  12.46   32.05  13.46   34.84  14.63   38.17  16.03
2600000 52000000  7.08 0.53  26.11  13.84   27.93  14.80   30.03  15.92   32.47  17.21
2600000 26000000  6.32 0.39  32.57  12.70   35.46  13.83   38.91  15.18   43.10  16.81
3000000 60000000  7.34 0.61  24.45  15.01   26.04  15.99   27.86  17.10   29.94  18.38
4000000 80000000  7.86 0.82  21.69  17.77   22.94  18.78   24.33  19.93   25.91  21.22
5200000 104000000 8.33 1.06  19.69  20.87   20.70  21.95   21.83  23.14   23.09  24.48
[/CODE] 
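The PerFac and GD/F columns above can be reproduced with a couple of lines, assuming the net success probability is simply (new Pct − existing Pct); the function names here are my own:

```python
def runs_per_factor(new_pct, existing_pct):
    """Expected number of P-1 runs (at the new bounds) per new factor,
    given percentage odds for the new bounds and for the prior run."""
    net = (new_pct - existing_pct) / 100.0
    return 1.0 / net

def ghzdays_per_factor(new_pct, existing_pct, ghzdays_per_run):
    """Expected GHz-days of P-1 work per new factor found (the GD/F column)."""
    return runs_per_factor(new_pct, existing_pct) * ghzdays_per_run

# First table row: Pct=4.70, GD=0.13, prior P-1 at 3.25%
print(round(runs_per_factor(4.70, 3.25), 2))           # PerFac -> 68.97
print(round(ghzdays_per_factor(4.70, 3.25, 0.13), 2))  # GD/F   -> 8.97
```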
For Mersenne numbers with small known factors in the range around 4.2M, there is no previous P−1 done, no ECM done, and they were trial-factored only up to 64 bits. That means I would find a factor in about 21 attempts.
At the moment I am running P−1 starting from 4.2M upward. 
Does P−1 actually work on exponents with known factors, like this?

[CODE]Pminus1=N/A,1,2,7508981,1,100000000,10000000000,"45053887,60071849,285341279,585700519,26356523311,20333239254737,18694135089678809,281287549065522023,346309182073938289,367107436768162151,1211907173840894224264391"[/CODE]

I did some of these, but got error messages and no P−1 shows up on the exponents:

[CODE]Sending result to server: UID: athath/ec21xb, M7508981 completed P1, B1=105000000, B2=3000000000, E=12, Wg8: 7B49A837
Sending result to server: UID: athath/ec21xb, M9100919 completed P1, B1=50000000, B2=2000000000, E=12, Wg8: 91BADB7F
PrimeNet error 40: No assignment
P1 result for M7508981 was not needed
PrimeNet error 40: No assignment
P1 result for M9100919 was not needed[/CODE] 
Just send them using the manual testing menu:
[url]https://www.mersenne.org/manual_result/[/url] 
Thanks, I was pretty sure I had already tried this, but I guess I forgot.

Has anyone found factors with P1 using the already known factors in the assignment?

[QUOTE=ATH;472330]Has anyone found factors with P1 using the already known factors in the assignment?[/QUOTE]
Yes, I've done this. For instance, for the exponents linked in the post above. I did this with UsePrimenet=0 and submitted the results manually. You get no Primenet credit (unless you found a factor), but it does add a line to the History section. 
I can create a table using various B1/B2 bounds, looking up the following two values from [url]http://www.mersenne.ca/prob.php[/url]:

Percent: the odds of the P−1 finding a factor.
GhzDays: the GHz-days needed to do the P−1.

But that is time consuming. Can anyone point me at the formula that website uses, so I can compute these values myself knowing the exponent, B1, B2, and bits factored? When I try the "show source" link I get:

[CODE]Fatal error: Call to undefined function htmlentities_safe() in /var/www/vhosts/mersenne.ca/httpdocs/prob.php on line 15[/CODE] 
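For what it's worth, the usual approach (I don't know the exact formula mersenne.ca uses) estimates the odds from the smoothness probability of q−1 via Dickman's rho function. A very simplified sketch, with my own function names; real estimators also credit the known factor 2p in q−1 = 2kp, handle stage 2 separately, and integrate over the untried factor bit levels, so this will underestimate:

```python
import math

def dickman_rho(u, steps=1000):
    """Dickman's rho: probability that a random integer n has no prime
    factor exceeding n^(1/u). Computed by Euler-stepping the delay
    differential equation rho'(t) = -rho(t-1)/t, with rho(t)=1 for t<=1."""
    if u <= 1:
        return 1.0
    h = u / steps
    rho = [1.0] * (steps + 1)
    for i in range(1, steps + 1):
        t = i * h
        if t > 1:
            j = int(round((t - 1) / h))          # grid index of t-1
            rho[i] = rho[i - 1] - h * rho[j] / t  # one Euler step
    return rho[steps]

def p1_stage1_success(bits, B1):
    """Crude odds that a ~bits-bit factor q survives P-1 stage 1,
    i.e. that q-1 is B1-smooth; u = log(q)/log(B1). Ignores stage 2
    and the free factor 2p in q-1, so it's pessimistic."""
    u = (bits * math.log(2)) / math.log(B1)
    return dickman_rho(u)
```

As a sanity check, rho(2) should come out near 1 − ln 2 ≈ 0.307, the exact value.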