2010-02-06, 12:18  #1 
Oct 2006
103_{10} Posts 
P-1 discussion thread
I'm new to this project, and want to know if there are reasons against P-1 or ECM. According to the XYYXF project, a factor of at most 20 digits needs about 30 curves at B1=2000 to be found with ECM. And if I'm not mistaken, we're currently sieving for factors of up to 16 digits. So could ECM (or P-1) be another method for prechecking the candidates before their PRP testing? 
2010-02-06, 12:29  #2 
"Phil"
Sep 2002
Tracktown, U.S.A.
3·373 Posts 
When we started the project, P-1 was definitely not an efficient way to eliminate PRP candidates, even though our sieve depth was low. Now that PRP tests are taking 1.5 days on a single core of my older machine (a Pentium D), and probably the better part of a day even on newer, faster machines, we should revisit P-1. Can't Prime95 compute an optimal P-1 level if the sieving depth and number of tests saved are set? I've been meaning to post about this for the last couple of weeks and just haven't gotten around to it. I honestly expect that ECM will not find nearly as many factors per unit of time as P-1, but we may be getting to the point where P-1 is of more immediate help to the project than further sieving.

2010-02-06, 12:52  #3  
Account Deleted
"Tim Sorbera"
Aug 2006
San Antonio, TX USA
2^{2}·11·97 Posts 
Quote:
Code:
Optimal bounds are B1=25000, B2=212500
Chance of finding a factor is an estimated 0.519%


2010-02-06, 13:23  #4 
Just call me Henry
"David"
Sep 2007
Cambridge (GMT/BST)
3·1,973 Posts 
In 620 seconds I did a P-1 test on 2^5720440+40291:
Code:
[Work thread Feb 6 12:50] Optimal bounds are B1=20000, B2=160000
[Work thread Feb 6 12:50] Chance of finding a factor is an estimated 0.425%
One PRP test would have to take 145882 seconds (40 hours) for P-1 to be optimal. You plan to do a double-check, I think, so 72941 seconds (20 hours). So as long as one PRP test takes longer than 20 hours, P-1 would be worthwhile. Last fiddled with by henryzz on 2010-02-06 at 13:23 
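Henryzz's break-even arithmetic can be sketched as follows. This is only a back-of-the-envelope model (it assumes a found factor saves the full PRP test, or two of them with a double-check); the 620 s runtime and 0.425% factor probability are the figures from the run above:

```python
# Break-even PRP-test time for a P-1 run: P-1 pays off when
#   p1_seconds < factor_prob * tests_saved * prp_seconds,
# so the break-even PRP time is p1_seconds / (factor_prob * tests_saved).
def breakeven_prp_seconds(p1_seconds, factor_prob, tests_saved=1.0):
    return p1_seconds / (factor_prob * tests_saved)

# Figures from the run above: 620 s for the P-1 test, 0.425% factor chance.
first_time = breakeven_prp_seconds(620, 0.00425, tests_saved=1)  # ~145882 s (~40 h)
with_dc = breakeven_prp_seconds(620, 0.00425, tests_saved=2)     # ~72941 s (~20 h)
print(round(first_time), round(with_dc))
```

Halving the break-even time when a double-check is expected is why the tests-saved setting matters so much in the posts below.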
2010-02-06, 18:06  #5 
"Phil"
Sep 2002
Tracktown, U.S.A.
3·373 Posts 
Thanks, MiniGeek and Henryzz. The low probability of finding a factor (about 1 in 200 P-1 tests), along with the fact that we would have spent almost as much time doing P-1 tests as we would have saved, indicates that up until now we haven't missed out on much by not doing P-1, but it looks like it is becoming more worthwhile now that PRP tests are taking longer. My take on double-checking is that we may or may not do one, so perhaps it would be more appropriate to set the number of PRP tests saved to 1, or something close to it. Then P-1 becomes primarily a way of speeding up first-time PRP testing. Don't forget that we are still sieving 2M-50M, so there is a chance that a factor found by P-1 could later turn up in the sieve. I'm going to play around with it, and perhaps we can set up a new sticky thread for P-1 reservations soon.
Last fiddled with by philmoore on 2010-02-06 at 18:12 
2010-02-06, 18:29  #6 
"Phil"
Sep 2002
Tracktown, U.S.A.
3·373 Posts 
I tried "Pfactor=1,2,6069928,40291,50,2" with memory usage at 256 MB, and it started with B1=25000, B2=200000. However, when I reduced the number of PRP tests saved to anything from 1 to 1.9, it said that P-1 factoring was not needed. So we really are at the margin, where it is just barely justifiable with double-checking assumed, but I expect that by the time PRP testing gets up to 7-million-bit exponents, it will be paying off. I'll do some more checking with larger exponents. Thanks, Rincewind, for starting this thread at this time.
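For readers unfamiliar with the worktodo syntax: as far as I know, a Pfactor entry in Prime95 has the form below (the field names are my annotation, not from this thread), so the line quoted above asks Prime95 to choose optimal P-1 bounds for 1*2^6069928+40291, already sieved to 50 bits, crediting 2 saved PRP tests per factor found:

```
Pfactor=k,b,n,c,how_far_factored,tests_saved
Pfactor=1,2,6069928,40291,50,2
```

Lowering the last field toward 1 is exactly the "tests saved" experiment being described in this post.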

2010-02-06, 18:33  #7  
Just call me Henry
"David"
Sep 2007
Cambridge (GMT/BST)
3×1,973 Posts 
Quote:


2010-02-06, 18:49  #8 
"Phil"
Sep 2002
Tracktown, U.S.A.
3·373 Posts 
I don't think that memory was the critical issue, as my B1 and B2 bounds were actually larger than yours. The only thing I changed was the number of PRP tests saved.
Last fiddled with by philmoore on 2010-02-06 at 18:50 Reason: Yes, MB! 
2010-02-06, 20:41  #9 
Account Deleted
"Tim Sorbera"
Aug 2006
San Antonio, TX USA
4268_{10} Posts 
Assuming 50 bits of sieving and (first column) tests saved, Prime95 starts wanting to do P-1 at n = (second column) (checking every 500K of n).
Code:
tests saved   P-1 worthwhile from
1.0           11.0M
1.5           7.5M
2.0           5.5M
Assuming 50 bits of sieving and 1 test saved, Prime95 doesn't think you should do P-1 until 11M. Assuming 50 bits of sieving and 1.5 tests saved, Prime95 doesn't think you should do P-1 until 7.5M. Assuming 50 bits of sieving and 2 tests saved, Prime95 thinks you should've started P-1 at 5.5M. So whether P-1 should be done now depends entirely on whether we expect to double-check these numbers. Past 11M, it should be done either way. Last fiddled with by MiniGeek on 2010-02-06 at 20:42 
2010-02-06, 20:43  #10 
"Phil"
Sep 2002
Tracktown, U.S.A.
3·373 Posts 
I did a few more experiments, setting the number of PRP tests saved to 1.7. My reasoning is that if and when we get a systematic double-check going, it will probably lag behind first-time checking by a factor of 2 to 3, which gives us about a 20-25% chance of finding the last probable prime before a particular exponent gets double-checked. Under those circumstances, Prime95 decides that P-1 factoring becomes worthwhile somewhere between 7.25 million and 7.5 million. That assumes, of course, that the sieving depth is still 50 bits; it could well be around 51 bits by then. (It is currently about 50.1; I think the program will accept a floating-point value.)
Double-checking is currently almost complete to 1.25M (for 40291 only), and I am continuing on the range 1.25M-1.40M. We have done spot checking in higher ranges, but we'll probably continue to discuss what combination of systematic double-checking and spot checking makes the most sense. 
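One way to read the 1.7 figure (my interpretation, not stated explicitly in the thread): a factor always saves the first-time test, and saves the double-check too unless the project ends first, so the expected tests saved is 2 minus the probability the project ends before the double-check runs:

```python
# Expected number of PRP tests saved by a P-1 factor, assuming the
# first-time test is always saved and the double-check is saved unless
# the last probable prime is found before the double-check happens.
def expected_tests_saved(p_project_ends_first):
    return 2 - p_project_ends_first

print(expected_tests_saved(0.25))  # 1.75
print(expected_tests_saved(0.3))   # 1.7
```

A 20-30% chance of the project ending first gives roughly 1.7-1.8 expected tests saved, consistent with the 1.7 fed to Prime95 here.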
2010-02-08, 18:46  #11 
"Phil"
Sep 2002
Tracktown, U.S.A.
3×373 Posts 
I am doing an experiment: P-1 factoring starting at 6.2 million with bounds B1=25000 and B2=200000. I will remove candidates from the PRP work files as I find factors. Worst case, this doesn't pay off until (and unless) we double-check, but at least we will eliminate some PRP tests at a time when sieving has slowed down. At any rate, I will be interested in the data, and will post the factors as I find them. It will be interesting to see how many factors would have been found with a little more sieving, and how many would not have been found for quite a while.
