Using Prime95 P-1/Pfactor to find factors on CRUS bases
When masser suggested that [B]very light P-1 factoring would further significantly reduce the likelihood of an unfactored, large exponent term with algebraic factors[/B], I decided to try this new (for me) area of interest.

I know that Pfactor and Pminus1 are not the same, but they seem to work similarly in Prime95. So I tried it on my NPG sequences with 2.1M digits and above (bases 155 and 212) in order to find some factors. I ran it all night, and in the morning I had found a few new factors.

Now that I had test data, I tried to optimize: I ran different settings to find which ones would find all the factors in my test data while being faster than the first run. In the end I found that Pfactor=4,212,n,1,45,8 was the winning combination. Since Prime95 writes in the log which B1 and B2 it chooses as optimal, I also played with the Pminus1 option in Prime95 and got pretty much the same result.

Last night I tried to go much deeper (B2 set to 2000000), and in the morning I was surprised to find only one new factor (yes, I know there is no need to repeat the same B1, B2 settings several times, since that will (almost) always find the same factors). I also played with base-2 candidates, and in that case (since I sieve that sequence with a GPU) I found no new factors, but I did find it is much faster than on any other base.

So I have a question: why does a much deeper B2 (from 700000 to 2000000) give such a small number of new factors? There must be some reason why masser suggested [B]very light P-1 factoring[/B] (in my case, my experiments suggest the values for very light P-1 are B1=20000, B2=400000).

Some of you may think it is useless to use P-1/Pfactor to remove candidates, and maybe srsieve will find the factors faster, but this is an experiment, and every candidate less is a good thing :smile: If you have any ideas or suggestions, feel free to write them. Thanks for reading!
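For reference, the worktodo.txt lines I used looked like this (n stands for the exponent of each candidate; as far as I understand the Prime95 format, the fields are k,b,n,c followed by the assumed trial-factoring depth in bits and the number of primality tests saved for Pfactor, or B1,B2 for Pminus1 -- please correct me if I have the field meanings wrong):

```
Pfactor=4,212,n,1,45,8
Pminus1=4,212,n,1,20000,400000
```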
Additional info
From the CUDA PPS sieve I know of two factors:

451228441149281 | 28561*2^4020784+1
478743809616001 | 83521*2^4011252+1

Prime95 successfully detects one of them, but not the other, regardless of whether I use Pfactor or Pminus1 (and regardless of how low the bounds are set).

Pminus1:
P-1 found a factor in stage #2, B1=20000, B2=700000, E=6.
83521*2^4011252+1 has a factor: 478743809616001 (P-1, B1=20000, B2=700000, E=6)
28561*2^4020784+1 completed P-1, B1=20000, B2=700000, E=6, We4: 6815670D

Pfactor:
[Mon Jan 16 16:02:46 2017]
83521*2^4011252+1 completed P-1, B1=55000, B2=660000, E=6, We4: 67E373DB
28561*2^4020784+1 completed P-1, B1=55000, B2=660000, E=6, We4: 68157375

It looks like I need to learn more :)
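The difference between the two factors can be checked directly: P-1 with bounds B1, B2 finds a prime factor p only if p-1 is B1-smooth apart from at most one prime between B1 and B2 (Prime95's E=6 Brent-Suyama extension can occasionally stretch this a little). Here is a rough Python sketch of that test, with my own helper names, not anything from Prime95:

```python
def factor_below(n, limit=10**6):
    """Trial-divide n by all primes below `limit`.
    Returns (prime factors with multiplicity, remaining cofactor)."""
    factors = []
    d = 2
    while d < limit and d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1 if d == 2 else 2
    if 1 < n < limit:          # leftover prime below the limit
        factors.append(n)
        n = 1
    return factors, n

def pm1_findable(p, B1, B2, limit=10**6):
    """True if plain P-1 with these bounds can find the prime factor p."""
    factors, cofactor = factor_below(p - 1, limit)
    if cofactor > 1:           # p-1 has a factor above `limit`, hence above B2
        return False
    over_b1 = [q for q in factors if q > B1]
    # stage 1 covers everything <= B1; stage 2 allows one extra prime <= B2
    return over_b1 == [] or (len(over_b1) == 1 and over_b1[0] <= B2)

for p in (451228441149281, 478743809616001):
    print(p, factor_below(p - 1), pm1_findable(p, B1=20000, B2=700000))
```

If p-1 for the missed factor turns out to have a prime factor above B2, that would explain why Prime95 misses it regardless of which option is used.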
P-1 is effective for inputs that [B]have a known partial factorization[/B] of all future factors. A trivial example is Mersenne primes, as well as Wagstaffs, GFNs, GUs, GMs, EMs. All of them have factors P, with P-1 known to have 20-25 bits already factored. You get this 20-25 bit boost for free, and on top of it you get an additional factor of P-1; as a result you get a factor P above what you can reach with TF/sieving.
For most CRUS candidates, you only know that P-1 has a factor of 2 (which is no new information, and hence no boost), and a P-1 curve will be about as effective as one curve of ECM. Occasionally, yes, you will get a factor, in line with random probability.
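To make the Mersenne "boost" concrete: every prime factor q of 2^p-1 (p prime) satisfies q ≡ 1 (mod 2p), so q-1 comes with a known factor of 2p before P-1 even starts. A tiny sketch with the well-known factorization 2^11-1 = 23 * 89:

```python
p = 11
M = 2**p - 1                   # 2047 = 23 * 89
for q in (23, 89):
    assert M % q == 0           # q really divides 2^11 - 1
    # the order of 2 mod q is p, so p | q-1; q-1 is even, so 2p | q-1
    k = (q - 1) // (2 * p)
    assert q == 2 * k * p + 1
    print(f"{q} = 2*{k}*{p} + 1")
```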
So sequences like 4*155^n+1 or 4*20^n+1 or 4*50^n+1 etc. will gain some boost?
Yes, a 1-bit boost. For even n they are "GFNs" of form x^2+1, so all their odd prime factors are of the form 4k+1.
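For even n this is easy to check numerically: 4*155^n+1 = (2*155^(n/2))^2 + 1, and any odd prime dividing x^2+1 must be ≡ 1 (mod 4). A quick sketch (plain trial division, my own helper):

```python
def small_prime_factors(n, limit=10**4):
    """Distinct primes below `limit` dividing n (plain trial division)."""
    found = []
    d = 2
    while d < limit:
        if n % d == 0:
            found.append(d)
            while n % d == 0:
                n //= d
        d += 1
    return found

for n in (2, 4, 6):
    N = 4 * 155**n + 1          # odd, and a square plus one for even n
    qs = small_prime_factors(N)
    assert all(q % 4 == 1 for q in qs), (n, qs)
    print(n, qs)
```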
[QUOTE=Batalov;451056] You get this 20-25 bit boost for free and on top of it you get an additional factor of P-1, and as a result you get a factor P which is above what you can get with TF/sieving.[/QUOTE]
So did I do something wrong if I got only a 1-bit boost, when you say above it is a 20-25 bit boost? It seems I don't use Prime95 properly.
[QUOTE=Batalov;451056]A trivial example is Mersenne primes[/QUOTE]
Yeah, they factor as 1 times M... :razz: (sorry, I could not stop myself, hehe)
Your "M" does have a factor M, for which M-1 is indeed = 2kp.
Please, by all means, continue correcting my missing commas and other typos. You are indispensable in this role. :rolleyes:
Just do not start a fight; everything else is allowed :smile:
After a few days of playing with different B1 and B2 values, I came to the conclusion that the removal rate is about a constant 2.2%. So for small numbers it is much slower than LLR, but on huge numbers (as Batalov says) it makes some sense!
And of course it is good to learn something new. :smile: Thanks, masser, for pointing me to P-1.
[QUOTE=pepi37;451272]I came to the conclusion that the removal rate is about a [B]constant 2.2%[/B], so for small numbers it is much slower than LLR, ...[/QUOTE]
Can you run [B]45 P-1 tests[/B] (which will remove 1 candidate, based on your own 2.2% value) faster than 1 LLR test?

[QUOTE=pepi37;451272]...but on huge numbers (as Batalov says) it makes some sense![/QUOTE] I was just being polite. On these, it most likely doesn't. I explained above on which candidates it makes some sense.
[QUOTE=Batalov;451276]Can you run [B]45 P-1 tests[/B] (which will remove 1 candidate -- based on your own 2.2% value) faster than 1 LLR test?
[/QUOTE] In fact I can: I can run [B]90 tests[/B] with B1=20000 and B2=700000 in the time [B]of one LLR[/B] test on 4*155^930xxx+1.

One P-1 test takes about 345 seconds and one LLR test takes 31300 seconds.
If that's true, that you can run 90 tests at B1=20000 in the time of one LLR test, then a P-1 pass on your entire input file will save you overall about half the time spent on P-1. If you increase B1 to 30000 (with B2 perhaps 1e6) for your one P-1 pass, you should find a success rate better than you had for 20000, and might save even more time.
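The arithmetic behind "is it worth it" is simple: a P-1 pass pays off when removal_rate × LLR_time exceeds the P-1 time per candidate. A quick sketch using the numbers quoted above (345 s per P-1 test, 31300 s per LLR test, a 2.2% removal rate):

```python
def pm1_expected_saving(t_pm1, t_llr, removal_rate):
    """Expected seconds saved per candidate by running P-1 first:
    a removal saves one LLR test; every candidate costs one P-1 test."""
    return removal_rate * t_llr - t_pm1

saving = pm1_expected_saving(t_pm1=345, t_llr=31300, removal_rate=0.022)
print(f"expected saving per candidate: {saving:.1f} s")
print(f"break-even removal rate: {345 / 31300:.2%}")
```

With these timings, P-1 breaks even at roughly a 1.1% removal rate, so 2.2% leaves a healthy margin.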
[QUOTE=VBCurtis;451282][B]If that's true[/B], that you can run 90 tests at B1=20000 in the time of one LLR test, then a P-1 pass on your entire input file will save you overall about half the time spent on P-1. If you increase B1 to 30000 (with B2 perhaps 1e6) for your one P-1 pass, you should find a success rate better than you had for 20000, and might save even more time.[/QUOTE]
I am now experimenting with B1=70000 and B2=700000. I have processed only a small number of candidates, but it looks like the chance of removal is a little over 2.2% (now about 2.6%).

And yes, I will run P-1 on the whole input file by the time the candidates exceed 2M digits. It does not really matter if I lose a few hours or even a day on that; I will also remove some candidates. And since P-1 is a trusted method, and all factors found by P-1 really are factors, I am happy with this method :)

P.S. I have noticed, not only on this forum but on many others, that members are very distrustful of facts that can easily be proved or disproved in a few minutes :) (thinking out loud) If I lie, I lie to myself, not to you, since you will not use P-1, so you will not lose any time at all :)
Last sequence finished.
From 4946 candidates, 160 were removed, so the removal rate is 3.2%. The values were B1=70000, B2=700000. The P-1 time for one candidate is 137 seconds (2 cores).

This sequence has an algebraic factorization, so maybe that is the reason for the increased removal rate.
I just did some timings for 4*155^930000+1
P-1 B1=70k, B2=700k: 351 seconds
PRP test: 12600 seconds

The P-1 test takes 2.8% of the time of the PRP test, so we need a factor rate of >2.8% after sieving. You claim to have 3.2%, which sounds good. However, if there are algebraic factors then the rate will be doubled (assuming two factors); your rate is probably actually 1.6% per factor.

There are potential applications for P-1 at CRUS, though. Testing on 1597*6^n-1 has reached n=5M. At this level:

P-1 B1=70k, B2=700k: 700 seconds
PRP test: 105000 seconds

The P-1 test takes 0.67% of the time of the PRP test, so we need a factor rate of >0.67% after sieving. This should hopefully be possible even with deeper sieving.
[QUOTE=henryzz;451332]I just did some timings for 4*155^930000+1
[/QUOTE] [URL="https://en.wikipedia.org/wiki/Kozma_Prutkov"]Kozma Prutkov[/URL] once wrote: "Throwing pebbles into the water, look at the ripples they form on the surface; otherwise, such occupation becomes an idle pastime." :rolleyes:

Even if you were just throwing pebbles, why not take a realistic example, and not an [URL="http://factordb.com/index.php?id=1100000000895937735"]obviously composite number[/URL]?

Additional question for bonus points: now that you have thrown this pebble, are you any closer to proving or disproving the claim that started this vanity thread: "[COLOR=DarkRed]Very light P-1 factoring would further significantly reduce the likelihood of an unfactored, large exponent term with algebraic factors[/COLOR]"?

It certainly did in this case, right? You lightly factored this candidate, and [URL="http://www.goodreads.com/quotes/3679-if-the-doors-of-perception-were-cleansed-every-thing-would"]the doors of perception were suddenly cleansed[/URL]... and you saw this number for what it was: a composite. :book:
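For the record, the reason 4*155^930000+1 is "obviously composite": when 4 divides n, 4*b^n+1 = 4*m^4+1 with m = b^(n/4), and the Sophie Germain identity gives 4*m^4+1 = (2m^2+2m+1)(2m^2-2m+1). A tiny check:

```python
def sophie_germain_split(m):
    """The two algebraic factors of 4*m^4 + 1."""
    return 2*m*m + 2*m + 1, 2*m*m - 2*m + 1

m = 3
a, b = sophie_germain_split(m)
assert a * b == 4 * m**4 + 1       # 25 * 13 == 325

# 930000 is divisible by 4, so this applies to 4*155^930000+1
# with m = 155**232500; here is a small analogue with n = 4:
a, b = sophie_germain_split(155)
assert a * b == 4 * 155**4 + 1
```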
[QUOTE=henryzz;451332]
PRP test: 12600 seconds [/QUOTE] Tell me, what CPU has that timing (how many cores)?

[QUOTE=henryzz;451332] Your rate is probably actually 1.6% per factor. [/QUOTE] If I take the number of candidates I tested with P-1 and the number of factors found, and that gives me 3.2%, then it is 3.2%. It cannot be 1.6%. And also, not every range I tested has algebraic factors, nor was every range tested with the same B1 and B2 values.
[QUOTE=Batalov;451334][URL="https://en.wikipedia.org/wiki/Kozma_Prutkov"]Kozma Prutkov[/URL] once wrote:
"Throwing pebbles into the water, look at the ripples they form on the surface, otherwise, such occupation becomes an idle pastime” :rolleyes: [/QUOTE] Kuzma Prutkov also say this: If you want to be happy, be so :smile: No harm was done with P-1 factoring, and if that make me happy , then I will be happy :) Not every one here has computing power, like you have, dont forget that fact :razz: |
[QUOTE=pepi37;451338]Tell me, what CPU has that timing (how many cores)?

If I take the number of candidates I tested with P-1 and the number of factors found, and that gives me 3.2%, then it is 3.2%. It cannot be 1.6%. And also, not every range I tested has algebraic factors, nor was every range tested with the same B1 and B2 values.[/QUOTE] My 1.6% assertion was based upon every number tested having two algebraic factors.

This was on one core of a Skylake 6700K at 4GHz.
[QUOTE=henryzz;451342]
This was on one core of a Skylake 6700K at 4GHz.[/QUOTE] It is time to throw out my "old" Intels... :smile:
[QUOTE=pepi37;451343]It is time to throw out my "old "Intels.... :smile:[/QUOTE]
I recently upgraded from a Q6600. 5x faster per core.
[QUOTE=henryzz;451550]I recently upgraded from a Q6600. 5x faster per core.[/QUOTE]
The Q6600 is a really old one :)
[QUOTE=Batalov;451056]P-1 is effective for inputs that [B]have a known partial factorization[/B] of all future factors. A trivial example is Mersenne primes, as well as Wagstaffs, GFNs, GUs, GMs, EMs. All of them have factors P, with P-1 known to have 20-25 bits already factored. You get this 20-25 bit boost for free, and on top of it you get an additional factor of P-1; as a result you get a factor P above what you can reach with TF/sieving.
For most CRUS candidates, you only know that P-1 has a factor of 2 (which is no new information, and hence no boost), and a P-1 curve will be about as effective as one curve of ECM. Occasionally, yes, you will get a factor, in line with random probability.[/QUOTE] Can you give me one example of a GFN sequence that I can search with LLR and sieve with srsieve, and where I will get this 20-25 bit boost (let that sequence be unexplored territory) :)?