[QUOTE=Batalov;451276]Can you run [B]45 P-1 tests[/B] (which will remove 1 candidate -- based on your own 2.2% value) faster than 1 LLR test?
[/QUOTE] In fact I can: I can run [B]90 tests[/B] with B1=20000 and B2=700000 in the time [B]of one LLR[/B] test on 4*155^930xxx+1. One P-1 test takes about 345 seconds, and one LLR test takes 31300 seconds.
If that's true, that you can run 90 tests at B1=20000 in the time of one LLR test, then a P-1 pass on your entire input file will save you overall about half the time spent on P-1. If you increase B1 to 30000 (with B2 perhaps 1e6) for your one P-1 pass, you should find a success rate better than you had for 20000, and might save even more time.
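The arithmetic behind this claim can be sketched in a few lines of Python. This is a minimal sketch, not anyone's actual workflow: the function name is mine, and the inputs are the timings and the ~2.2% factor rate quoted in this thread.

```python
# Expected time per candidate with a P-1 pre-pass versus LLR alone,
# using the numbers quoted in this thread: 345 s per P-1 test,
# 31300 s per LLR test, ~2.2% chance P-1 removes a candidate.
def net_llr_time_per_candidate(p1_seconds, llr_seconds, factor_rate):
    """Every candidate pays the P-1 cost; only the survivors pay the LLR cost."""
    with_p1 = p1_seconds + (1 - factor_rate) * llr_seconds
    return with_p1, llr_seconds

with_p1, without_p1 = net_llr_time_per_candidate(345, 31300, 0.022)
print(f"with P-1:    {with_p1:.0f} s/candidate")
print(f"without P-1: {without_p1:.0f} s/candidate")
# The pre-pass pays off whenever factor_rate > p1_seconds / llr_seconds.
```

With these inputs the pre-pass wins narrowly, which matches the thread's point that raising B1 (and hence the factor rate) can widen the margin as long as the P-1 time stays small relative to the LLR time.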
[QUOTE=VBCurtis;451282][B]If that's true[/B], that you can run 90 tests at B1=20000 in the time of one LLR test, then a P-1 pass on your entire input file will save you overall about half the time spent on P-1. If you increase B1 to 30000 (with B2 perhaps 1e6) for your one P-1 pass, you should find a success rate better than you had for 20000, and might save even more time.[/QUOTE]
I am now experimenting with B1=70000 and B2=700000. I have only a small number of processed candidates so far, but it looks like the chance of removing one is a little over 2.2% (currently about 2.6%). And yes, I will run P-1 on the whole input file by the time the candidates exceed 2M digits. It doesn't really matter if I lose a few hours or even a day on that; I will also remove some candidates. And since P-1 is a trusted method, and all factors found by P-1 really are factors, I am happy with this method :)

P.S. I have noticed, not only on this forum but on many others, that members are very distrustful of facts that can easily be proved or disproved in a few minutes :) (thinking out loud) If I lie, I lie to myself, not to you, since you will not use P-1, so you will not lose any time at all :)
Last sequence finished.
From 4946 candidates, 160 were removed, so the removal rate is 3.2%. The values were B1=70000, B2=700000. P-1 time for one candidate is 137 seconds (2 cores). This sequence has an algebraic factorization, so maybe that is the reason for the increased removal rate.
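As a quick sanity check on the quoted rate (a trivial sketch; the variable names are mine):

```python
# Removal rate from the run reported above:
# 160 factors found among 4946 candidates.
removed, candidates = 160, 4946
rate = removed / candidates
print(f"removal rate: {rate:.1%}")  # a little over 3.2%
```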
I just did some timings for 4*155^930000+1
P-1 B1=70k, B2=700k: 351 seconds
PRP test: 12600 seconds

The P-1 test takes 2.8% of the time of the PRP test, so we need a factor rate of >2.8% after sieving. You claim to have 3.2%, which sounds good. However, if there are algebraic factors then the rate will be doubled (assuming two factors), so your rate is probably actually 1.6% per factor.

There are potential applications for P-1 at CRUS, though. Testing on 1597*6^n-1 has reached n=5M. At this level:

P-1 B1=70k, B2=700k: 700 seconds
PRP test: 105000 seconds

Here the P-1 test takes only 0.67% of the time of the PRP test, so we need a factor rate of >0.67% after sieving. This should hopefully be possible even with deeper sieving.
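The break-even threshold used above can be expressed directly. This is a minimal sketch under the timings quoted in this post (the helper name is mine):

```python
# A P-1 pre-pass is worthwhile when the fraction of candidates it
# removes exceeds the ratio of P-1 time to PRP time.
def break_even_rate(p1_seconds, prp_seconds):
    return p1_seconds / prp_seconds

# Timings quoted above:
print(f"{break_even_rate(351, 12600):.2%}")   # 4*155^930000+1
print(f"{break_even_rate(700, 105000):.2%}")  # 1597*6^n-1 near n=5M
```

The design point is that the threshold falls as the PRP test grows, which is why P-1 becomes more attractive at larger n even with the same B1/B2 bounds.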
[QUOTE=henryzz;451332]I just did some timings for 4*155^930000+1
[/QUOTE] [URL="https://en.wikipedia.org/wiki/Kozma_Prutkov"]Kozma Prutkov[/URL] once wrote: "Throwing pebbles into the water, look at the ripples they form on the surface; otherwise, such occupation becomes an idle pastime." :rolleyes: Even if you were just throwing pebbles, why not take a realistic example, rather than an [URL="http://factordb.com/index.php?id=1100000000895937735"]obviously composite number[/URL]? Additional question for bonus points: now that you have thrown this pebble, are you any closer to proving or disproving the claim that started this vanity thread: "[COLOR=DarkRed]Very light P-1 factoring would further significantly reduce the likelihood of an unfactored, large exponent term with algebraic factors[/COLOR]"? It certainly did in this case, right? You lightly factored this candidate, and [URL="http://www.goodreads.com/quotes/3679-if-the-doors-of-perception-were-cleansed-every-thing-would"]the doors of perception were suddenly cleansed[/URL]... and you saw this number for what it was - a composite. :book:
[QUOTE=henryzz;451332]
PRP test: 12600 seconds [/QUOTE] Tell me, what CPU has that timing (how many cores)? [QUOTE=henryzz;451332] Your rate is probably actually 1.6% per factor. [/QUOTE] If I take the number of candidates I tested with P-1 and the number of factors found, and that gives me 3.2%, then it is 3.2%. It cannot be 1.6%. Also, not every range I take has algebraic factors, and not every range was tested with the same B1 and B2 values.
[QUOTE=Batalov;451334][URL="https://en.wikipedia.org/wiki/Kozma_Prutkov"]Kozma Prutkov[/URL] once wrote:
"Throwing pebbles into the water, look at the ripples they form on the surface; otherwise, such occupation becomes an idle pastime." :rolleyes: [/QUOTE] Kozma Prutkov also said this: "If you want to be happy, be so." :smile: No harm was done with P-1 factoring, and if that makes me happy, then I will be happy :) Not everyone here has the computing power that you have; don't forget that fact :razz:
[QUOTE=pepi37;451338]Tell me what CPU has that timing ( how many cores)?
If I take the number of candidates I tested with P-1 and the number of factors found, and that gives me 3.2%, then it is 3.2%. It cannot be 1.6%. Also, not every range I take has algebraic factors, and not every range was tested with the same B1 and B2 values.[/QUOTE] My 1.6% assertion was based on every tested number having two algebraic factors. The timing was on one core of a Skylake 6700K at 4 GHz.
[QUOTE=henryzz;451342]
This was on one core of a Skylake 6700K at 4 GHz.[/QUOTE] It is time to throw out my "old" Intels... :smile:
[QUOTE=pepi37;451343]It is time to throw out my "old "Intels.... :smile:[/QUOTE]
I recently upgraded from a Q6600. The new chip is 5x faster per core.