#23
Quasi Admin Thing
May 2005
2·3·7·23 Posts
Quote:
So if you would like, Gary, I can send you the .abcd files and the checkpoint files for each of the Riesel and Sierpinski bases in a short while. I also feel it is more motivating for me to run the base 3 conjectures on both my old as well as my new PC, since that is at least what my quad was bought for.

Take care, my friends! Kenneth
#24
May 2007
Kansas; USA
101·103 Posts
Quote:
OK, but P=5T was only meant as a very rough estimate of the optimal sieve depth for breaking off n=100K-150K or 100K-200K for bases 22 and 23. You took an example that I gave as if it were fact. It is not. It was an example of how it is POSSIBLE that over-sieving occurs. Have you actually LLR'd a candidate and verified the optimal sieve depth for breaking off n=100K-200K or 100K-150K?

Regardless, thank you for the files. I'll do what I can with them. The first thing I'll have to do is determine the optimal sieve depth. I will now unreserve them for you.

I'm confused. You have reserved Riesel base 3 to k=500M. Why the sudden interest in Sierp base 3? Don't you need to finish Riesel base 3 to k=500M first? Micha is already searching Sierp base 3 to k=200M. Before starting on Sierp base 3, please report a status on Riesel base 3, or let us know if you're releasing it before starting on the Sierp side of things.

Thanks,
Gary
#25
Jul 2003
wear a mask
22·3·139 Posts
Quote:
1. The SRB5 project will NOT be solved by n = 2*10^6, but one has to draw the line somewhere, and one of the former moderators of the project drew the line there. Some crude calculations suggest that we will have about 90 sequences (out of an initial 500, roughly) remaining when/if we ever reach that point. With 90 sequences, another, larger n value can be chosen for the second sieving phase of the project. In some sense, the large dat file lends a sense of stability to the project: it is something we won't have to worry about re-initializing or changing drastically for roughly 10 years, maybe much longer.

2. Sieving a larger range is more efficient, but how do you choose how large? I have no idea what the initial reasoning was, but looking at our dat file now, it seems that it suits the broad testing interests of the forum community. There are low-n values (n<1000) for people interested in factoring. There are slightly larger values (n approx. 70,000) for people interested in utilizing older, slower machines for double-checking. There are the current lowest untested n-values (260,000-300,000) that with some effort will net a prime on the top-5000 list (top 1000, actually). And then there are some huge values for the really ambitious tester. If you had the computational firepower, you could find a prime within our dat file that would rank in the top 20 largest primes. Or maybe you would be happier with a top-50 prime, or a top 100, or a top 200... Depending on your level of commitment, we have something that could suit your interests. That's a nice aspect of the large dat file.

3. We actually want to solve the conjecture, eventually. By choosing a large n-value and heavily sieving, we are doing a lot of work on behalf of future testers. Yes, it's boring, but it IS very useful.

These arguments are only meant as justification for large sieving ranges, not a justification of exclusively sieving.

Some arguments for testing without reaching the optimal sieve depth:

1. To keep a project going, one has to consider the psychology of the participants. Finding primes is fun, and regularly finding primes, I believe, is the best way to maintain interest in these prime-search projects. Consider TPS. They were going gangbusters when they were finding reportable primes. Once they fell off the top-5000 list, it seems a lot of the interest dissipated. (I know PrimeGrid is still testing for them, but that has always seemed like a particularly mindless sort of interest in a project's goals...)

2. Optimal sieve depth takes way too long to achieve, especially with large n-range dat files. The optimal sieve depth for the SRB5 dat file on an Intel Xeon 3.4 GHz processor is when sieving removes a candidate every 43 hours. That's right, no typo there: 43 hours. At n=2*10^6, tests take 110 hours on these processors. (I calculated these benchmarks two years ago, with slightly older versions of the testing software, but I'm sure any new benchmarks would be of the same order of magnitude.) At current testing n-levels, the tests take about an hour, and the expectation for finding a prime is roughly 8000 tests. 8000 hours = 335 days of testing on a single two-year-old processor to find a (reasonably large) prime. It would actually be much shorter on today's quad-core processors. At the optimal sieve depth, 8000 hours would only find 8000/43 = 186 factors. So, what sounds more appealing: 186 factors, or a nice (reasonably large) prime, which coincidentally would remove 18,500 testing pairs from the dat file? (Admittedly, to be honest, if the optimal sieving depth were achieved, the 18,500 figure would be much smaller, but I doubt it would be less than 186.)

To summarize:
1. There is nothing wrong with an exceedingly large dat file.
2. Finding primes is fun (and still useful to these projects), so don't discourage testing, even at sub-optimal sieve depths.

Let the old battle-axes (like hhh on PSP, and myself on SRB5) perform and encourage the heavy sieving (as we understand its importance), but let the prime-hunters do their thing too. Just my two cents on a much-discussed topic that will not die...
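The break-even rule in point 2 above can be sketched in a few lines. This is only an illustration, using the benchmark figures quoted in the post (43 hours per removed candidate at the optimal sieve depth, 110 hours per test at n=2*10^6, about 1 hour at current testing levels); the function name is made up for the example.

```python
# Sketch of the break-even rule: keep sieving while removing one candidate
# by sieving is cheaper than eliminating it with a primality test.
# The hour figures below come from the post's Xeon 3.4 GHz benchmarks;
# everything else is a hypothetical illustration, not project tooling.

def worth_sieving(hours_per_factor: float, hours_per_test: float) -> bool:
    """True while one sieved-out candidate costs less than one test."""
    return hours_per_factor < hours_per_test

# At n = 2*10^6 a test takes ~110 h, so 43 h/factor sieving still pays off:
print(worth_sieving(43, 110))   # True
# At current testing levels (~1 h/test), that same sieve rate is far past optimal:
print(worth_sieving(43, 1))     # False
```

This is why the same sieve depth can be simultaneously "optimal" for the whole file out to n=2*10^6 and wildly over-sieved for the low n-ranges being tested today.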
#26
Quasi Admin Thing
May 2005
2·3·7·23 Posts
Quote:
About Riesel base 3: all primes for k<=500M for n<=500 have been found and verified. The ETA for the Riesel base 3 range is about 2 weeks, maybe 3.

I'm only slightly interested in Sierpinski base 3, because it actually reduces the manual work: I can do an immediate proof of the PRP and then continue to the determined n if the PRP is in fact not a prime. So there is less manual work, because I don't have to split a ~25 GB data file and sift through it to find the PRPs that were actually composites rather than primes :) So in short, the interest in Sierpinski base 3 has come because a strict prime search (not a PRP search) can be conducted. However, if someone can come up with a solution so that I will not end up in an infinite Brillhart-Lehmer loop where it tries different bases and square roots, then of course I would like to continue with Riesel base 3... but that is not the plan for now, since it involves a great deal more manual work than I really feel like putting into any project at the moment.

On another side note, I had actually noted that Michaf is doing k=120-200M, and I was considering reserving something above that, if I decided to continue doing CRUS work.

A lot has been said, and I hope it made sense!

Regards KEP
#27
May 2007
Kansas; USA
10403₁₀ Posts
Quote:
Why are you doing primality-proof tests on Riesel base 3? It's not necessary and takes way too long. I've suggested before to just do PRP tests, which are extremely fast. You can do primality proofs after the PRP tests and the manual work. It's unlikely you'll find any composites amongst the PRPs.

I'm going to unreserve Riesel base 3, because a test to n=500 is not something that others could start from; the k's remaining and the files are too big to administer and send to people. It would take only a moderate amount of CPU time to start over for such an effort. The only thing I'll have you reserved for is Sierp base 19 to n=100K. If you want to reserve Sierp base 3, just specify a k-range, but please keep it small (i.e. k<50M) and complete what you reserve to at least n=10K and preferably n=25K.

Thanks,
Gary

Last fiddled with by gd_barnes on 2008-07-06 at 16:56
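The "PRP first, prove later" workflow Gary suggests can be sketched as below. This is a hypothetical illustration on tiny made-up numbers: a Fermat base-3 check stands in for the fast PRP test, and trial division stands in for the expensive proof (real searches use dedicated software such as LLR or PFGW for both steps).

```python
# Illustration of "PRP first, prove later": a cheap compositeness filter
# runs over every candidate, and the expensive proof step (trial division
# here, standing in for a real N-1/N+1 proof) runs only on the survivors.

def is_prp(n: int, base: int = 3) -> bool:
    """Cheap Fermat PRP test; composites rarely pass."""
    return n > 1 and pow(base, n - 1, n) == 1

def prove_prime(n: int) -> bool:
    """Expensive stand-in 'proof' by trial division."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# Tiny, hypothetical Riesel-style candidates k*3^5 - 1:
candidates = [k * 3**5 - 1 for k in range(2, 50)]
prps = [n for n in candidates if is_prp(n)]    # fast pass over everything
proven = [n for n in prps if prove_prime(n)]   # slow pass over survivors only
print(len(candidates), len(prps), len(proven))
```

The point of the ordering is exactly Gary's: since composites almost never survive the PRP filter, the expensive step runs on a tiny fraction of the candidates.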
#28
Quasi Admin Thing
May 2005
1111000110₂ Posts
@ Gary:
I think you have quite misunderstood. Here is the status of Riesel base 3 and how it was done:

1. PRP test all k<=500M to n<=500 (complete)
2. Verify the PRPs (complete)
3. Sieve the remaining k's for 500<n<25,000 (in progress)
4. Take the remaining k's to n<=25,000 (going to start soon)
5. Sieve the PRPs that were not really primes from n>=1 to n<=25,000
6. Test that sieve file and remove all k's with primes

As you see, the work is not only taken to n<=500 once it gets complete, but in the initial phase it was... so please keep my reservation, or else I'll lose 6-8 weeks of work down the drain for nothing :(

Regards KEP

Ps. I'm not going to work on anything other than base 19 in the future, since I feel like I'm more or less an obstacle to this project... and of course interest is starting to fade... happy hunting everyone...
#29
Jan 2005
479 Posts
Quote:
Ad 2: Optimal sieve depth is already achieved for the n's you are testing at SRB5; sieving more will only help a little bit on the 'low' ranges. I.e., finding a factor for 123*5^100000 has much less 'impact' than finding a prime for 123*5^1000000, even though both shave time from the total time-to-PRP. As long as you are willing to take the project to the 'end' (say 2M), sieving is useful as long as you get more factors per hour than you can PRP per hour (at 70% of the remaining range).

(Gee... I hope I make at least SOME sense here :))
#30
May 2007
Kansas; USA
10403₁₀ Posts
Quote:
I'll be honest here. I'm not sure if you're against what I'm saying, in favor of it, or neutral on the issue. Perhaps a little of all three.

I'd like to encourage people to do what they want to do and not force sieving down people's throats. That's what I feel RieselSieve has done. Up until the last year, they sieved things into the ground. You couldn't even go on their main page and find any kind of work except sieving to work on. You had to go to a forum.

I think you may have misinterpreted my statement about sieving n-max vs. n-min of 4x as dissing anyone who does so. Not so. I simply want to save current slower resources in favor of future faster resources if what is being sieved will take longer than ~2 years to complete; a time in which computer speeds typically double. I realize this is a radical point of view and requires some new ways of looking at things.

It's kind of like political debates about the economy or the world. Do we do what is best for the next 1-4 years, or for the next 10-20 years, or for the next 50-100 years... tough choices there. Since this is a recreational activity, I like to look at things over the next 1-4 years, because 80-90% of people's specific interests change in that time. Heck, in 4 years, I may wish to get back into Pente, a board game I'm very passionate about. In my personal life, I look at things over 10-20 years, and I feel that in the political and world arena, things should be looked at over 50-100 years. So IMHO, it's a matter of what pursuit or world-view you look at things through.

I have to commend you on thinking that base 5 can be proven. That's certainly taking the 50-100 year approach to things. But IMHO, it will not be proven in most of our lifetimes, so if it were up to me, I would have suggested sieving n=100K-400K originally, like we have started to do for Sierp base 6. In 2-3 years, when LLRing is nearing completion of that range, we can then start sieving n=400K-1M or 400K-1.6M with computers that should be twice as fast, on far fewer k's than there are now.

I really like your idea of having many different n-ranges available for testing. I've attempted to do that at NPLB with the different drives and individual-k testing, albeit with quite limited success on the latter of those three. I had no clue that you had all of these different ranges that could be tested. I thought you just had the two servers, each currently testing somewhere between n=200K and 300K, one of which I worked on for ~7-10 days. Where are these very small and very large n-ranges that can be tested?

But I am confused. If there are small n-ranges to be tested, why is that? Wouldn't it be better to knock those out and remove huge #'s of candidates from the huge dat file (if a prime is found) if your goal is to ultimately prove the conjectures? If the goal is to prove the conjectures, I think that small n-ranges should all be tested. But perhaps you've determined that the fun it creates for everyone is more valuable than the proof. I agree that it's not always easy to strike a balance between the two.

I agree that we need to let people do what they like to do, but we also have to encourage them to do what we need them to do. I've probably not done a good job of that here at CRUS, because I've been a little more focused on NPLB.

Gary

Last fiddled with by gd_barnes on 2008-07-06 at 21:38
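Gary's "save slower resources for faster future hardware" argument reduces to a one-line discount calculation. The 2-year doubling period is the figure he assumes above; the 10,000-hour job size is purely hypothetical.

```python
# Sketch of the hardware-doubling argument: a job deferred t years shrinks
# in wall-clock cost because machines roughly double in speed every 2 years
# (the post's assumption; the job size below is made up for illustration).

def hours_on_future_hardware(hours_today: float, years_from_now: float,
                             doubling_years: float = 2.0) -> float:
    """Hours the same job takes on hardware `years_from_now` years out."""
    return hours_today / 2 ** (years_from_now / doubling_years)

# A hypothetical sieve of n=400K-1M costing 10,000 h on today's machines:
print(hours_on_future_hardware(10_000, 2))  # 5000.0 - half the cost in 2 years
print(hours_on_future_hardware(10_000, 4))  # 2500.0
```

The catch, of course, is that deferring the work also defers the primes, which is exactly the psychology trade-off masser raises in post #25.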
#31
May 2007
Kansas; USA
10403₁₀ Posts
Quote:
Apologies, Kenneth. The communication interpretation was bad on my end this time. When you said you were taking on Sierp base 3 and had searched k<500M on Riesel base 3 to only n=500, I took that to mean that you were changing efforts.

You are not an obstacle. Where I'm coming from is the difficulty created by many large, changing reservations coming in on your end. If you make a very large reservation and then stop it, it takes that much more time for me to coordinate it in the future. It really helps out if people complete what they reserve. I understand that things happen (computers crash, a range takes much longer than you thought, etc.) as long as they aren't happening too frequently. It's also great if you want to sieve n=100K-1M, test n=100K-200K, and leave the remaining file for the team... or, like on bases 22 and 23, if you just want to sieve, that's great, but it's better if I know that in advance.

I don't see fading interest here. As with any project, there was great activity at the start, and it has now simmered down to a relatively consistent level over the last 4 months or so. What I see now is a couple of people on vacation (Anon and Karsten) and a couple of people who have reduced their resources on prime searching for the hot summer months. You might want to peek at the reservations pages and see what you think. It's just that many people are searching at high n-ranges and only report a status once a month or so, or when they find prime(s).

I will re-reserve k<500M on Riesel base 3 to n=25K for you. Sorry that I misunderstood you.

Gary

Last fiddled with by gd_barnes on 2008-07-06 at 21:57
#32
May 2007
Kansas; USA
101×103 Posts
Quote:
Kenneth, I just now figured it out. The above is why I thought you unreserved Riesel base 3. First you said Riesel base 3 was ETA 3 weeks to completion; then you said "then of course I would like to continue with the Riesel base 3... but that is not the plan for now since it involves a great deal more manual work...". It looks like you're saying that continuing on Riesel base 3 is not the plan for now. I just thought I'd point out why I got confused.

Anyway, it doesn't matter; I'm reserving it again for you.

Gary

Last fiddled with by gd_barnes on 2008-07-06 at 22:09
#33
May 2007
Kansas; USA
10100010100011₂ Posts
To all:
I messed up my prior calculation of k's remaining on Sierp base 19. I had the incorrect # of k's remaining at n=10K: I had used the number that were remaining at n=10.45K. This was a 30-prime difference and made a huge difference in the ensuing calculations. The calculations were all correct to the best of my knowledge; it was the data going in that was incorrect, i.e. garbage in, garbage out. This explains why KEP was finding primes at a faster rate than I had expected. To be consistent and not "cheat", I still used the # of primes for n=10K-11.2K for the estimates, and no subsequent primes that he found up to n=12.32K.

I edited the two incorrect posts above with the correct calculations. This makes the estimate ~636 k's remaining at n=100K and reduces the estimated CPU years needed to complete Sierp base 19 to n=100K from 40 to 31.8. Further sieving on his file will likely reduce the total CPU time needed, but not substantially. Better still, it reduces the estimated final prime to be found from n=~10^17-10^19 down to n=~10^12-10^13. Now it's likely n<10 trillion. Whee!

Gary

Last fiddled with by gd_barnes on 2008-07-06 at 23:51
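The kind of extrapolation being corrected here can be sketched with a deliberately crude model; this is NOT the method Gary actually used, just an assumption-labeled toy: suppose each remaining k finds a prime with a fixed probability p per doubling of n. The point is the "garbage in, garbage out" sensitivity: an error of 30 k's in the starting count shifts the whole projection proportionally. All numbers below are hypothetical.

```python
# Toy model of projecting k's remaining (hypothetical, not the post's method):
# each surviving k is assumed to find a prime with probability p per
# doubling of n, so survivors decay geometrically in log2(n).
import math

def estimate_remaining(k_now: int, n_now: float, n_target: float,
                       p_per_doubling: float) -> float:
    doublings = math.log2(n_target / n_now)
    return k_now * (1 - p_per_doubling) ** doublings

# Hypothetical 1000 k's remaining at n=10K, projected to n=100K with p=0.13:
good = estimate_remaining(1000, 1e4, 1e5, 0.13)
# Feed in a count that is 30 k's off (the kind of error described above)
# and the projection shifts by the same relative amount:
bad = estimate_remaining(970, 1e4, 1e5, 0.13)
print(round(good), round(bad))  # 630 611
```

A ~3% input error stays a ~3% output error here, but in the post's actual calculation the miscounted primes also fed the fitted removal rate, which is why the downstream estimates (CPU years, size of the final prime) moved so much more.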
Similar Threads

| Thread | Thread Starter | Forum | Replies | Last Post |
|--------|----------------|-------|---------|-----------|
| Conjectures with one k remaining | rogue | Conjectures 'R Us | 109 | 2017-04-29 01:28 |
| Estimating time needed for GNFS | CRGreathouse | Factoring | 16 | 2014-03-10 03:40 |
| Estimating time needed for GNFS | CRGreathouse | Factoring | 0 | 2014-03-02 04:18 |
| Predicting the needed time for high n-values? | Rincewind | Sierpinski/Riesel Base 5 | 4 | 2009-06-11 12:24 |
| Time needed to factor a 150 digit number | ladderbook | Factoring | 14 | 2008-11-27 13:02 |