[QUOTE=gd_barnes;468111]Edit: Because of (now) two different people's experience with trying to sieve under-sieved files when BOINC reserves them in the middle of such efforts, I have decided to no longer accept well under-sieved files for posting on the pages. The original idea behind posting under-sieved files on the pages is that they would be a starting point for people to continue sieving before beginning testing. That is not working out very well.[/QUOTE]
One possible remedy is to take down the posted sieve file when someone makes a sieve reservation. Future sieve efforts are only going to get deeper as more bases are tested past 500k, and it's reasonable for someone to sieve to, say, 50T and send it to you for someone else to finish. That doesn't seem to happen all that often, but if I were you I'd reject files sieved to less than half of what you'd want to reduce your workload. Alternately, you could add a text line in the sieve file itself "DO NOT TEST WITHOUT FURTHER SIEVING", which could act as a signal to BOINC to not pick that file.
I'm going to go ahead and start on S27. It's possible I may need to give it up at some point, but I can at least give it a significant boost. I grabbed the current sieved file.
[QUOTE=VBCurtis;468142]One possible remedy is to take down the posted sieve file when someone makes a sieve reservation. Future sieve efforts are only going to get deeper as more bases are tested past 500k, and it's reasonable for someone to sieve to, say, 50T and send it to you for someone else to finish. That doesn't seem to happen all that often, but if I were you I'd reject files sieved to less than half of what you'd want to reduce your workload.
Alternately, you could add a text line in the sieve file itself "DO NOT TEST WITHOUT FURTHER SIEVING", which could act as a signal to BOINC to not pick that file.[/QUOTE] The former is along the lines of what I was thinking for future sieve files. Good idea. The latter is an excellent idea. I'll do that immediately for S27 while wombatman works on it.
[QUOTE=wombatman;468145]I'm going to go ahead and start on S27. It's possible I may need to give it up at some point, but I can at least give it a significant boost. I grabbed the current sieved file.[/QUOTE]
Great. I'll reserve it for you in the first post here and post a message on the page that the current file is not yet sieved deeply enough. Suggestion at least P=100T or possibly P=250T if you have the resources. Thanks!
[QUOTE=gd_barnes;468155]Great. I'll reserve it for you in the first post here and post a message on the page that the current file is not yet sieved deeply enough.
Suggestion at least P=100T or possibly P=250T if you have the resources. Thanks![/QUOTE] I'll definitely go to at least P=100T. Current ETA for 20T is the morning of the 21st, just to give some idea of the time. I'm sure it'll speed up a bit as candidates are removed.
S618 100-200k
Here's S618 sieved to P=12T. I stopped there because it had reached the time per factor needed to make it competitive with PRP testing (~250 sec/factor).
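The ~250 sec/factor stopping rule above can be sketched as follows; the numbers in the example are hypothetical placeholders, not from the actual S618 run:

```python
# Sketch of the stopping rule described above: keep sieving while the
# cost of each factor found is cheaper than a PRP test of a candidate.
# The 250 sec/factor figure comes from the post; the sample numbers
# below are made up for illustration.

def seconds_per_factor(elapsed_seconds: float, factors_found: int) -> float:
    """Average sieve cost of removing one candidate from the file."""
    if factors_found == 0:
        return float("inf")
    return elapsed_seconds / factors_found

def keep_sieving(elapsed_seconds: float, factors_found: int,
                 prp_test_seconds: float = 250.0) -> bool:
    """True while sieving removes candidates faster than PRP testing would."""
    return seconds_per_factor(elapsed_seconds, factors_found) < prp_test_seconds

# Hypothetical: a sieve interval took 10 hours and found 200 factors.
# 36000 s / 200 = 180 sec/factor, still below 250, so continue sieving.
print(keep_sieving(36000, 200))  # True
print(keep_sieving(36000, 100))  # False: 360 sec/factor, time to stop
```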
I can make it official:
We (Yoyo from RKN and I) are currently testing a potential BOINC effort for sieving bases. There are 1.7K tasks available (currently Win64 only), and you're welcome to test the app. Just download BOINC and add "yoyo@Home" as a project. On the website, go to "Your profile -> Yoyo settings" and mark "sieve" as a project. Anyway, reserving S320 for n=500K-1M.
As long as you have a system for confirming reported factors are actually factors, it seems this could help. Missed factors due to unreliable BOINCers aren't a big deal, but false factors are.
Note that you should run any sieve yourself up to some fairly high level, to guarantee all small factors are found; something like 5% of expected sieve depth should do it. Also, the nature of the sieve programs lends itself well to sieving large ranges of exponents at once; if you're going to bother with BOINC, I suggest a maximum exponent 3-5x the minimum exponent. For instance, I'm sieving 500k to 3M for R3, and 300k to 2M for R327. Merely doubling n each sieve is a waste of sieve resources for these low-weight k values.
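One way to see why wider n-ranges pay off: for BSGS-style fixed-k sieves, per-prime work grows much more slowly than the range width, while the factors found grow roughly linearly with it. This is a rough model under that assumption, not a measurement of the actual sieve programs, and all numbers are made up:

```python
# Rough model (an assumption about baby-step giant-step sieves such as
# sr1sieve, not a measured fact): per-prime sieve cost grows roughly with
# the square root of the n-range width, while candidates (and thus factors
# found) grow linearly with it. So a wider file removes candidates more
# cheaply. All constants here are arbitrary placeholders.
import math

def sec_per_factor(base_cost: float, width: float, factor_density: float) -> float:
    """Relative seconds per factor for an n-range of the given width."""
    sieve_cost = base_cost * math.sqrt(width)  # BSGS work ~ sqrt(width)
    factors = factor_density * width           # factors found ~ width
    return sieve_cost / factors

narrow = sec_per_factor(1000.0, 1.0, 0.02)  # n-max = 2x n-min (width 1 unit)
wide = sec_per_factor(1000.0, 4.0, 0.02)    # n-max = 5x n-min (width 4 units)
print(wide / narrow)  # 0.5: the wider file removes factors twice as cheaply
```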
[QUOTE=VBCurtis;469216]As long as you have a system for confirming reported factors are actually factors, it seems this could help. Missed factors due to unreliable BOINCers aren't a big deal, but false factors are.
Note that you should run any sieve yourself up to some fairly high level, to guarantee all small factors are found; something like 5% of expected sieve depth should do it. Also, the nature of the sieve programs lends itself well to sieving large ranges of exponents at once; if you're going to bother with BOINC, I suggest a maximum exponent 3-5x the minimum exponent. For instance, I'm sieving 500k to 3M for R3, and 300k to 2M for R327. Merely doubling n each sieve is a waste of sieve resources for these low-weight k values.[/QUOTE] Yoyo decided to generate 2 tasks per WU and is now working on a validator, so the risk of missed/false factors should be really low. For single-k bases I'll sieve up to P=500G or 1T before loading it onto the BOINC server. For the first 1-2 months I'll focus on sieving "finished" sieve files deeper (e.g. S1009, S1004, etc.). The file for S320 just passed 20T, nearly half of the goal. In just one day!
[QUOTE=MisterBitcoin;469230]The file for S320 just passed 20T, nearly half of the goal. In just one day![/QUOTE]
I think you need to work on your computations of optimal sieve depth. For R327, I'm closing in on 400T while testing is under n=500k. If you're sieving 500k to 1M, I think you ought to be at 500T or higher, so that the seconds per factor removed is roughly equal to the time a single primality test takes at roughly the midpoint of the file (in this case, n=750k, maybe 800k*). A single test for S320 at n=750k takes a *long* time. Check it out!

*tl;dr version: For sieve files where every candidate will be tested, the sec/factor rate is compared to the average test length for the entire file. A side effect of test times scaling with the square of the exponent is that the mean test length equals the time for a test 70% of the way from n-min to n-max; in S320's case, 850k is 70% of the way from 500k to 1M. However, we don't expect to test the entire file, since finding a prime means not running the later tests. I don't have a precise way to compensate for this, so I just reduce the sample-n primality test to something below 850k, thus 750k to 800k as the sample n.
Keep in mind that this is sieving being done for our BOINC efforts. The BOINC testing folks are not much concerned about sieve depth. Therefore I have generally suggested (and allowed) that files for them only be sieved to 10-30% of their typical optimal depth since we are using non-BOINC efforts to sieve for BOINC testing.
Alternatively, I suppose if it is a BOINC effort doing the sieving, then the BOINC sieving folks wouldn't mind getting additional work-unit credits for sieving, so, MisterBitcoin, perhaps we could attempt to sieve those to close to optimal depth.

For this project, I generally suggest testing a candidate at 60% of the n-range for comparison when determining the optimal factor removal rate for sieving. MisterBitcoin, if you want to run a partial test at n=800K for S320 to see what the optimal removal rate should be, that might not be a bad idea. All you need to do is run a few thousand iterations of either LLR or PFGW and see what the average iteration time is; LLR will display this. Then multiply that iteration time by the total number of iterations to determine the total test time. That should be your optimum factor removal rate for the range.

You might find that the optimum is P=500T for n=500K-1M. This would be a huge task for a non-BOINC sieving effort but a mostly trivial one for a BOINC sieving effort.
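The arithmetic described above can be sketched as follows, with a hypothetical iteration time standing in for an actual LLR measurement:

```python
# Sketch of the estimate described above: time a few thousand LLR/PFGW
# iterations at a sample n (e.g. n = 800K for S320), then scale up to a
# full test. The per-iteration time below is a made-up placeholder, not
# a real measurement for S320.

def total_test_seconds(sec_per_iteration: float, n: int) -> float:
    """An LLR/PRP test at exponent n runs roughly n iterations."""
    return sec_per_iteration * n

# Hypothetical: LLR reports 0.9 ms/iteration at the sample n = 800,000.
target = total_test_seconds(0.0009, 800_000)
print(f"sieve until the removal rate reaches ~{target:.0f} sec/factor")  # ~720
```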