#166

Account Deleted
"Tim Sorbera"
Aug 2006
San Antonio, TX USA
10253₈ Posts
Quote:
http://www.mersenneforum.org/showthread.php?t=9906

Do you think I still need to run a stress test? Yeah, sorry I wasn't able to report on the status over the weekend; I was on vacation and, though I could confirm my computer was online, I couldn't check the status or send factors. As stated earlier, I'll finish it pretty soon. I'm interested in LLRing this range and was wondering how long it takes between when I return my last factors and when the range is posted for manual LLRing. Just a rough estimate; I want to know whether I should get another manual LLR reservation or just use LLRnet or something.
#167

A Sunny Moo
Aug 2007
USA (GMT-5)
14151₈ Posts

Of course, you may want to run it anyway, maybe for just an hour or so, but it's your call.
#168

May 2007
Kansas; USA
3³·5·7·11 Posts
One more thing: I think we should probably test this by k-value instead of n-value, so we'll need to get the file sorted by k-value primary and n-value secondary. That is what I did for my double-checking on n=50K-100K. It will be much easier to cross-check for missing and incorrect primes that way. That is, unless for some reason we decide to use a heavy hitter on an LLRnet server, in which case we'd probably only want to feed him n>200K to avoid creaming the server.

I'm excited about this. After finding ~2% missing primes for n=16K-100K for k=300-1001, I'm expecting an even higher rate of missing primes in this range. (n<16K was highly accurate for this k-range, with only 1 error found, and I believe it had been previously double-checked.) k<300 has been double-checked quite a bit already and will likely yield a lower rate of missing or incorrect primes. Kosmaj told me that they were attempting to get any k's that were tested by people outside of RPS double-checked, but they weren't done yet. I did some double-checking myself, but it was very fragmented in both n-values and k-values. What this DOES mean is that k's that have been tested by RPS folks have largely NOT been double-checked, so who knows what we might find. We do know that k<300 was independently double-checked for n<100K, but above that it's not clear.

Gary

Last fiddled with by gd_barnes on 2008-03-24 at 05:25
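The k-primary, n-secondary sort Gary describes can be sketched in a few lines of Python. This is an illustrative sketch only, assuming the candidate file is plain text with one "k n" pair per line (the sample values are made up, not from the thread):

```python
# Sketch: sort candidate (k, n) pairs by k first, then n, as Gary
# suggests for cross-checking primes by k-value.
# Assumes a simple "k n" text format; values below are illustrative.

def sort_pairs_by_k(lines):
    """Parse 'k n' lines and return tuples sorted by k, then n."""
    pairs = [tuple(map(int, line.split())) for line in lines if line.strip()]
    return sorted(pairs)  # tuples compare element-wise: k primary, n secondary

candidates = ["5 120000", "3 150000", "5 110000", "3 100500"]
for k, n in sort_pairs_by_k(candidates):
    print(k, n)
```

With all of one k's results adjacent, a missing or incorrect prime for that k stands out immediately instead of being scattered through an n-ordered file.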
#169

Account Deleted
"Tim Sorbera"
Aug 2006
San Antonio, TX USA
4267₁₀ Posts
#170

A Sunny Moo
Aug 2007
USA (GMT-5)
6249₁₀ Posts
#171

A Sunny Moo
Aug 2007
USA (GMT-5)
3·2,083 Posts
#172

I quite division it
"Chris"
Feb 2005
England
100000011101₂ Posts
The link to P95 v25.3 (Windows) is dead; is there somewhere else I can get a copy? (I assume this is the one I need for stress testing?)

Last fiddled with by Flatlander on 2008-03-24 at 17:21 Reason: Blah, blah, blah.
#173

May 2007
Kansas; USA
3³×5×7×11 Posts
It will be a big, BIG hassle to check results sorted by n-value vs. k-value. I suppose we could do the search by n-value and then sort the final primes by k-value before checking them, but then we'd have to wait until the very end of LLRing to do any checking.

This would not be a problem team-drive style. Micha and I are doing Sierp base 3 at CRUS by k-value because there are > 10^15 k's! I reserved the first 100 million k's and he reserved the next 10 million k's. (It's a very prime base, with only 3 k's remaining at k=3M. My testing is currently past k=8M.)

As for splitting the file up, do it in reasonable chunks, perhaps 3 or 6 k's at a time. Keep the # of k's per file at a multiple of 3, since any k that is divisible by 3 is heavier weight. I realize the files will have more variability in size, but it will all even out in the long run.

Chris's point about doing things upwards by n-value is a good one for first-pass processing and is the way we run our drives here. IMHO, the way RPS is testing up past 1.5M-2M on some k's while leaving other k's near them at n=600K-700K is a waste of current resources. Things should be kept somewhat more level, while still allowing a certain degree of individuality to make things fun. But for a second pass, which is always done with faster machines than the first pass, his point only becomes a factor if there is a large amount of time between the beginning and end of the effort. So, for instance, if we started now at n=100K and did not anticipate finishing to n=260K for 2 years, when computers are certainly going to be faster, then yes, we should search by n-value. But here I anticipate us finishing in < 6 months if not faster, so there is little benefit to searching by n-value.

Searching by k-value will allow us to compare entire k's as we go along. We will quickly see the results of our efforts. Also, if it turns out that there are few problems for k<300, we can shift to making k>300 a priority, with k<300 done later.

Anon, you're running the effort here, so if you and others feel strongly about searching by n-value, I'm fine with that. It won't affect how much I help the effort. I just wanted to bring up some points about searching by k-value instead, like I did for n=50K-100K.

Gary
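Gary's chunking rule (multiples of 3 k's per file) works because among consecutive odd k's, every third one is divisible by 3, so each 3-k chunk contains exactly one of the heavier-weight k's. A minimal sketch, with the k-range and chunk size purely illustrative:

```python
# Sketch: split a sorted list of odd k-values into reservation files of
# 3 k's each. Every third odd k is divisible by 3 (heavier weight), so
# each chunk gets exactly one heavy k and workloads stay roughly even.
# The range k=301..329 is just an example, not from the thread.

def chunk_ks(ks, size=3):
    """Group a sorted k list into consecutive chunks of `size` k's."""
    return [ks[i:i + size] for i in range(0, len(ks), size)]

odd_ks = list(range(301, 331, 2))  # k = 301, 303, ..., 329
for chunk in chunk_ks(odd_ks):
    heavy = [k for k in chunk if k % 3 == 0]
    print(chunk, "heavy-weight k's:", heavy)
```

Keeping the chunk size a multiple of 3 preserves this one-heavy-k-per-chunk balance even for 6-k files.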
#174

A Sunny Moo
Aug 2007
USA (GMT-5)
3·2,083 Posts
One thing, though: you suggested dishing them out in chunks of 3 k's at a time. Do you have a guess of how long one of those files would take? That would be helpful in determining whether 3 k's at a time is too much for lower-powered users to handle.
#175

May 2007
Kansas; USA
3³×5×7×11 Posts
Not a clue on the amount of time. I'll guess that on average it would be ~10-14 days for 3 k's, but maybe 7-10 days. I'll speculate it's about as long as a Drive 3 range at about n=340K. Somewhat large, but not too bad.

Here's a suggestion that should allow people with all different resources to easily reserve files of the size that they want. Kosmaj does it, and it's a very good idea (I bet you never thought you'd hear that, lol): put the number of k/n pairs by the file link. Since there will be quite a bit of variability in file size, if you have about 10-20 3-k files posted, people with more resources can take the bigger ones and people with less can take the smaller ones. So that we aren't testing all over the place at once, posting no more than 20 files at a time keeps it manageable.

You could do this any number of ways. One suggestion might be to post a group of 10 files for k=2-32 and another group of 10 files for k=300-330. The very low k-ranges will probably be pretty accurate, so to get some 'bang for our buck', doing both at the same time would make it more 'fun', so to speak.

Not to beat a dead horse any more, but another reason I now remember why it was so helpful to search by k-value is that some single k-values had several errors while other entire 100-k ranges had none. I found one k that had 3 missing primes. Obviously the same person(s) searched (or didn't search) the completed ranges at PrimeSearch. When we find a missing prime, we'll want to pay close attention to that k-value and any k-values around it that might have been originally tested by the same person(s) at PrimeSearch.

There is one main case against searching by k-value that I forgot about. Do we anticipate that an LLRnet server will be used for this, AND will there potentially be a large number of participants using that server? All of my logic flies out the window in that case.

If we can utilize a server and we have large #'s of resources, we'll finish so quickly that it doesn't really matter how we test. We'll just confirm everything at the end after sorting by k. We could set one up right away but limit the # of people using it at n=100K (i.e. encourage manual reservations if we near a perceived processing bottleneck on the server), and then open it up to more people as we progress upwards. THAT is a very good case for searching by n-value!

Gary

Last fiddled with by gd_barnes on 2008-03-24 at 20:25
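The "confirm everything at the end" step boils down to set comparison: primes found by the double-check but never reported originally are missing, and originally reported primes the double-check cannot reproduce are incorrect. A sketch, with entirely made-up (k, n) pairs standing in for real results:

```python
# Sketch: cross-check a double-check run against originally reported
# primes, per the thread's missing/incorrect terminology.
# All (k, n) pairs below are fabricated examples.

def compare_primes(original, doublecheck):
    """Return (missing, incorrect) as sets of (k, n) pairs.

    missing   = confirmed by the double-check, never reported originally
    incorrect = reported originally, not confirmed by the double-check
    """
    orig, dc = set(original), set(doublecheck)
    return dc - orig, orig - dc

original = {(451, 2379), (451, 5000), (775, 1234)}
doublecheck = {(451, 2379), (451, 4100), (775, 1234)}
missing, incorrect = compare_primes(original, doublecheck)
print("missing primes:", missing)
print("incorrect primes:", incorrect)
```

Grouping both lists by k before comparing (as Gary advocates) lets whole k's be signed off one at a time instead of waiting for the entire n-range to finish.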
#176

I quite division it
"Chris"
Feb 2005
England
100000011101₂ Posts
Similar Threads

Thread | Thread Starter | Forum | Replies | Last Post
Is more sieving power needed? | jasong | jasong | 4 | 2012-03-25 19:11
Doublecheck always have shifted S0 value? | ATH | PrimeNet | 11 | 2010-06-03 06:38
All things doublecheck!! | masser | Sierpinski/Riesel Base 5 | 44 | 2006-09-24 17:19
DoubleCheck vs LL assignments | Unregistered | PrimeNet | 9 | 2006-03-26 05:48
doublecheck - results | TheJudger | Data | 4 | 2005-04-04 08:54