Quote:
Originally Posted by MiniGeek
Okay, I will. In the mean time, some statistics from what I've got so far:
I've reserved two ranges, one with ~12K candidates, and one with ~25K. At the current rate for the smaller file, it would be 3.26 days, but it will be slowing as n increases (currently at n=~135700 and ~30 seconds per candidate at 7K used FFT).
For the larger file, it's currently coming out to 4.82 days, but of course will slow as well (currently n=~124400, switching between ~20 and ~27 seconds with switching between 6K and 7K FFTs).
I think it's still too early to tell if the files are too large or not.

OK, for range one: since you said it would take 3.26 days to do the whole thing, I can infer that you did a straight multiplication based on the number of candidates processed vs. remaining. For ease, I'll use the n-range total instead.
Therefore:
You've tested 35700 / 160000 of your n-range, or 22.31%.
22.31% of your total estimate of 3.26 days means you've spent 0.727388 days so far.
So if you've spent 0.727388 days so far, then using algebra and incremental analysis, we can calculate that it took you .001452 of a day (I'll call that 'T' for time) to do n=100000-100100, T=.001455 to do n=100100-100200, etc., with the length of time increasing as the square of n, so that it would take you T=.002669 to do n=135600-135700, with the total of T for all 100-wide n-ranges up to n=135700 being the 0.727388 days that you presumably spent on the range.
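The incremental analysis above can be sketched in a few lines of Python. Everything here comes from the post itself: the 0.727388-day total spent so far and the assumption that the time for each 100-wide n-range scales with the square of n. The base time T0 is solved from the total rather than assumed, so it comes out very close to, but not exactly, the post's .001452 (how Excel binned the ranges differs slightly):

```python
# Model: the time to LLR one 100-wide n-range grows with the square of the
# range's midpoint n. Solve for the base time T0 (first range, n=100000-100100)
# so that the per-range times sum to the 0.727388 days spent reaching n=135700.

starts = range(100000, 135700, 100)         # 357 ranges of width 100
weights = [(s + 50) ** 2 for s in starts]   # n^2 scaling, using range midpoints
T0 = 0.727388 * weights[0] / sum(weights)   # base time for the first range

print(round(T0, 6))                              # ~.00146, near the post's .001452
print(round(T0 * weights[-1] / weights[0], 6))   # ~.00268, near the post's .002669
```

The small differences from the post's figures come only from how the 100-wide ranges are binned, not from the model itself.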
Further incrementing and summing up to n=260K shows that it should take you 8.027126 days to do n=100K-260K for a file with 12000 candidates, with the final range of n=259900-260000 taking .009814 of a day, assuming that you used straight multiplication for your original estimate.
Based on this,
the 25000 candidate range should take 25000/12000 * 8.027126 = 16.72318 days.
And further, the average file size currently posted is 19563 candidates. Based on that,
the average file size should take 19563 / 12000 * 8.027126 = 13.08639 days.
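The two proportional scalings above are just straight multiplication and can be reproduced directly (the 8.027126-day figure and the candidate counts are taken from the post):

```python
# Scale the modeled time for the 12000-candidate file to other file sizes
# by straight proportion on candidate count.
days_12000 = 8.027126   # modeled days for the 12000-candidate file

for count in (25000, 19563):
    print(count, "candidates:", round(count / 12000 * days_12000, 2), "days")
```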
So, it looks like we're in the ballpark but a little large. I suspect that you have a high-speed machine, so this average is somewhat larger than I would like, though not too bad. If Anon wants to mess with it, or if people with slower resources want to chip in without spending up to 4 weeks on a file, then we could consider splitting up a file or two.
And finally, to check that the incremental analysis is consistent using the beginning and ending increments: the final range should take (260/100)^2 = 6.76 times as long as the first range. Taking the ratio of the times calculated for the beginning and ending ranges gives .009814 / .001452 = 6.76. So there you have it!
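That sanity check is a one-liner, using only the two increment times already computed above:

```python
# Under the n^2 model, the last 100-wide range should take (260/100)^2 times
# as long as the first. Compare against the two computed increments.
ratio_model = (260 / 100) ** 2     # expected last/first ratio
ratio_calc = 0.009814 / 0.001452   # ratio of the computed increments

print(round(ratio_model, 2), round(ratio_calc, 2))   # both ~6.76
```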
Math is fun!
You'll have to let me know how close this is to the actual amount of time taken for the files.
Edit: I did this relatively quickly in an Excel spreadsheet but I'm pretty confident of its accuracy. But I have to mention that it's still only a relatively rough estimate because the true LLR times jump in fits and spurts. The spreadsheet is a little rough and hard to read but some people may find the algebra and formulas useful. If anyone wants me to post the spreadsheet, I will.
Gary