Old 2008-03-26, 21:22   #23
May 2007
Kansas; USA

5·2,017 Posts

Originally Posted by Mini-Geek View Post
Your extrapolation of my CPU time is almost exactly what I recorded. Task Manager told me it was ~17.5 CPU hours per instance when I checked, and I could have made my calcs easier by using your reverse-engineering. I calculated ([total pairs] / (([total pairs] - [pairs remaining]) / [CPU hours])) / 24. For reference, yours would be 1 / ([completed n] / 160000) * [CPU hours] / 24.
Mine would probably be a little more precise because the number of remaining pairs per n varies as n increases, but yours is far easier.

A note on the exact sizes of the ranges (I put them in off the top of my head, before I did the calculations that used the exact sizes): they're 12969 and 22674. You may want to plug those into your spreadsheet and see if it changes things by a significant amount.

It's on a dual-core 2.5 GHz Athlon (if you want to know the exact model). You may or may not consider that high-speed. I think the ranges are currently too large.

Will do. I'll be sure to record the exact (or closest estimate) time each range finishes, and I can look back to when I reserved them for very close estimates of when I started. I might be able to get CPU time too, but I can't guarantee my computer won't be rebooted, or an LLR instance restarted, before then.
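The two estimation formulas quoted above can be sketched in a few lines. This is a minimal illustration only: the function names and sample numbers are mine, not from the post; the 160000 is the quoted n-range size.

```python
# Sketch of the two "days to finish" estimates quoted above.
# Function names and sample numbers are illustrative, not from the post.

def days_by_pairs(total_pairs, pairs_remaining, cpu_hours):
    """Mini-Geek's estimate: scale elapsed CPU hours by total/completed pairs."""
    pairs_done = total_pairs - pairs_remaining
    return total_pairs / (pairs_done / cpu_hours) / 24

def days_by_n_range(completed_n, cpu_hours, range_n=160000):
    """The reverse-engineered estimate: scale CPU hours by the fraction
    of the n-range completed so far."""
    return 1 / (completed_n / range_n) * cpu_hours / 24

# Example: a 25000-pair file with 20000 pairs left after 17.5 CPU hours
print(days_by_pairs(25000, 20000, 17.5))  # ~3.65 days for the whole file
```

As noted in the quote, the pair-based version tracks the actual remaining work, while the n-range version is simpler but ignores that tests get slower (and pairs thin out) as n increases.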

This is good news, because the base file used for the calculations (assumed to be a 12000-candidate file) actually has more candidates. That means the divisor for other-sized files increases, which reduces the estimated testing time for those files. Here are corrections based on your exact # of candidates:

No change to the smaller range because I used n-range calculations, not # of candidates. It should still take 8.027126 days to do n=100K-260K for a file with 12969 candidates.

For your larger range, it would actually take LESS time even if the file still had 25000 candidates, because the original estimate's base contained fewer candidates (12000 vs. the actual 12969). But it'll be even MORE LESS (lol) since it contains < 25000 candidates. It should take you 22674/12969 * 8.027126 = 14.03401 days for a file with 22674 candidates.

For an average-sized file of 19563 candidates, it would take 19563/12969 * 8.027126 = 12.10846 days.
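The corrections above all come from one linear proportion: testing time scales with the candidate count relative to the known base. A minimal sketch (the 12969-candidate / 8.027126-day base figures are from this post; the function name is mine):

```python
# Scale the known base estimate (12969 candidates -> 8.027126 days)
# to other file sizes, as done in the corrections above.

BASE_CANDIDATES = 12969   # smaller range's exact candidate count
BASE_DAYS = 8.027126      # measured/derived time for the base file

def estimated_days(candidates):
    """Testing time scales linearly with the candidate count."""
    return candidates / BASE_CANDIDATES * BASE_DAYS

print(round(estimated_days(22674), 5))  # larger range
print(round(estimated_days(19563), 5))  # average-sized file
```

Linear scaling is only an approximation, since per-test time also grows with n, but within one n-range it's close enough for reservation planning.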

12 vs. 13 days is more in the ballpark and is about what I'd expect for a drive 3 file at n=~380K, so it's not far from what we had originally hoped for.

The slight problem here is the variability in file size, but I personally think we're OK at this point. Anon is posting the # of candidates next to each file, so people with slower machines can take smaller files and people with faster machines (or more of them) can take larger files.


Last fiddled with by gd_barnes on 2008-03-26 at 21:28