mersenneforum.org > Prime Search Projects > No Prime Left Behind
2008-03-26, 21:22   #23
gd_barnes

Quote:
Originally Posted by Mini-Geek
Your extrapolation of my CPU time is almost exactly what I recorded. Task Manager told me it was ~17.5 CPU hours per instance, and I could have made my calculations easier by using your reverse-engineering. I calculated [total pairs] / (([total pairs] - [pairs remaining]) / [CPU hours]) / 24. For reference, yours would be (1 / ([completed n] / 160000)) * [CPU hours] / 24.
Mine would probably be a little more precise, since the number of remaining pairs per n varies as n increases, but yours is far easier.

A note on the exact sizes of the ranges (I put them in off the top of my head, before I did the calculations that used the exact sizes): they're 12969 and 22674. You may want to plug those into your spreadsheet and see if anything changes significantly.

It's on a dual-core 2.5 GHz Athlon (http://www.newegg.com/product/produc...82E16819103778 if you want the exact model). You may or may not consider that high-speed. I think the ranges are currently too large.

Will do. I'll note the exact (or closest-estimate) time each range finishes, and I can look back at when I reserved them for close estimates of when I started. I might be able to get CPU time too, but my computer may be rebooted, or an LLR instance restarted, before then.
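(For reference, here are the two extrapolations from the quote in code form. This is a minimal sketch: only the formulas and the ~17.5 CPU hours come from the posts; the progress figures are hypothetical placeholders.)
Code:
# Both progress extrapolations from the quote above. Only the formulas
# are from the post; pairs_remaining and completed_n are hypothetical.
cpu_hours = 17.5         # CPU hours spent so far, per instance
total_pairs = 12969      # candidate pairs in the file
pairs_remaining = 9000   # hypothetical: pairs not yet tested
completed_n = 40000      # hypothetical: n-progress out of the 160000-wide range

# Mini-Geek's version: scale elapsed CPU time by the fraction of pairs done.
days_by_pairs = total_pairs / ((total_pairs - pairs_remaining) / cpu_hours) / 24

# The reverse-engineered version: scale by the fraction of the n-range done.
days_by_n = 1 / (completed_n / 160000) * cpu_hours / 24

print(f"by pairs: {days_by_pairs:.2f} days; by n: {days_by_n:.2f} days")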

This is good news: the base file used for the calculations (assumed to have 12000 candidates) actually has more, which increases the divisor for other-sized files and so reduces their estimated testing times. Here are corrections based on your exact candidate counts:

No change to the smaller range because I used n-range calculations, not # of candidates. It should still take 8.027126 days to do n=100K-260K for a file with 12969 candidates.

Your larger range would actually take LESS time even if the file size were still 25000 candidates, because the divisor (the base file's candidate count) is larger than originally assumed. But it'll be even less (lol), since the file contains < 25000 candidates. So... it should take you 22674/12969*8.027126 = 14.03401 days for a file with 22674 candidates.

An average-sized file of 19563 candidates would take 19563/12969*8.027126 = 12.10846 days.
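For anyone who wants to reproduce these numbers, the scaling is a straight proportion. A minimal sketch (the constants are just the figures above):
Code:
# Proportional scaling: testing time is assumed proportional to the
# number of candidates, anchored to the 12969-candidate baseline file.
BASE_DAYS = 8.027126    # estimate for the 12969-candidate n=100K-260K file
BASE_PAIRS = 12969

def estimate_days(pairs):
    return pairs / BASE_PAIRS * BASE_DAYS

for pairs in (12969, 19563, 22674):
    print(f"{pairs} candidates: {estimate_days(pairs):.5f} days")
# prints 8.02713, 12.10846 and 14.03401 days, matching the figures here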

12 to 14 days is more in the ballpark and about what I'd expect for a drive 3 file at n=~380K, so not far from what we had originally hoped for.

The slight problem here is the variability in file size, but I personally think we're OK at this point. Anon is posting the # of candidates next to each file, so people with slower machines can take smaller files and those with faster machines (or more of them) can take larger files.


Gary

2008-03-26, 21:27   #24
gd_barnes

Quote:
Originally Posted by Mini-Geek
Code:
27*2^100000-1 is not prime.  LLR Res64: B033EEE9274FF1F1  Time : 16.464 sec.
29*2^100000-1 is not prime.  LLR Res64: 7DF93B2FC5EF6077  Time : 16.583 sec.
27*2^100012-1 is not prime.  LLR Res64: 69F091E95935EE0E  Time : 16.588 sec.
27*2^100013-1 is not prime.  LLR Res64: CDAAE5C5F00C6FF5  Time : 16.586 sec.
31*2^100019-1 is not prime.  LLR Res64: 418D2ADE4901D642  Time : 16.610 sec.
31*2^100039-1 is not prime.  LLR Res64: FB1F60CAE7646CB3  Time : 16.599 sec.
31*2^100061-1 is not prime.  LLR Res64: 780E5645B0395BA2  Time : 16.591 sec.
31*2^100129-1 is not prime.  LLR Res64: CD3F553F0B3B61B4  Time : 16.618 sec.
27*2^100133-1 is not prime.  LLR Res64: 0E506AE171784315  Time : 16.612 sec.
27*2^259952-1 is not prime.  LLR Res64: A1E5B8F7FD1C606A  Time : 107.655 sec.
27*2^259973-1 is not prime.  LLR Res64: 5036B0FD0DFDBE74  Time : 107.636 sec.
31*2^259975-1 is not prime.  LLR Res64: D6A76B5B89DA91F8  Time : 107.720 sec.
27*2^260000-1 is not prime.  LLR Res64: B5D601E69D7F123F  Time : 107.740 sec.
Would these CPU timings help for a more accurate estimate, or something?
Well, we do know that Carlos's machine will take about half the time yours will, because its LLR timings are about half as long. The incremental analysis stays the same, since the increments don't change; only the time multiplier does. So he would take about 4 and 7 days to do your two files.

That said, we prefer not to use overclocked machines for this effort. But if Carlos has run an appropriate torture test and Anon is satisfied with it, then I'm OK with it. He knows more about how those torture tests behave on various machines.

Edit: I could attempt to take the incremental analysis down to the n=10 or even n=1 level, but the additional accuracy wouldn't be worth it. Technically, calculus should be used here. Unfortunately my calculus isn't good enough, so I generally resort to algebra and incremental analysis, using formulas similar to compound-interest calculations. Perhaps Axn1, Geoff, Robert, or even Mini-Geek could chime in with some calculus that would give as exact an estimate as possible.

Exact CPU timings for specific n's don't help with the way I did it, because using them would require analyzing where the FFT length changes. Total CPU time spent over an n-range is what helps the most. That said, an FFT-length analysis would give the most exact estimates.
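For what it's worth, the timings quoted above hint at roughly quadratic growth: ~16.5 s at n=100K versus ~107.7 s at n=260K is a ratio of about 6.5, close to 2.6^2 = 6.76. Under that assumed t(n) = c*n^2 model (an assumption, not a measured fact), the calculus version is just a one-line integral. A minimal sketch:
Code:
# Sketch: total LLR time for an n-range under an ASSUMED per-test cost
# model t(n) = c * n^2, calibrated from a single quoted timing. The n^2
# exponent, the uniform candidate density, and c are all assumptions.
n1, n2 = 100_000, 260_000
pairs = 12969                      # candidates in the file (from this thread)
c = 16.5 / 100_000 ** 2            # calibrated from ~16.5 s at n = 100000
d = pairs / (n2 - n1)              # assumed uniform pairs-per-n density

# "Calculus" estimate: integrate d * c * n^2 dn from n1 to n2.
total_integral = d * c * (n2 ** 3 - n1 ** 3) / 3

# Incremental estimate in n=10000 chunks, like the analysis above.
step = 10_000
total_chunks = sum(d * step * c * (n + step / 2) ** 2
                   for n in range(n1, n2, step))

print(f"integral: {total_integral / 86400:.2f} CPU-days")  # ~8.5
print(f"chunked:  {total_chunks / 86400:.2f} CPU-days")    # ~8.5
Both land within about half a day of the 8.027126-day figure above, which suggests the chunked approach loses very little accuracy compared to the integral.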


Gary

2008-03-26, 21:57   #25
mdettweiler
A Sunny Moo

Given all the calculations that have just transpired as to how long it takes to do a file, is everyone satisfied with the file sizes? As Gary said, since the number of candidates is listed next to each file, people can tailor their reservations to the speed of their computers, so I'm thinking the existing file sizes should be fine.

However, there is another option available to us: split up the files so that the 3<=k<400 range is available in 3-k chunks, but the 400<k<=1001 range is available in 2-k chunks. Does anyone think this would be a better way to go?
2008-03-26, 22:32   #26
Flatlander
I quite division it

The file sizes are fine for me so far. How much slower will it get as the ks get higher?
2008-03-26, 22:40   #27
kar_bon
llr-tools

Look here:
http://www.mersenneforum.org/showpos...3&postcount=36

Download the llrtools.
- Insert into 'times.txt' your CPU's timings for the given FFT lengths. For my quad 2.4 GHz I did this for the range 21-25:
Code:
    6144    0.096
    7168    0.116
    8192    0.120
   10240    0.168
   12288    0.207
   14336    0.250
- Call 'get_time.exe' with the LLR input file as its parameter. The output looks like this for mine:
Code:
--- Quad Q6600 2.4GHz ---
number of (k,n) pairs in file: 19448
estimated total time for LLR testing the whole file: 656024.274 sec
average time per LLR test: 33.732 sec
So that's 656000 secs -> ~182 hours -> 7.6 days of CPU time (about a quarter of that in wall-clock time on my quad)!

The other two programs in there:
- fft_len gives you all the FFT lengths for a given k and n-range
- av_time gives you the average time per LLR test for a given k and n-range

Try it! It's easy to use and gives you a lot of information.
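As a rough sketch of what (as I understand it) get_time does: an LLR test of k*2^n-1 runs about n iterations, so each test costs roughly n times the per-iteration time for the FFT length in effect at that n, and the file total is the sum over all (k,n) pairs. The units of times.txt and the FFT switch points below are my assumptions, not taken from the llrtools source:
Code:
# get_time-style estimate. ASSUMPTIONS: times.txt values are milliseconds
# per iteration, and the n thresholds where the FFT length switches are
# invented here purely for illustration.
TIMES_MS = {6144: 0.096, 7168: 0.116, 8192: 0.120,
            10240: 0.168, 12288: 0.207, 14336: 0.250}

# Hypothetical (upper n bound, fftlen) switch points.
FFT_BY_N = [(120_000, 6144), (150_000, 7168), (185_000, 8192),
            (230_000, 10240), (290_000, 12288), (10 ** 9, 14336)]

def fftlen_for(n):
    for upper, fftlen in FFT_BY_N:
        if n < upper:
            return fftlen
    return FFT_BY_N[-1][1]

def test_seconds(n):
    # ~n iterations, each at the per-iteration cost for this FFT length
    return n * TIMES_MS[fftlen_for(n)] / 1000.0

# Estimate a file by summing over its (k, n) pairs (toy sample here).
pairs = [(27, 100_000), (29, 100_012), (27, 259_952), (31, 259_975)]
total = sum(test_seconds(n) for _, n in pairs)
print(f"total: {total:.1f} s, average: {total / len(pairs):.1f} s/test")
With the real times.txt values and the actual FFT switch points, summing over all 19448 pairs should roughly reproduce the ~656000-second total above.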
karsten

2008-03-28, 16:10   #28
gd_barnes

Anon,

Would you mind posting the # of candidates next to the reservations also?

It'll give us an idea of how much everyone has reserved.

BTW, k=300-400 should be just as error-prone as k=400-1001. After finishing, say, k=400-450 or k=400-500, you might consider starting on k=300-400 to get it 'filled in'. k=1-300 will be the most accurate: looking at RPS's threads, it appears they've probably double-checked perhaps 25-35% of everything, but the efforts were very sporadic and spread out, and it's difficult to tell exactly what was truly double-checked.


Gary

2008-03-28, 17:53   #29
mdettweiler
A Sunny Moo

Quote:
Originally Posted by gd_barnes
Anon,

Would you mind posting the # of candidates next to the reservations also?

It'll give us an idea of how much everyone has reserved.
Okay, I'll do that. I don't have time right now, but I should be able to get to it later today.

Quote:
BTW, k=300-400 should be just as error-prone as k=400-1001. After finishing, say, k=400-450 or k=400-500, you might consider starting on k=300-400 to get it 'filled in'. k=1-300 will be the most accurate: looking at RPS's threads, it appears they've probably double-checked perhaps 25-35% of everything, but the efforts were very sporadic and spread out, and it's difficult to tell exactly what was truly double-checked.


Gary
Okay, that sounds good. I'll probably do something along those lines.
2008-04-02, 06:51   #30
gd_barnes

Reserving k=407-417 (2 files).
2008-04-03, 14:26   #31
Flatlander
I quite division it

k=15 to 19 completed.
No surprises.

Total running time on one core of a C2D @ 2925 MHz was c. 7 days 10 hrs.

I've emailed the zipped results, because they exceed the 244.1 KB attachment limit.
2008-04-03, 14:43   #32
Mini-Geek
Account Deleted

Quote:
Originally Posted by Flatlander
k=15 to 19 completed.
No surprises.

Total running time on one core of a C2D @ 2925 MHz was c. 7 days 10 hrs.

I've emailed the zipped results, because they exceed the 244.1 KB attachment limit.
How many candidates were in your range, and how much of the CPU was used? I'd like to compare speeds with my A64 X2 @ 2500 MHz. It will finish 9-13 within 35 minutes. I had a power outage and fell ~7.5 hours behind. Estimated CPU time is 204 hours (8.5 days); note this is a rather rough estimate, as I couldn't grab the CPU time just before the power outage. Real time is about 8.75 days (210 hours).
2008-04-03, 15:19   #33
Mini-Geek
Account Deleted

9-13 done. Results and primes attached. LLR had trouble writing to the lresults.txt file for a little while and put those results in the other file in the archive.
Edit: Oh yeah, and no surprises here: all the known primes were found, and only those.
Attached Files: 9-13.zip (204.3 KB)
