#89

"Lennart"
Jun 2007
2⁵×5×7 Posts

15T-16T done and emailed.

Lennart
#90

"Lennart"
Jun 2007
2⁵×5×7 Posts

Reserving 23200G-24T. ETA June 12.

Lennart
#91

"Lennart"
Jun 2007
1120₁₀ Posts

16050G-19T complete and mailed to Gary.

/Lennart
#92

I quite division it
"Chris"
Feb 2005
England
100000011101₂ Posts

7.5T-7.9T is complete. (I'll email Gary.)
#93

May 2007
Kansas, USA
24254₈ Posts
Lennart or Max,

Can you send me the current sieve file with the n=50K-400K range removed in an email, or provide me with a link to it? I'm going to use the remaining n=400K-1M range to compute the optimum sieve depth.

Lennart, as fast as you're going through the ranges, if we can make it to P=30T sometime between the 15th and 20th, I may suggest doing that, assuming that the optimum is P>25T using the above range. (I think it will be.) If we do that, I could likely do a final P=500G range that would complete in 3-4 days.

Everyone, I know it seems like we keep shooting for a moving target here, but in this case it clearly makes sense to be flexible. The target was more time-based than sieve-depth-based (because I figured we'd be in the ballpark by the 20th) in order to get some lower top-5000 tests to begin searching on the servers, but if the math says to go deeper, we'll do that, since it's within the original time constraint.

It's amazing the large efficiency gain from sieving this much larger k- and n-range all at once. None of our previous drives had n<=600K sieved above P=20T. Back when the project first started, for k=400-1001 the optimum for n=260K-600K was P=5T on 32-bit machines. For k=1003-2000, we sieved n=50K-500K and n=500K-1M completely separately, and since we were intent on "breaking off" n=500K-600K, we only used its optimum search depth, which was in the P=~20T range. I had no clue that we would blow through that range so fast, or I would have had them sieved together. I determined then that from here on out, all k-ranges would get sieved in one huge chunk, from n=20K (or whatever) to n=1M.

Now with this drive, even if you remove the n=50K-400K range, the much larger 1400k (vs. 1000k or 600k) k-range and 600K (vs. 400K or 500K) n-range make for an even higher optimum depth; likely P=25T-30T for LLRing up to n=600K. It may be P>60T for n=600K-1M. (I stopped k=1003-2000 at P=45T since we are only LLRing k=1400-2000.)

It's amazing how far the software and machines have come in just 18 months!

Gary

Last fiddled with by gd_barnes on 2009-06-10 at 09:57
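The "optimum depth" Gary is computing follows the usual break-even argument: keep sieving while the time to eliminate the next batch of candidates by finding factors is less than the time it would take to LLR-test those same candidates. Here is a minimal sketch of that rule; all the numbers (candidate count, sieve rate, LLR test time) are illustrative assumptions, not the project's actual figures, and the factor-density model is the standard Mertens-style approximation:

```python
# Break-even heuristic for choosing a sieve depth (illustrative numbers only).
# By Mertens' theorem, the fraction of candidates surviving a sieve to depth p
# shrinks like C/ln(p), so each additional factor costs more to find as p grows.

import math

def factors_removed_fraction(p_lo: float, p_hi: float) -> float:
    """Approximate fraction of remaining candidates removed by sieving [p_lo, p_hi]."""
    # Survivors at depth p ~ C/ln(p), so extending p_lo -> p_hi removes
    # roughly 1 - ln(p_lo)/ln(p_hi) of what is left.
    return 1.0 - math.log(p_lo) / math.log(p_hi)

def keep_sieving(p: float, candidates: int, sieve_rate: float, llr_secs: float) -> bool:
    """True while sieving the next slice is cheaper than LLR-testing what it removes.

    sieve_rate: p-range covered per second (assumed constant in p)
    llr_secs:   average time to LLR-test one candidate in this n-range
    """
    step = p * 0.01                      # examine the next 1% of depth
    removed = candidates * factors_removed_fraction(p, p + step)
    sieve_time = step / sieve_rate       # seconds to sieve this slice
    return sieve_time < removed * llr_secs

# Hypothetical inputs: 500,000 candidates left, 10M p/sec sieve throughput,
# 10-minute LLR tests. Walk the depth up until break-even.
p = 1e12
while keep_sieving(p, 500_000, 1e7, 600):
    p *= 1.05
print(f"break-even depth ~ {p/1e12:.1f}T")
```

The key qualitative point matches Gary's reasoning: because the removal rate only falls off like 1/ln(p) while the candidate pool and per-test cost grow with the k- and n-range, sieving one big combined range pushes the break-even depth much higher than sieving the pieces separately.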
#94

"Lennart"
Jun 2007
2140₈ Posts

20T-21T done and mailed to Gary.

Reserving 24T-25T. ETA June 13.

/Lennart

Last fiddled with by Lennart on 2009-06-11 at 00:29
#95

May 2007
Kansas, USA
2²·19·137 Posts
Sven, I am getting an error when I attempt to pull up your file using WinRAR or WinZip. It's an unusual error that I have never encountered before. Here it is:

Code:
! C:\Users\test\AppData\Local\Microsoft\Windows\Temporary Internet Files\Content.IE5\CA5ETZDF\factors%2012400-12650[1].zip: Unknown method in factors 12400-12650.txt

Can anyone attempt to pull up Sven's file? If someone can get it pulled up, please save it off and email it to me. Sven, if neither you nor anyone else can pull the file up from your posting here, I'm going to have to ask you to email me another one, unzipped. There must be something wrong with your zipping process. I tried this 4 times and got the same error each time.

Edit: I googled this error and it says I may not have the latest version of WinRAR, which I did not. I upgraded to the latest version but am still getting the same error.

Edit 2: I just now tried this on one of my Linux machines. It says: "An error occurred while extracting files." In another box, it says "unsupported compression method 98". Clearly a compression method has been chosen that even some of the newest common compression software is unfamiliar with.

After 20-30 mins. of messing with this, if anyone can enlighten me or just send the file to me, that would be great. Better yet, the file is small enough that it doesn't need to be zipped if sent in an email. Just attaching the full file to an email would work great.

Thanks,
Gary

Last fiddled with by gd_barnes on 2009-06-11 at 06:02
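For anyone hitting "unsupported compression method 98": per the ZIP format's APPNOTE, method 98 is PPMd, a method some zippers (notably WinZip's "best method" setting) can choose but most unzippers don't implement. You can diagnose an archive like this without extracting anything, since the method ID sits in the archive's central directory. A small sketch using Python's standard zipfile module (the member name here is just a stand-in for Sven's file, built in memory for the demo):

```python
# List the compression method of each member of a zip archive without
# decompressing it, so unsupported methods (e.g. 98 = PPMd) can be spotted.
import io
import zipfile

# Method IDs from the ZIP APPNOTE; Python's zipfile can only decompress
# the first four. 98 (PPMd) is widely unsupported elsewhere, too.
METHOD_NAMES = {
    zipfile.ZIP_STORED: "stored",
    zipfile.ZIP_DEFLATED: "deflate",
    zipfile.ZIP_BZIP2: "bzip2",
    zipfile.ZIP_LZMA: "lzma",
    98: "ppmd (widely unsupported)",
}

def diagnose(archive) -> list:
    """Return (name, method id, method name) for each archive member.

    `archive` may be a filename or a file-like object. Only the central
    directory is read; nothing is decompressed, so this works even on
    archives whose method the library cannot handle.
    """
    with zipfile.ZipFile(archive) as zf:
        return [(info.filename, info.compress_type,
                 METHOD_NAMES.get(info.compress_type, "unknown"))
                for info in zf.infolist()]

# Demo on an in-memory archive; the member name mirrors Sven's file.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("factors 12400-12650.txt", "some factors\n")
for name, method, label in diagnose(buf):
    print(f"{name}: method {method} ({label})")
```

If a member reports method 98, the practical fix is exactly what Gary asks for: re-zip with plain deflate (the default almost everywhere) or send the file unzipped.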
#96

May 2007
Kansas, USA
2²·19·137 Posts

I just now got the file from Karsten. Karsten, what compression software are you using? I'm still baffled.
#97

Aug 2008
Good old Germany
3×47 Posts

Quote:

But am I right that you now have the correct file?
#98

Mar 2006
Germany
2²·727 Posts

Quote:

It has a packer/unpacker included, so this tool can handle a zip/rar/arj/7z like a folder: insert/delete/copy from a compressed file, as easy as can be! I unpacked Sven's file (no problem) and packed it again with this tool.

BTW: I'm using this for FTP upload to my database, too!

PS: Factors for 6500G-7000G sent to Gary!

Last fiddled with by kar_bon on 2009-06-11 at 07:10
#99

May 2007
Kansas, USA
2²×19×137 Posts

Quote:

That is very strange. What would you consider the "normal" method? The latest version of WinRAR should be able to handle anything that WinZip puts out.

Yes, Karsten sent me the file. Thanks for checking.

Compression software has become way overly complicated these days. I miss the days of only WinZip.
Similar Threads

| Thread | Thread Starter | Forum | Replies | Last Post |
| Team drive #10 k=1400-2000 n=500K-1M | gd_barnes | No Prime Left Behind | 61 | 2013-01-30 16:08 |
| Team drive #12 k=2000-3000 n=50K-425K | gd_barnes | No Prime Left Behind | 96 | 2012-02-19 03:53 |
| k=2000-3400 k's to be pulled from upcoming drives | gd_barnes | No Prime Left Behind | 11 | 2009-06-12 21:28 |
| Sieving drive for k=1003-2000 n=500K-1M | gd_barnes | No Prime Left Behind | 160 | 2009-05-10 00:50 |
| Sieving drive for k=1005-2000 n=200K-500K | gd_barnes | No Prime Left Behind | 118 | 2009-01-17 16:05 |