#386
"Aleksandr"
Nov 2005
Russia
2×3²×5 Posts

Quote:
M100000007 interim Wc1 residue C9EC34F5A4868D9D at iteration 81000000
M100000007 interim Wc1 residue 89ED9307520EC9B5 at iteration 82000000
M100000007 interim Wc1 residue B95F28BE15BCF118 at iteration 83000000
M100000007 interim Wc1 residue C579A44EE1E8A267 at iteration 84000000
M100000007 interim Wc1 residue E7D5892EF9B96473 at iteration 85000000
M100000007 interim Wc1 residue B601A50BD1A9B820 at iteration 86000000
M100000007 interim Wc1 residue 9A5D2155D4F1DA64 at iteration 87000000
M100000007 interim Wc1 residue 656CC0B5E907F76B at iteration 88000000
M100000007 interim Wc1 residue 7C1977531DB496E0 at iteration 89000000
M100000007 interim Wc1 residue 78BB4E28340D1C3C at iteration 90000000
M100000007 interim Wc1 residue 0BA6D8CEF8074184 at iteration 91000000
M100000007 interim Wc1 residue 29E374CF3884D508 at iteration 92000000
M100000007 interim Wc1 residue 98AF4F33CE8144EB at iteration 93000000
M100000007 interim Wc1 residue D0A63C48001000E9 at iteration 94000000
M100000007 interim Wc1 residue 2FDC36A96443E63F at iteration 95000000
M100000007 interim Wc1 residue 1398D2CE943EF2CE at iteration 96000000
M100000007 interim Wc1 residue 91FBB561B5F611E7 at iteration 97000000
M100000007 interim Wc1 residue 29914F42E3FB4874 at iteration 98000000
M100000007 interim Wc1 residue 4C902F2158EE3836 at iteration 99000000
#387
Jan 2006
Tampa, Florida
2×97 Posts
Thanks!!!
#388
Jan 2006
Tampa, Florida
2×97 Posts
During P-1 factoring of M332,192,831, I encountered a strange phenomenon. When stage 1 was exactly 20.950553% complete, the factoring slowed down by a factor of 4. The number of iterations between screen outputs has been set to 1 the entire time, but at exactly 20.950553% it began displaying every 30-35 seconds instead of every 1 second, as it had up to that point. Although the counter increased by 8 for each display, this still results in a factoring time 4 times longer than originally estimated. I flat out refuse to spend a year P-1 factoring for a 3.22% chance of finding a factor.

I will attempt P-1 factoring with other, manually entered parameters (Prime95 automatically selected B1=4,880,000 and B2=4,880,000) and see what happens. However, I think George should know about this, as it may be a bug in the software. Nobody has ever ventured into this region of testing before, so this is a brand-new observation. If it is a bug, I cannot proceed to test M332,192,831 until it is fixed.

I have attached a jpeg image of Prime95 at the point of this transition. Note that I never had any such problem when I tested (and P-1 factored) M100,000,007. This is the first time I have ever seen or heard of this.
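The slowdown described above can be sanity-checked with quick arithmetic. This is a sketch only: it reads "increased by 8 for each display" as 8 iterations per screen output, which is an assumption, not something the post states explicitly.

```python
# Sanity check of the slowdown described above. Assumptions (not from
# Prime95 itself): each screen output after the slowdown covers 8
# iterations, and the display interval is the reported 30-35 seconds.
display_interval_s = (30 + 35) / 2   # midpoint of the reported 30-35 s interval
iterations_per_display = 8           # counter advanced by 8 per display (assumed)
old_rate_s = 1.0                     # 1 iteration per second before the slowdown

new_rate_s = display_interval_s / iterations_per_display
slowdown = new_rate_s / old_rate_s
print(f"{new_rate_s:.2f} s/iteration, about {slowdown:.1f}x slower")
```

Under those assumptions the effective rate comes out to roughly 4 seconds per iteration, consistent with the "factor of 4" reported above.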
Last fiddled with by StarQwest on 2007-05-23 at 05:11
#389
Dec 2003
Hopefully Near M48
2·3·293 Posts
This may sound obvious, but have you tried rebooting your computer? If that doesn't work, maybe try cleaning up temporary files?
#390
Banned
"Luigi"
Aug 2002
Team Italia
1001011001111₂ Posts
Sounds like a hard disk thrashing...
Luigi
#391
Jan 2006
Tampa, Florida
2·97 Posts
I have tried cleaning up temporary files, rebooting my computer, and defragmenting my hard drive. I first noticed it a few days ago at about 22% (I hadn't checked it since 20%). I then exited the program, worked on my computer to clean things up a bit, and restarted the program from the saved backup at 18%, watching closely to see if it would happen again. It happened at 20.950553%. The first time, I thought it only happened when I exited and restarted the program (because I never actually saw the transition), but this time I saw the transition, and it had nothing to do with anything I did. I am now performing iterations on M332,192,831 to see what will happen. So far the first 30,000 iterations have been performed flawlessly at a rate of 0.943 seconds per iteration.
#392
Dec 2003
Hopefully Near M48
2×3×293 Posts
Does the transition happen at the same percentage completion each time?
#393
Jan 2006
Tampa, Florida
302₈ Posts
Quote:
Yeah. I think maybe B1 was just set way too high by Prime95. I am now rerunning the P-1 test with B1=1,000,000 and B2=100,000,000. This will allow stage 1 to finish in just 18 days (instead of 90 days as before). Also, I think B1 is supposed to be smaller than B2, although I am not sure. I will see whether this happens at the same percentage with these parameters (it will reach 20.95% in about 3 days). The first 30,000 iterations of the LL test performed flawlessly. Hopefully there will be no further complications, or I may have to stop the test until a newer version of Prime95 is released.
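The 90-day versus 18-day estimates are consistent with stage 1 cost scaling roughly linearly in B1. The sketch below assumes that linear model; it is an approximation, not something Prime95 reports.

```python
# Stage 1 of P-1 does roughly one squaring per bit of the product of prime
# powers up to B1, so its running time scales approximately linearly with B1.
# The B1 values and the 90-day estimate come from the posts above.
b1_auto, days_auto = 4_880_000, 90   # Prime95's chosen B1 and its time estimate
b1_manual = 1_000_000                # manually entered smaller B1

days_manual = days_auto * b1_manual / b1_auto
print(f"estimated stage 1 time: {days_manual:.1f} days")  # about 18 days
```

The linear model predicts about 18.4 days, matching the "just 18 days" figure quoted above.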
#394
"Richard B. Woods"
Aug 2002
Wisconsin USA
2²·3·641 Posts
Specifying B1 = B2 (or Prime95 choosing B1 = B2) is just the way of communicating "do only Stage 1".
If Prime95 is choosing the bounds, it'll never choose B1 > B2. (If Prime95 gets B1 > B2 specified by the user, it just resets B2 equal to B1, and then does Stage 1 as though B1 = B2 were specified.)

Quote:

For comparison, for M33,2xx,xxx exponents it chose B1=B2=495,000, 500,000 or 550,000 (probably depending on the type of CPU) for the cases where available memory wasn't enough to justify a Stage 2.

It means that Prime95 calculates that those are the most efficient limits to use for P-1 to maximize the overall GIMPS throughput for that exponent. For instance, if spending 1% of the total L-L time on factoring gives a 2% chance of finding a factor, then that's a bargain. In fact, it would be worth spending up to 2% of the total L-L time on factoring in order to have a 2% chance of finding a factor and thus eliminating the need for the L-L. Also, by "total L-L time" I mean the time required to do both the first-time and double-check L-L, so it's worth spending up to 4% of the time for a single L-L on factoring (before the first L-L) if there's a 2% chance of finding a factor.

Using B1=1,000,000 and B2=100,000,000 will have a lower efficiency ratio in this respect than B1=B2=4,880,000. Either it will take longer without an equally greater chance of finding a factor, or it will have a lower chance of finding a factor without saving enough time to make up for that lower chance. However, the difference in efficiency may be fairly small, and ... your satisfaction from doing the calculation the way you prefer outranks GIMPS efficiency.

Last fiddled with by cheesehead on 2007-05-24 at 20:53
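The break-even reasoning above can be written as a one-line formula. This is an illustrative sketch: the function name and the example numbers are made up for illustration, while the doubling for the double-check follows the post's reasoning.

```python
# Break-even budget for P-1 factoring, per the reasoning above: spending up
# to p * (total L-L time) on factoring pays off on average, where p is the
# chance of finding a factor and the total covers both the first-time L-L
# test and the double-check.
def max_worthwhile_factoring_time(p_factor, single_ll_time, ll_tests=2):
    """Largest factoring-time budget that still pays off on average."""
    return p_factor * single_ll_time * ll_tests

# A 2% chance of a factor justifies up to 4% of a single L-L test's time:
budget = max_worthwhile_factoring_time(0.02, single_ll_time=100.0)
print(budget)  # 4.0, i.e. 4% of one L-L test
```

With a first-time test plus a double-check, the budget is twice the single-test figure, which is why the post arrives at 4% rather than 2%.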
#395
"Richard B. Woods"
Aug 2002
Wisconsin USA
2²×3×641 Posts
Quote:
I hope you can give it more than 2000 MB. For 33,xxx,xxx exponents and B2 = ~100,000,000 I give it 1500 MB, and it could use lots more.

Quote:

Last fiddled with by cheesehead on 2007-05-24 at 21:23
#396
Jan 2006
Tampa, Florida
302₈ Posts
Based on that, I only have 1000 MB of RAM available, so I will revert to the saved backup at 20% using B1=4,880,000 and B2=4,880,000. Once it slows down at 20.95%, I expect it to take 4-6 months to finish stage 1. Since it will not do stage 2, this will be acceptable, and if no factor is found, I will perform the LL test.