#1288
"Richard B. Woods"
Aug 2002
Wisconsin USA
7692₁₀ Posts
Even more reason to study human nature now, so you'll know what you're messing with (or someone else is messing with).
Last fiddled with by cheesehead on 2012-05-12 at 03:39
#1289
Basketry That Evening!
"Bunslow the Bold"
Jun 2011
40<A<43 -89<O<-88
3×29×83 Posts
#1290
Aug 2002
Dawn of the Dead
235₁₀ Posts
46 retests done on my range M83xxxxx: 1 factor found, in Stage 1 (B1=95000).
I am surprised. I expected to find one or two in Stage 2, given that the first runs were very badly done and back then most people could not allocate close to 2 GB of RAM (back then a hot rod would have been a P3 Katmai, perhaps with 256 MB total RAM). Stage 1 should have found this the first time around, or what gives? 1410 tests pending ...
#1291
"James Heinrich"
May 2004
ex-Northern Ontario
23·149 Posts
Quote:
The "poorly done" P-1s may or may not have had stage 2 done, but either way the factor probability for that batch is somewhere just over 2%. Assuming you're giving sufficient RAM to achieve a nominal 5% probability, you'd expect around 3% to have factors (since 2% have already been found by the first round of P-1) -- 46 * 0.03 = 1.38, which is pretty close to the 1 you've found already. The ~3% success rate isn't exciting, except by virtue of these exponents being small and being able to chew through a hundred or so of them per day (depending on what you're running it on, naturally).
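The expected-factor arithmetic above can be sketched as a quick back-of-envelope check. The 2% and 5% probabilities are the figures assumed in the post, not values queried from PrimeNet:

```python
# Back-of-envelope check of the expected-factor arithmetic above.
# Assumptions (from the post): the first-pass P-1 already found ~2% of
# factors, and a well-bounded re-run has a ~5% nominal chance per exponent.

first_pass_prob = 0.02   # probability the original (poor) P-1 found a factor
nominal_prob = 0.05      # probability a well-bounded P-1 finds a factor
retests = 46             # exponents re-tested in the range

# A retest can only find factors the first pass missed:
residual_prob = nominal_prob - first_pass_prob          # ~3%
expected_factors = retests * residual_prob

print(f"residual probability per retest: {residual_prob:.2%}")
print(f"expected factors in {retests} retests: {expected_factors:.2f}")
```

Finding 1 factor in 46 retests is therefore right in line with expectation.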
#1292
"James Heinrich"
May 2004
ex-Northern Ontario
23×149 Posts
#1293
Aug 2002
Dawn of the Dead
5·47 Posts
James,
It isn't the incidence of factors; rather, it is the finding of a factor in Stage 1. This should have been harvested as low-hanging fruit the first time P-1 was run. Perhaps the client should do more in Stage 1 if there is insufficient RAM for a good Stage 2? What I observed was a factor with B1=95000; the data on your page indicated B1=65000 for the first run. I'll continue doing my range, and I may try a hundred or so using the Pminus1 argument. Can you comment on bounds? I have 1920 MB RAM allocated; server-assigned default M5xxxxxxx assignments grab 1880 MB (the other core uses some for LL testing), and the M83xxxxx tests default to 1735 MB, for bounds of B1=95000 and B2=1686250.
#1294
"James Heinrich"
May 2004
ex-Northern Ontario
6543₈ Posts
Quote:

Quote:

B1=600000, B2=14375000 -> 6.00% chance of factor, 4.35 GHz-days

but, if you have insufficient RAM and therefore run Stage 1 only, it would be something around:

B1=700000 -> 3.00% chance of factor, 2.25 GHz-days

Half the effort, half the chance of factor, and only slightly higher B1 than normal.

Quote:

My comment on bounds: let Prime95 do its thing. It's a complex process to select optimal bounds (which I have glossed over above), and setting them manually without a very specific reason almost always results in worse bounds selection than letting Prime95 figure it out.
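The trade-off quoted above can be put side by side. A small sketch using the numbers from the post; the cost-per-expected-factor metric is my own illustrative yardstick, not something Prime95 reports:

```python
# Comparing the two bound choices quoted above by cost per expected factor.
# The probabilities and GHz-day figures are taken from the post.

scenarios = {
    "stage 1 + stage 2 (B1=600000, B2=14375000)": (0.0600, 4.35),
    "stage 1 only (B1=700000)":                   (0.0300, 2.25),
}

for name, (prob, ghz_days) in scenarios.items():
    cost_per_factor = ghz_days / prob  # GHz-days spent per factor found, on average
    print(f"{name}: {cost_per_factor:.1f} GHz-days per expected factor")
```

On these numbers the two options cost roughly the same per expected factor (72.5 vs 75.0 GHz-days), which is why skipping stage 2 is a reasonable fallback when RAM is short rather than a waste of effort.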
#1295
Oct 2011
679₁₀ Posts
Another thing to think about with the bounds: a lot of times a machine that had too little memory will have run P-1 with a higher B1 than would be used by default. I had several B1=B2 exponents that I had to run as Pminus1, since PrimeNet will give no credit if my B1 < the last run's B1, even if I have a B2 that is 10x the old B1.
On an FYI point, the 7M to 8M(+?) range also seems to have been during a possible 'not up to snuff' programming time, since there are a fair number of factors found that should have been caught before. I even found one factor that was missed by both P-1 and TF when originally checked.
#1296
"James Heinrich"
May 2004
ex-Northern Ontario
3427₁₀ Posts
I know the issue exists; what's not clear is whether the problem lies with a) factors being found but not reported correctly; b) factors found and reported but not stored correctly and/or lost over time; or c) bad code in Prime95 not finding factors it should. I know I've found a dozen such factors myself.
#1297
Aug 2002
Dawn of the Dead
5·47 Posts
The exponent in question is M8360353.
I'm planning to fire up another batch in a week or so. I'll do a few hundred or so, and we shall see the results. There is another observation: M57078733 just started its run, and (as normal) the client prints "Using Pentium4 Type-0 FFT length 3M, Pass1=768, Pass2=4K". For the M83xxxxx tests it used a Core2 FFT (details were not recorded). Of course, if a particular code path is faster, the client should check for and use the best one. I'm wondering if anyone else has seen this behavior.

Quote:
#1298
"James Heinrich"
May 2004
ex-Northern Ontario
110101100011₂ Posts
That factor is one of those that bcp19 mentioned: it should have been found with the original P-1 bounds, and can be found in Stage 1 with a B1 >= 63059 (or in Stage 2 with B1 >= 4073 and B2 >= 63059). It should've been found; maybe it was, but the record was lost for whatever reason, as mentioned above between bcp19 and me.
If you submit your results.txt to the site you'll see how the bounds overlap between the original P-1 and your more recent one that (re-)found the factor. Unfortunately PrimeNet throws away the data on what bounds were used when a P-1 factor is found, so I can't graph that data without the user submitting the results for F-PM1 results.
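Those "minimum bounds to find this factor" figures come from the smoothness of f-1. A toy sketch of the idea, under simplifying assumptions: it ignores prime powers and the free factor of p that Prime95 gets from f = 2kp+1, and the factor 421 below is a hypothetical illustration, not the actual factor of M8360353:

```python
# P-1 finds a factor f of 2^p-1 in stage 1 when f-1 is B1-smooth, or in
# stage 2 when f-1 is B1-smooth apart from one prime <= B2.  This toy
# version ignores prime powers and the known factor p of f-1.

def prime_factors(n):
    """Trial-division factorization; fine for the small f-1 values here."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

def min_p1_bounds(f):
    """Return (stage1_B1, stage2_B1, stage2_B2) needed to find factor f."""
    primes = sorted(prime_factors(f - 1))
    largest = primes[-1]
    second = primes[-2] if len(primes) > 1 else 1
    # Stage 1 alone needs B1 >= largest prime factor of f-1;
    # stage 2 relaxes B1 to the second-largest and needs B2 >= largest.
    return largest, second, largest

# Hypothetical illustration: f = 421, f-1 = 2^2 * 3 * 5 * 7,
# so stage 1 needs B1 >= 7, or stage 2 needs B1 >= 5 and B2 >= 7.
print(min_p1_bounds(421))  # -> (7, 5, 7)
```

Running this on the actual factor of M8360353 is how one arrives at figures like the B1 >= 63059 / B2 >= 63059 quoted above.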