[QUOTE=chalsall;633356]Can anyone on the Primenet side of the house build a simple query and expose the data as a CSV et al for those who "Don't Get Out Much"[SUP]TM[/SUP]?[/QUOTE]
To be clear, the data point of interest is "what proportion of exponents that have [STRIKE]been fully TF'ed to xx bits[/STRIKE] had TF run at the xx-bit level have a factor with exactly xx bits, [B]found by any method[/B]?" As you mention, if some of the higher TF is done after P-1 has already been run, then it's not surprising that there's an apparent drop in TF success rate at these bit levels.
[QUOTE=charybdis;633358]To be clear, the data point of interest is "what proportion of exponents that have [STRIKE]been fully TF'ed to xx bits[/STRIKE] had TF run at the xx-bit level have a factor with exactly xx bits, [B]found by any method[/B]?"[/QUOTE]
I think the optimal dataset would be all knowledge that might aid in analysis. CSV, TSV, et al. would be fine; it doesn't need to be JSON or XLS et al. This is just a simple 2D matrix. The datasets would then be run through code doing the analysis. Great fun. [QUOTE=charybdis;633358]As you mention, if some of the higher TF is done after P-1 has already been run, then it's not surprising that there's an apparent drop in TF success rate at these bit levels.[/QUOTE] It might be interesting to see whether an optimization is possible if one runs P-1 one level down from optimal TF, as /some/ have claimed. (A call out to LaurV...)
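As a minimal sketch of what that analysis code might look like, assuming a hypothetical CSV export (the file name and column names below are placeholders, not actual Primenet fields), one could tabulate the per-bit-level TF success rate with pandas:

[CODE]
import pandas as pd

# Hypothetical export: one row per exponent, with the deepest TF level
# attempted (tf_bits_done), the bit length of any known factor
# (factor_bits, empty if none), and the finding method (found_by).
df = pd.read_csv("primenet_export.csv")

# For each bit level xx: among exponents that had TF run at the
# xx-bit level, what fraction has a factor of exactly xx bits,
# found by any method?
for xx in range(65, 80):
    tried = df[df["tf_bits_done"] >= xx]
    if len(tried) == 0:
        continue
    found = (tried["factor_bits"] == xx).sum()
    rate = found / len(tried)
    print(f"{xx} bits: {found}/{len(tried)} = {rate:.5f}")
# Each rate can then be compared against the 1/n heuristic
# discussed later in the thread.
[/CODE]

The point of the loop structure is that the same exponent counts toward every level it was TF'ed through, which is what the "had TF run at the xx-bit level" definition above requires.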
[QUOTE=chalsall;633363]I think the optimal dataset would be all knowledge that might aid in analysis.[/QUOTE]
I note no one has responded to this. Not that I expected anyone to do so. I am comfortable standing alone.
As the issue of 'unlucky' factoring (i.e. missed factors) is potentially quite important to the project, I felt it necessary to get some data. For an unbiased and large sample, I counted all the exponents in the range 120-130M with factors at the 76-77 bit level. The result falls within statistical expectation, both overall and within each million. With TF in that range not quite complete, the totals are 2615 exponents counted against 2584 expected. There is [B]no[/B] shortfall. Chris's (very old) data do not seem to disagree.
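For what it's worth, here is a quick sanity check on those totals (a sketch only; it treats each "exponent has a factor at this level" as an independent rare event, so the count is approximately Poisson):

[CODE]
import math

# Counts from the tally above: exponents in 120M-130M with a
# factor at the 76-77 bit level.
observed = 2615
expected = 2584

# Under a Poisson approximation, the standard deviation of the
# count is about the square root of its mean.
sigma = math.sqrt(expected)            # ~50.8
z = (observed - expected) / sigma      # ~+0.61

print(f"excess: {observed - expected:+d}, sigma: {sigma:.1f}, z: {z:+.2f}")
# |z| well under 2, i.e. well within statistical expectation.
[/CODE]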
Now if he has no better data, I'd have to say it was irresponsible for Uncwilly to spread such a claim. If there [I]were[/I] such a shortfall, it would indeed be cause for investigation and not something to be simply accepted. Although it is only obvious when actually worked out, the two 'corrections' I mentioned to the probability cancel exactly: the chance of at least one factor at the n-bit level is exactly 1/n, and more generally the chance of at least one factor between B1 and B2 bits (or digits in any other base) is 1 - B1/B2. Note that the GIMPS 'The Math' page gets this wrong, twice saying the probability of a factor between x and x+1 bits is 1/x when it should be 1/(x+1), though the telescoping product it shows is correct.
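Spelling out the telescoping product referred to above (in LaTeX; this just restates the heuristic from the post, nothing new):

[CODE]
% Take the chance of no factor at the n-bit level to be 1 - 1/n.
% Then the chance of no factor anywhere between B1 and B2 bits is
P(\text{none}) = \prod_{n=B_1+1}^{B_2} \left(1 - \frac{1}{n}\right)
               = \prod_{n=B_1+1}^{B_2} \frac{n-1}{n}
               = \frac{B_1}{B_2},
% so P(at least one) = 1 - B1/B2. For a single level (B1 = x,
% B2 = x+1) this gives 1/(x+1), not the 1/x stated on 'The Math' page.
[/CODE]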
[QUOTE=Andrew Usher;633430]Chris's (very old) data do not seem to disagree.[/QUOTE]
I agree. The observations demonstrate that the empirical data agree with the theorists. And, just to be pedantic, it is Chris' very old data... James suffers the same thing. James' work is quite impressive. Some do not get some humor. I enjoy working with those who do.
On that point of grammar, there is no such rule as you imply. Every source I've ever read (including those I just found with Google) says the [I]Chris's[/I] version is at least acceptable, if not preferred, and agrees that it is more common in actual use (in writing, and surely even more in speech - I can't remember ever hearing the [I]Chris'[/I] type from anyone, and it would certainly sound odd). Some list exceptions, but none relevant here. And I have always preferred it, in writing and speech - I would even say "Achilles's heel" in the literal sense, while the metaphor preserves the older "Achilles' heel" (and might, I say, be as well written without the apostrophe at all).
Anyway, the count I made is definitive and disproves Uncwilly's strange apparent claim that the TF success rate falls off above 75 bits in general.
[QUOTE=Andrew Usher;633519]Anyway, the count I made is definitive and disproves Uncwilly's strange apparent claim that the TF success rate falls off above 75 bits in general.[/QUOTE]It was not strange. It was based on an assertion by someone who had their hands on more data than I did, and it was not made just once by them. It also fit with my much more limited personal experience / data set in the 332M range, and with my factoring luck and that of another user. My found factors per bit level fall off a cliff at the higher levels: I had run over 550 TF assignments in that range, over almost 11 months, with ZERO factors found. That, together with what was previously stated by someone who makes their living with big data sets, led me to believe that their assertion was likely true.
[QUOTE=Uncwilly;633532]That, together with what was previously stated by someone who makes their living with big data sets, led me to believe that their assertion was likely true.[/QUOTE]
My sincere apologies for misguiding you. What I stated at the time, I believed to be true. Further, a drill-down on the empirical data shows that the theorists were correct. I was incorrect; I was drawing a conclusion from a small, and biased, dataset. To put it on the table: it is OK to be incorrect, so long as you're willing to admit it once it has been demonstrated to you. This is how we learn.
OK, I accept your apologies - but now you know to trust the math first, I hope. My time counting those factors wasn't totally wasted, as it really does deserve attention to verify that we are not missing many factors. It should be of some concern that all GPU TF relies on one code base that is not actively maintained and probably has no ability to detect hardware errors.
[QUOTE=Andrew Usher;633676]OK, I accept your apologies - but now you know to trust the math first, I hope.[/QUOTE]
Oh... My... Word. That is quite possibly the funniest thing I've read in quite some time. Thank you for that.
[QUOTE=chalsall;633677]Thank you for that.[/QUOTE]
Just accept it. And move on. Ideally we keep personalities (egos) out of this as much as possible.