Unlucky in l̶o̶v̶e̶ factoring?
Since 4-Oct I've completed 195 TF, P-1, and ECM runs, totaling over 5,600 GHz-days, without finding a factor. I'm glad I don't wager in Vegas...
|
[QUOTE=mrh;532860]Since 4-Oct I've completed 195 TF, P-1, and ECM runs, totaling over 5,600 GHz-days, without finding a factor. I'm glad I don't wager in Vegas...[/QUOTE]
Hang in there. My record (and I'm sure I'm not the first) is over 500 TF runs without a factor. However, my lifetime average is about 1/100. |
[QUOTE=mrh;532860]Since 4-Oct I've completed 195 TF, P-1, and ECM runs, totaling over 5,600 GHz-days, without finding a factor. I'm glad I don't wager in Vegas...[/QUOTE]
In what exponent range? |
[QUOTE=Gordon;532902]In what exponent range?[/QUOTE]
All over, but mostly from 100M to 200M. I find numbers I like, do some factoring, then a PRP test. Like [M]200002007[/M], for example. |
[QUOTE=mrh;532860]Since 4-Oct I've completed 195 TF, P-1, and ECM runs, totaling over 5,600 GHz-days, without finding a factor. I'm glad I don't wager in Vegas...[/QUOTE]
All of your factoring luck got sucked up and funneled into your PRP luck! :) [url]https://www.mersenne.org/report_exponent/?exp_lo=3467&full=1[/url] |
[QUOTE=PhilF;532926]All of your factoring luck got sucked up and funneled into your PRP luck! :)
[url]https://www.mersenne.org/report_exponent/?exp_lo=3467&full=1[/url][/QUOTE] True, I shouldn't complain :) |
Broke the streak: [M]133331333[/M] finally :) And such a good-looking number, I thought for sure it would be the one...
|
The truth about ECM: lifetime, I have found 20 factors in 15,140 attempts. This is 0.13%.
|
[QUOTE=storm5510;533100]The truth about ECM: lifetime, I have found 20 factors in 15,140 attempts. This is 0.13%.[/QUOTE]
IMHO, when it comes to ECM a more accurate measurement is factors per curve. When I run ECM I will often edit worktodo.txt and increase the curve count to at least 10, sometimes 50 or more. With that I have an almost 1% success rate. |
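For anyone curious how that worktodo.txt edit looks in practice, here is a minimal sketch. It assumes the common Prime95 entry layout ECM2=k,b,n,c,B1,B2,curves with the curve count as the last field; real entries can carry an assignment ID or extra trailing fields, so treat this as illustrative rather than a drop-in tool.
[CODE]# Sketch: raise the curve count on ECM2 lines in worktodo.txt.
# Assumes the final comma-separated field is the curve count.
from pathlib import Path

def bump_curves(path="worktodo.txt", curves=10):
    out = []
    for line in Path(path).read_text().splitlines():
        if line.startswith("ECM2="):
            fields = line.split(",")
            fields[-1] = str(curves)  # replace the curve count
            line = ",".join(fields)
        out.append(line)
    Path(path).write_text("\n".join(out) + "\n")

bump_curves(curves=10)[/CODE]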
[QUOTE=petrw1;533103]IMHO...With that I have an almost 1% success rate.[/QUOTE]
For many years, I used what the server gave me. I really did not study the worktodo format until a few years ago. With this in mind, the vast majority of what I ran was three curves. I still do this with my Linux system; I allow [I]mprime[/I] to run them as presented. I have run higher curve counts on my Windows system, most of which were preps for GMP-ECM. I stopped doing this, as GMP-ECM's results format is not compatible with [I]Primenet[/I]. |
This is unbelievably bad luck
[url]https://www.mersenne.ca/status/tf/0/365/4/90000[/url]
One full year of progress: 900.0 has over 2,000 exponents TF'd from 71 to 80 bits with 1 factor reported; we would expect about 200 (see the sketch below). 900.1 has over 2,000 exponents TF'd from 71 to 80 or 82 bits with 7 factors reported; we would expect over 200. I don't believe the factors were simply reported prior to 1 year ago, because the remaining unfactored count is much too high. |
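A quick sketch of where the "about 200" comes from: trial factoring from bit level B1 to B2 is expected to factor a fraction 1 - B1/B2 of the exponents tried (the telescoping-product rule discussed later in the thread).
[CODE]# Expected factors from TF'ing ~2,000 exponents from 71 to 80 bits.
exponents = 2000
print(exponents * (1 - 71 / 80))  # ~225 expected, vs. 1 reported[/CODE]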
900.0M - 900.1M
2016-01-15 ~ 2016-12-17: 34 / (70t71 388; 70t72 101) 17.6
2016-12-24 ~ 2017-12-17: 10 / (70t71 534; 78t79 64) 8.4
2017-12-31 ~ 2018-12-17: 3 / (70t71 229; 78t79 29) 3.6
2019-01-07 ~ 2019-12-17: 1 / (71t80 1151; 72t80 638) 193.2
It is possible that some people submitted only factors, without no-factor results, before 2016.
900M - 901M: 27325 factored, 48676 exponents, ratio 0.5613; not much different from other ranges.
900.0M - 900.1M: 2748 factored, 4854 exponents, ratio 0.5661; not much different from other ranges. |
[QUOTE=wreck;533151]900.0M - 900.1M
2016-01-15 ~ 2016-12-17: 34 / (70t71 388; 70t72 101) 17.6
2016-12-24 ~ 2017-12-17: 10 / (70t71 534; 78t79 64) 8.4
2017-12-31 ~ 2018-12-17: 3 / (70t71 229; 78t79 29) 3.6
2019-01-07 ~ 2019-12-17: 1 / (71t80 1151; 72t80 638) 193.2
It is possible that some people submitted only factors, without no-factor results, before 2016.
900M - 901M: 27325 factored, 48676 exponents, ratio 0.5613; not much different from other ranges.
900.0M - 900.1M: 2748 factored, 4854 exponents, ratio 0.5661; not much different from other ranges.[/QUOTE]
The following table shows all 900.x ranges with similar Pct factored; however, .0 and .1 are 9 or 10 TF bits deeper. Yet 900.3, at only 71 bits, shows a higher percentage than .0 or .1. Each additional bit (B) is expected to factor about 1/(B+1) of the remaining exponents, so by 80 bits the percent factored should be roughly 62% (see the sketch below).
[CODE]Range    Exp    Fact   Unfact  Pct    TF Bits
900.0M   4,854  2,748  2,106   56.6%  80
900.1M   4,831  2,760  2,071   57.1%  80/82
900.2M   4,856  2,683  2,173   55.3%  71
900.3M   4,878  2,788  2,090   57.2%  71
900.4M   4,946  2,771  2,175   56.0%  71
900.5M   4,830  2,751  2,079   57.0%  71
900.6M   4,870  2,721  2,149   55.9%  72
900.7M   4,857  2,697  2,160   55.5%  71
900.8M   4,855  2,700  2,155   55.6%  71
900.9M   4,899  2,706  2,193   55.2%  71[/CODE] |
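The sketch behind the "roughly 62%" projection: each extra bit B factors about 1/(B+1) of the still-unfactored exponents, so going from 71 to 80 bits shrinks the unfactored fraction by the telescoping product 71/72 x 72/73 x ... x 79/80 = 71/80.
[CODE]# Project the factored percentage after deeper TF, per the 1/(B+1) rule.
def pct_factored_after(pct_now, start_bits=71, target_bits=80):
    unfactored = (1 - pct_now) * start_bits / target_bits
    return 1 - unfactored

print(f"{pct_factored_after(0.572):.1%}")  # ~62.0% for a range at 57.2% now[/CODE]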
[QUOTE=petrw1;533169]
[CODE]Range    Exp    Fact   Unfact  Pct    TF Bits
900.0M   4,854  2,748  2,106   56.6%  80
900.1M   4,831  2,760  2,071   57.1%  80/82
900.2M   4,856  2,683  2,173   55.3%  71
900.3M   4,878  2,788  2,090   57.2%  71
900.4M   4,946  2,771  2,175   56.0%  71
900.5M   4,830  2,751  2,079   57.0%  71
900.6M   4,870  2,721  2,149   55.9%  72
900.7M   4,857  2,697  2,160   55.5%  71
900.8M   4,855  2,700  2,155   55.6%  71
900.9M   4,899  2,706  2,193   55.2%  71[/CODE][/QUOTE]
That is bad. Do the Primenet and mersenne.ca admins know? Were those TF ranges done by one person? One piece of software? One piece of hardware? |
However, according to Primenet these are the actual exponent counts (sorted by the maximum TF bitlevel):
900.0M - 900.1M
71: 1080, 72: 709, 74: 114, 75: 2, 76: 43, 77: 12, 78: 3, 79: 139, 80: 3, 86: 1
900.1M - 900.2M
71: 1641, 72: 275, 74: 82, 76: 19, 77: 8, 78: 23, 79: 22, 80: 1
So I suspect that the data in the (mersenne.ca) tables may have been corrupted sometime in the past. |
[QUOTE=Anonuser;533177]So I suspect that the data in the (mersenne.ca) tables may have been corrupted sometime in the past.[/QUOTE]
I thought briefly about redoing the factoring, until I calculated the work required. Assuming TF to 71 was accurate (and it may NOT be):
1996 exponents to take from 71 to 80, at about 271 GHz-days each = 540,916 GHz-days
970 exponents to take from 71 to 82, at about 1088 GHz-days each = 1,055,360 GHz-days
TOTAL: about 1.6M GHz-days
Almost exactly 1 year for my 2080 Ti; closer to 4 years for a 1080 Ti. |
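The arithmetic behind that estimate, recapped (the implied ~4,400 GHz-days/day for a 2080 Ti is inferred from the post itself, not an independent benchmark):
[CODE]# Reproduce the work estimate with the figures from the post.
to_80 = 1996 * 271     # 540,916 GHz-days for 71 -> 80
to_82 = 970 * 1088     # 1,055,360 GHz-days for 71 -> 82
total = to_80 + to_82  # 1,596,276, i.e. ~1.6M GHz-days
print(total, total / 365)  # ~4,374 GHz-days/day to finish in one year[/CODE]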
[QUOTE=petrw1;533178]I thought briefly about redoing the factoring, until I calculated the work required. Assuming TF to 71 was accurate (and it may NOT be):
1996 exponents to take from 71 to 80, at about 271 GHz-days each = 540,916 GHz-days
970 exponents to take from 71 to 82, at about 1088 GHz-days each = 1,055,360 GHz-days
TOTAL: about 1.6M GHz-days
Almost exactly 1 year for my 2080 Ti; closer to 4 years for a 1080 Ti.[/QUOTE]
That said, it seems that the (factor) success rate in the 900.0M - 900.2M range is fine if we use the bit levels reported by Primenet as a basis.
It is a bit unusual, though, that 3911 exponents are available for P-1 in the 900M range. (One possible explanation could be that bogus TF results were submitted in the 900M range some time ago. These results were recognized as bogus and subsequently deleted, but unfortunately the incorrect data made it into the mersenne.ca database.) |
[QUOTE=Anonuser;533180]That said, it seems that the (factor) success rate in the 900.0M - 900.2M range is fine if we use the bit levels reported by Primenet as a basis.
It is a bit unusual, though, that 3911 exponents are available for P-1 in the 900M range. (One possible explanation could be that bogus TF results were submitted in the 900M range some time ago. These results were recognized as bogus and subsequently deleted, but unfortunately the incorrect data made it into the mersenne.ca database.)[/QUOTE]
Primenet has the same number of factored/unfactored as Mersenne.ca; however, Primenet does NOT indicate the TF bit level:
[url]https://www.mersenne.org/primenet/[/url]
[url]https://www.mersenne.ca/status/tf/0/0/3/90000[/url]
Normally Primenet makes assignments available for P-1 when they are factored to the PrimeNet levels listed here: [url]https://www.mersenne.org/various/math.php[/url], which is 80 bits for 576M+. Listing them as available for P-1 suggests PrimeNet believes they are factored to 80 bits.
That said, PrimeNet also shows 2001 available for P-1 in the 912M range, which Mersenne.ca does NOT show as factored to 80 (only 71). |
[QUOTE=petrw1;533182]Listing them as available for P-1 suggests PrimeNet believes they are factored to 80 bits.[/QUOTE]
Only 23 exponents 900M-920M are factored to 80 bits or higher according to primenet: [url]https://www.mersenne.org/report_factoring_effort/?exp_lo=900000000&exp_hi=920000000&bits_lo=80&bits_hi=99[/url] But there are 1365 exponents in the same range factored to between 75 and 79 bits: [url]https://www.mersenne.org/report_factoring_effort/?exp_lo=900000000&exp_hi=920000000&bits_lo=75&bits_hi=79[/url] |
[QUOTE=ATH;533195]Only 23 exponents 900M-920M are factored to 80 bits or higher according to primenet:
[url]https://www.mersenne.org/report_factoring_effort/?exp_lo=900000000&exp_hi=920000000&bits_lo=80&bits_hi=99[/url] But there are 1365 exponents in the same range factored to between 75 and 79 bits: [url]https://www.mersenne.org/report_factoring_effort/?exp_lo=900000000&exp_hi=920000000&bits_lo=75&bits_hi=79[/url][/QUOTE] OK, so it looks like Mersenne.ca has the wrong counts in its table for 900.0 and 900.1. However, it's still a mystery why PrimeNet thinks there are ~2000 exponents ready for P-1 in 900M and 912M. |
[QUOTE=petrw1;533198]However, it's still a mystery why PrimeNet thinks there are ~2000 exponents ready for P-1 in 900M and 912M.[/QUOTE]
Yeah, that must be an error, because there are only 62 exponents in 912M factored to 72+ bits but 10000+ factored to 71+ bits, so no bit level corresponds to ~2000 exponents. |
I thought I'd be fancy and run a bunch of P-1 on my old unverified LL exponents, like [M]82503017[/M]. So far about 20 have completed, at about 350 GHz-days each, with no new factors. Not as fancy as I thought...
|
Given the old and new bounds there, the chance of no factor in 20 tries is about half (see the sketch after this post). Not very unlucky, yet. (Take the difference of the two probabilities from the mersenne.ca calculator; if the old run used Brent-Suyama, approximate its effect by increasing B2 by 10%.)
I might as well comment on the last topic, though the data used for it are entirely obsolete. The server (then and now) is supposed to mark an exponent 'available for P-1' (and not for TF) when it reaches 1 bit _below_ the standard TF level. This derives from the old practice of CPU factoring running P-1 before the last bit of TF. But whether or not such a mechanism ever existed, there is none now that assigns that last bit after P-1 is done, so that 1-bit margin really shouldn't exist anymore. |
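The "about half" figure, as a back-of-envelope sketch; the ~3.4% per-exponent chance of a new factor stands in for the mersenne.ca bounds-difference calculation and is an assumption here.
[CODE]# Chance of 20 consecutive P-1 misses at ~3.4% per exponent.
p = 0.034
print((1 - p) ** 20)  # ~0.50[/CODE]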
I average roughly 10 factors for every 1,000 exponents I attempt when trial factoring to 75 bits.
|
That's not very well defined. As you should know, the probability should be the reciprocal of the bit level; accuracy requires using the average of the low and high ends and correcting for multiple factors, giving 1.32% to the nearest 0.01% for 74-75. 10 per 1,000 (1%) might be a crude 'eyeball estimate', but it's not your actual odds. It is impossible to be persistently 'unlucky' at this, so if you've made e.g. 20,000 attempts and had exactly 200 factors, your gear probably has an issue (>4 sigma below expected; see the sketch below).
|
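A worked version of that ">4 sigma" remark, taking 1/75 as the 74-75-bit success rate:
[CODE]# 200 factors in 20,000 attempts when ~1/75 is expected.
from math import sqrt

n, p = 20_000, 1 / 75
expected = n * p                 # ~266.7
sigma = sqrt(n * p * (1 - p))    # ~16.2, binomial standard deviation
print((expected - 200) / sigma)  # ~4.1 sigma below expectation[/CODE]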
[QUOTE=Andrew Usher;633268]As you should know, the probability should be the reciprocal of the bit level[/QUOTE]That had been a working estimate. The accuracy of that estimate appears to break down somewhere in the 75-bit range; it approaches 1/100. This is based upon large amounts of data across many users.
|
That relationship is [I]extremely[/I] well-grounded mathematically, depending essentially on the prime number theorem. It would require extraordinary data to overturn that belief, such as double-checking a large range with different software and finding nothing. And it certainly should not suddenly change at any given bit level, though if caused by random errors it may appear to - that is the most likely explanation for any persistent shortfall.
|
[QUOTE=Uncwilly;633272]That had been a working estimate. The accuracy of that estimate appears to break down somewhere in the 75-bit range; it approaches 1/100. This is based upon large amounts of data across many users.[/QUOTE]
Do you genuinely see a much smaller number of 75-bit factors compared to, say, 72-bit factors? If so, this would be indicative of a major software bug. |
Chalsall has seen it overall with the various folks using GPU72. IIRC.
[url]https://www.mersenneforum.org/showpost.php?p=529128&postcount=465[/url] |
[QUOTE=Uncwilly;633330]Chalsall has seen it overall with the various folks using GPU72. IIRC.
[url]https://www.mersenneforum.org/showpost.php?p=529128&postcount=465[/url][/QUOTE] Don't see any actual data there, or anything saying the rate of factors dropped. (Disregard the rest of my previous post; of course, if TF were missing factors then P-1 would find them.) |
Chris has mentioned it other times; this is the only one I can find (quickly). It is not a sudden drop, but starting around 75 or so, the 1/x ratio begins to fade, getting closer to 1/100 by around 79. Those who are great data mavens might be able to produce the data to back this up.
|
[QUOTE=charybdis;633335]Don't see any actual data there, or anything saying the rate of factors dropped. (Disregard the rest of my previous post; of course, if TF were missing factors then P-1 would find them.)[/QUOTE]
I do *love* the empirical. Corrects all, when appropriate! This report is *very* expensive, so please don't click it unless you can really use the data. But... [URL="https://www.gpu72.com/reports/factor_percentage/"]this is the query against the GPU72 database with regard to factors found *through GPU72*[/URL]. As in, this is a subset of what Primenet knows about. I actually haven't looked at this report myself for years. And it doesn't cover ranges above 86M; how time flies when you're having fun! 8^) What *I* see in the data is that prior P-1 work indeed has the expected impact on TF success heuristics. So the Theorists win again!!! 9^) |
Hmmm...
That is *really* interesting empirical data.
And, unfortunately, it was written so long ago that it doesn't cover contemporary ranges. We were only going to 75 back then. I don't have time to expand this query. Perhaps over the weekend. Can anyone on the Primenet side of the house build a simple query and expose the data as a CSV et al for those who "Don't Get Out Much"[SUP]TM[/SUP]? |
[QUOTE=chalsall;633356]Can anyone on the Primenet side of the house build a simple query and expose the data as a CSV et al for those who "Don't Get Out Much"[SUP]TM[/SUP]?[/QUOTE]
To be clear, the data point of interest is "what proportion of exponents that have [STRIKE]been fully TF'ed to xx bits[/STRIKE] had TF run at the xx-bit level have a factor with exactly xx bits, [B]found by any method[/B]?" As you mention, if some of the higher TF is done after P-1 has already been run, then it's not surprising that there's an apparent drop in TF success rate at these bit levels. |
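One possible shape for that computation, sketched against a hypothetical CSV export; the file name and the columns tf_bit_tried and factor_bits are invented for illustration, since no such export exists yet.
[CODE]# Hypothetical: per-bit-level TF success rate, counting factors found by ANY
# method against exponents that had TF actually run at that bit level.
import csv
from collections import defaultdict

tried = defaultdict(int)  # exponents with TF run at bit level b
hits = defaultdict(int)   # of those, exponents with a factor of exactly b bits

with open("tf_history.csv") as f:  # invented export; column names are assumptions
    for row in csv.DictReader(f):
        b = int(row["tf_bit_tried"])
        tried[b] += 1
        if row["factor_bits"] and int(row["factor_bits"]) == b:
            hits[b] += 1

for b in sorted(tried):
    print(f"{b} bits: {hits[b] / tried[b]:.2%} observed vs ~{1 / b:.2%} expected")[/CODE]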
[QUOTE=charybdis;633358]To be clear, the data point of interest is "what proportion of exponents that have [STRIKE]been fully TF'ed to xx bits[/STRIKE] had TF run at the xx-bit level have a factor with exactly xx bits, [B]found by any method[/B]?"[/QUOTE]
I think the optimal dataset would be all knowledge that might aid in analysis. CSV, TSV, et al. would be fine; it doesn't need to be JSON or XLS et al. This is just a simple 2D matrix. The datasets will then be run through code doing analysis. Great fun. [QUOTE=charybdis;633358]As you mention, if some of the higher TF is done after P-1 has already been run, then it's not surprising that there's an apparent drop in TF success rate at these bit levels.[/QUOTE] It might be interesting to see whether an optimization is possible if one runs P-1 one bit level down from optimal TF, as /some/ have claimed. (A call out to LaurV...) |
[QUOTE=chalsall;633363]I think the optimal dataset would be all knowledge that might aid in analysis.[/QUOTE]
I note no one has responded to this. Not that I expected anyone to do so. I am comfortable standing alone. |
As the issue of 'unlucky' factoring (i.e. missed factors) is potentially quite important to the project, I felt it necessary to get some data. For an unbiased and large source, I counted all the exponents with factors at the 76-77 bit level in the range 120M-130M. The result falls within statistical expectation, both overall and within each million. The totals, with TF in its current not-quite-complete state, are 2615 exponents counted versus 2584 expected. There is [B]no[/B] shortfall. Chris's (very old) data do not seem to disagree.
Now, if he has no better data, I'd have to say it was irresponsible for Uncwilly to spread such a claim. If there [I]were[/I] such a shortfall, it would indeed be cause for investigation and not something to be simply accepted. Although it is only obvious when actually worked out, the two 'corrections' I mentioned to the probability actually cancel: the chance of at least one factor in the n-bit range is 1/n exactly, and more generally the chance of at least one between B1 and B2 bits (or digits in any other base) is 1 - B1/B2 (spelled out below). Note that the GIMPS 'The Math' page gets it wrong, twice saying the probability between x and x+1 bits is 1/x when it should be 1/(x+1), but the telescoping product shown is correct. |
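Spelled out, assuming each step from x to x+1 bits finds a factor with probability 1/(x+1):

$$1-\prod_{x=B_1}^{B_2-1}\left(1-\frac{1}{x+1}\right)=1-\prod_{x=B_1}^{B_2-1}\frac{x}{x+1}=1-\frac{B_1}{B_2}$$

With B1 = n-1 and B2 = n this gives exactly 1/n for the single n-bit level.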
[QUOTE=Andrew Usher;633430]Chris's (very old) data do not seem to disagree.[/QUOTE]
I agree. The observations demonstrate that the empirical agrees with the Theorists. And, just to be pedantic, it is Chris' very old data... James suffers the same thing. James' work is quite impressive. Some do not get some humor. I enjoy working with those who do. |
On that point of grammar, there is no such rule as you imply. Every source I've ever read (including those I just found with Google) says the [I]Chris's[/I] version is at least acceptable, if not preferred, and agrees that it is more common in actual use (in writing, and surely even more in speech; I can't remember ever hearing the [I]Chris'[/I] type from anyone, and it would certainly sound odd). Some list exceptions, but none relevant here. And I have always preferred it, in writing and speech. I would even say "Achilles's heel" in the literal sense, while the metaphor preserves the older "Achilles' heel" (and might, I say, be as well written without the apostrophe at all).
Anyway, the count I made is definitive and disproves Uncwilly's strange apparent claim that the TF success rate falls off above 75 in general. |
[QUOTE=Andrew Usher;633519]Anyway, the count I made is definitive and disproves Uncwilly's strange apparent claim that the TF success rate falls off above 75 in general.[/QUOTE]It was not strange. It was based on an assertion by someone who had their hands on more data than I did, and it was not made just once by them. It also fit with my much more limited personal experience/data set in the 332M range, and with my factoring luck and that of another user. My found factors per bit level fall off a cliff at the higher levels: I had run over 550 TF assignments in that range with ZERO factors found, over almost 11 months. That, combined with what was previously stated by someone who makes their living with big data sets, caused me to believe their assertion was likely true.
|
[QUOTE=Uncwilly;633532]That, combined with what was previously stated by someone who makes their living with big data sets, caused me to believe their assertion was likely true.[/QUOTE]
My sincere apologies for misguiding you. What I stated at the time I believed to be true. Further, a drill-down on the empirical shows that the theorists were correct. I was incorrect; I was drawing a conclusion from a small and biased dataset. To put it on the table: it is OK to be incorrect, so long as you're willing to admit it once it has been demonstrated to you. This is how we learn. |
OK, I accept your apologies - but now you know to trust the math first, I hope. My time counting those factors wasn't totally wasted, as the question of whether we are missing many factors really does deserve attention. It should be of some concern that all GPU TF relies on one code base that is not actively maintained and probably has no ability to detect hardware errors.
|
[QUOTE=Andrew Usher;633676]OK, I accept your apologies - but now you know to trust the math first, I hope.[/QUOTE]
Oh... My... Word. That is quite possibly the funniest thing I've read in quite some time. Thank you for that. |
[QUOTE=chalsall;633677]Thank you for that.[/QUOTE]
Just accept it. And move on. Ideally we keep personalities (egos) out of this as much as possible. |
[QUOTE=Uncwilly;633678]Just accept it. And move on.[/QUOTE]
Copy. Wilco. |