#1816
"Nathan"
Jul 2008
Maryland, USA
5·223 Posts
Quote:
Change since January 26, 2015 (63 days ago): 0.500 - 0.467 = 0.033. Change since January 1, 2014 (453 days ago): 0.705 - 0.467 = 0.238. Expected time to next prime in the 79.3M range is therefore estimated by:
63 days / 0.033 primes = 1,909 days (from today), or June 20, 2020, OR
453 days / 0.238 primes = 1,903 days (from today), or June 14, 2020.
Seems that our throughput has been remarkably stable over the last fifteen months or so.
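The arithmetic in that estimate is just a linear extrapolation of the observed decrease in the expected-primes-remaining figure; a quick sketch (the function name is illustrative, and "today" of 2015-03-30 is inferred from the dates in the post):

```python
from datetime import date, timedelta

def days_to_next_prime(expected_then: float, expected_now: float,
                       days_elapsed: int) -> float:
    """Extrapolate days until one more expected prime, assuming the
    expected-primes-remaining count keeps falling at the observed rate."""
    rate = (expected_then - expected_now) / days_elapsed  # expected primes/day
    return 1.0 / rate  # days for the count to drop by one

today = date(2015, 3, 30)  # 63 days after January 26, 2015
print(today + timedelta(days=days_to_next_prime(0.500, 0.467, 63)))   # 2020-06-20
print(today + timedelta(days=days_to_next_prime(0.705, 0.467, 453)))  # 2020-06-14
```

Both windows give nearly the same answer, which is what the post means by throughput being stable.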
#1817
6809 > 6502
"""""""""""""""""""
Aug 2003
2×7×19×37 Posts
Quote:
That is based upon the change in P90 years remaining and the rate of change (based upon a floating period of ~8-13 weeks).
#1818
Serpentine Vermin Jar
Jul 2014
3313₁₀ Posts
Quote:
I think George posted this elsewhere but anyway, the basic thing for first-time LL checks boils down to:
Here's the part I didn't really look at, and that's how the "grace percentage" is calculated. It's somewhat straightforward once I stared at it a while:
The way it calculates: it takes the difference between right now and the date the exponent was assigned, then subtracts 365 days from that. For example, exponent 54357769 was assigned "2013-08-21 02:17:38.490", which was 595 days ago, or 230 days once we grant that one-year bonus. Take 230 days and divide by 3.33, and it would expect you to be at least 69.1% done by now; add another 10% to that, for an expectation that after a grand total of 595 days you *should* be at 79.1% complete. That assignment is, in fact, 97.7% done, so yay, it gets a reprieve.

But pretend for a moment that it's still crawling along at mind-numbingly slow speeds. At some point the "grace percentage" actually goes over 100%, so even if the assignment were 99.9% done it would expire. That happens once an exponent is over 665 days old: (665 - 365)/3.33 + 10 = 100.1. When that happens (as of today, that's any assignment from before June 12, 2013), it will expire as long as the exponent is below the critical threshold. In the example of M54357769, that will happen 70 more days from today.

There are currently 14 first-time, grandfathered assignments in that critical range. It's hard to say whether any of them are in particular danger of expiring, since I don't know how fast they're actually progressing. The closest is M58403539, which is just 7.28% ahead of the cut-off.

Grand total, there are 3,294 grandfathered first-time checks. 2,622 of them are above 100,000,000, well ahead of the critical area, and will be safe for some time to come. Most of the other 672 are in the 60-70M range, with only 104 below 60M.

So... that makes me feel better, knowing the slower systems out there can't actually hold things up indefinitely. They will eventually expire.
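The expiration rule quoted above fits in a few lines. The one-year grace, the 3.33 divisor, and the flat 10-point allowance come from the post; the function names and the critical-threshold parameter are illustrative assumptions, not the server's actual code:

```python
def grace_percentage(days_assigned: int) -> float:
    """Minimum % complete expected of a grandfathered first-time LL check:
    one year of grace, then ~0.3%/day (i.e. days/3.33), plus a flat 10 points."""
    return max(days_assigned - 365, 0) / 3.33 + 10.0

def should_expire(days_assigned: int, percent_done: float,
                  exponent: int, critical_threshold: int) -> bool:
    """An assignment expires only if it is below the critical wavefront
    AND has fallen behind the grace percentage."""
    return (exponent < critical_threshold
            and percent_done < grace_percentage(days_assigned))
```

For example, `grace_percentage(595)` gives about 79.1 and `grace_percentage(665)` just tops 100, matching the figures in the post, so M54357769 at 97.7% done survives today but any sub-threshold assignment older than 665 days expires regardless of progress.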
#1819
Undefined
"The unspeakable one"
Jun 2006
My evil lair
1852₁₆ Posts
Quote:
#1820
Serpentine Vermin Jar
Jul 2014
6361₈ Posts
Quote:
Okay: (# of days / 3.33) implies it's expecting 0.3% daily. Anyway, besides that, the rest of the math should be correct. If a grandfathered exponent reaches the ripe old age of 665 days, it will expire no matter what (assuming it's below the "critical" threshold, which has its own algorithm). That's more generous than I probably would be with exponents in that critical range.
#1821
Serpentine Vermin Jar
Jul 2014
3,313 Posts
Incidentally, I'm trying to think of something the server could be doing to not only track the current % done but also gather some kind of velocity. It may have to be fairly basic to keep it simple, like just taking the delta between the last check-in and the current one. Anything fancier that tracks it more thoroughly would give better results, but it would also be a database burden and kind of a chore.
Basically, the database is big enough as-is without trying to keep track of the % done and a timestamp for every time a client updates itself. It would be interesting, though, to have some basic stats, because the current ETA based on the client's "best guess" can be wildly optimistic, to say the least. It's not high on my to-do list, but maybe when I have some spare time I can noodle around with some ideas... if anyone has any thoughts on that, let me know.
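The check-in-delta idea above amounts to two small formulas. A minimal sketch (names are illustrative; this is one way a server might compute it, not how PrimeNet actually does):

```python
from datetime import datetime

def velocity(prev_pct: float, prev_time: datetime,
             curr_pct: float, curr_time: datetime) -> float:
    """Percent of the test completed per day between two consecutive check-ins."""
    days = (curr_time - prev_time).total_seconds() / 86400.0
    return (curr_pct - prev_pct) / days

def eta_days(curr_pct: float, pct_per_day: float) -> float:
    """Days remaining until 100% at the observed rate."""
    return (100.0 - curr_pct) / pct_per_day
```

So a client that moved from 50% to 53% over ten days is doing 0.3%/day, giving a far more grounded ETA than the client's own guess.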
#1822
Undefined
"The unspeakable one"
Jun 2006
My evil lair
2×11×283 Posts
Quote:
Last fiddled with by retina on 2015-04-09 at 02:15
#1823
Jun 2003
3²·5·113 Posts
So only do it for exponents of interest, say the lowest 100 (or earliest 100) active assignments (LL / DC). Dump them into a separate table, and do a linear regression to get more reliable ETAs. Once the assignment is over, clean out the table.
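That regression could be as simple as an ordinary least-squares fit of percent-done against check-in age, extrapolated to 100%. A sketch under assumed inputs (nothing the server actually implements; the sample layout is hypothetical):

```python
def eta_by_regression(samples: list[tuple[float, float]]) -> float:
    """samples: (days_since_assignment, percent_done) pairs from check-ins.
    Fits a line by least squares and returns the estimated day count at
    which the fit reaches 100%."""
    n = len(samples)
    sx = sum(t for t, _ in samples)
    sy = sum(p for _, p in samples)
    sxx = sum(t * t for t, _ in samples)
    sxy = sum(t * p for t, p in samples)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # %/day
    intercept = (sy - slope * sx) / n                   # % at day 0
    return (100.0 - intercept) / slope
```

Using several check-ins instead of just the last two smooths out clients that report irregularly.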
#1824
Serpentine Vermin Jar
Jul 2014
3,313 Posts
Quote:
My best guess at an approach right now would be to set up a whole new table in the DB and run some scheduled job that takes the current dates and progress and stores them. Then the website could examine that data however it wants to get some idea of the real progress being made. To your point, it could indeed be limited to just the work that might show up in whatever milestone reports we're interested in at the time. If that were the case, it could track maybe a couple of weeks' worth of check-ins for a subset of assignments, and it wouldn't be too bad to manage. Well, it's a thought for sure. The first step is collecting that data, which really isn't too hard. Next would be doing something with it.
#1825
Einyen
Dec 2003
Denmark
110001010110₂ Posts
Quote:
http://www.mersenneforum.org/showpos...postcount=1548 I actually meant the 4th-last check-in in that formula, so we get 3 gaps between 4 check-ins. But 3 is just a suggestion; maybe another number of gaps is better.
Last fiddled with by ATH on 2015-04-09 at 10:27
#1826
Serpentine Vermin Jar
Jul 2014
3,313 Posts
Quote:
Whereas first-time checks assume the work will be at least 10% done after the first year, double-checks assume it will be at least 60% done in the first year. After that, first-time checks assume 0.3% progress daily, and double-checks assume 0.333% daily. What it boils down to for double-checks is that an assignment will expire no matter what, even at 99.99%, once the exponent reaches an age of 485 days (~1 year + 4 months, compared to ~1 year + 10 months for first-time checks). Just like first-time grandfathered assignments, this only applies to work in the critical range; right now that means exponents below 34505378. As it turns out, there aren't any grandfathered DC assignments below that anyway. There are just 69 of them right now...
I can say that of the 4 grandfathered assignments in the 35M range, once the critical threshold reaches them, they would all get expired right away. Some of them aren't even really close, like 16-20%. I mean, *maybe* by the time 35040547 is in the critical area it will have moved past the expiration threshold, but I guess we'll see. As of today, that threshold is 78% for that exponent, and it's only 16.4% done. I'm not sure when to expect that one to be in the critical range based on the current progress, but it has some catching up to do.
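Putting the quoted double-check numbers into the same shape as the first-time rule (the 60%-after-one-year figure and the 3-day-per-percent rate come from the post; the function name is illustrative):

```python
def dc_grace_percentage(days_assigned: int) -> float:
    """Minimum % complete expected of a grandfathered double-check:
    60% after the first year, then ~0.333%/day (i.e. days/3)."""
    return 60.0 + max(days_assigned - 365, 0) / 3.0

# At 485 days this reaches 60 + (485 - 365)/3 = 100.0, so a grandfathered
# DC in the critical range expires then no matter how close to done it is.
```

That 78% threshold quoted for M35040547 corresponds to an assignment a bit under 420 days old under this rule.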