mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Data (https://www.mersenneforum.org/forumdisplay.php?f=21)
-   -   Newer milestone thread (https://www.mersenneforum.org/showthread.php?t=13871)

Dubslow 2016-10-24 11:30

I should have mine completed within a week or so. It's a side effect of (though not a direct consequence of) a hardware failure from some months ago; the side effect itself should hopefully be fixed within the next 24 hours.

At any rate I have 22 days before expiration, and although it's holding up the milestone, the delay is minimal. Just give me my allotted time please :smile:

Madpoo 2016-10-25 15:03

[QUOTE=Dubslow;445661]I should have mine completed within a week or so. It's a side effect (not a direct consequence of) a hardware failure from some months ago that hopefully will have said side effect fixed within the next 24 hours.

At any rate I have 22 days before expiration, and although it's holding up the milestone, the delay is minimal. Just give me my allotted time please :smile:[/QUOTE]

If you checked in the current progress it would help keep any potential poachers at bay.

I've been eyeballing those stragglers, and were it not for the fact that I know you from the forums here, I might have concluded that they were abandoned, since the last check-in date was nearly a month ago.

(you may have updated them since I last checked a day or two ago...)

Dubslow 2016-10-25 15:11

The computer wasn't running for the last few weeks. It's been running again since yesterday (ETA ~5 days), and if it had gone longer without running I would have said something/requested an extension.

petrw1 2016-10-28 18:36

GPU to 72
 
[url]http://www.gpu72.com/reports/workers/[/url]

More than 50 Million GhzDays Work Done
More than 20 Million GhzDays Work Saved
- This is pretty close to the total LL work done in the last 365 days.
- Though the work saved can be a mix of P-1, DC and LL.

LaurV 2016-10-28 18:41

Interesting: we are still in position 11 for GHzDays contributed, and 5th for number of factors, even though we haven't done any TF work for more than a year. :w00t:

edit: we will resume soon, in a few weeks, with a new machine we are currently building.

chalsall 2016-10-28 20:51

I'm not quite sure what to do with this...
 
[URL="https://www.youtube.com/watch?v=v0nmHymgM7Y"]I'm ready my lord....[/URL]

proxy2222 2016-11-01 21:09

[QUOTE=srow7;445643][URL="http://www.mersenne.org/assignments?exp_lo=37000000&exp_hi=38000000&execm=1&exp1=1"]www.mersenne.org/assignments?exp_lo=37000000&exp_hi=38000000&execm=1&exp1=1[/URL]
DC 38M milestone being held up by abandoned assignments from
dubslow and kracker
can you guys complete them or unreserve them
thanks[/QUOTE]

DC 38M is done :wacky:

lycorn 2016-11-02 00:36

We have just (31/10) reached 29M exponents factored, out of the 50,847,534 prime exponents below 1B.

[URL="http://www.mersenne.ca/status/tf"]Better provide evidence to keep the Bulls away...[/URL] :whistle:

[URL="http://www.mersenneforum.org/showpost.php?p=446149&postcount=2677"]Although sometimes it might not suffice...[/URL]

57.035% done. :bangheadonwall:

Madpoo 2016-11-03 01:12

[QUOTE=proxy2222;446209]DC 38M is done :wacky:[/QUOTE]

Cool... I missed this yesterday but it looks like George (or James?) made the updates on the milestone page. Thanks! :smile:

cuBerBruce 2016-11-19 18:56

[QUOTE]All exponents below 69,000,000 tested at least once.[/QUOTE]
:tu:

tha 2016-12-16 11:07

On the [URL="http://www.mersenne.org/primenet/"]primenet status page[/URL], in the 'Composite LL-D' column, the remaining number of exponents that have been double-checked but have no known factor is about to drop below 22,000 per 1-million range.

The amount of factoring done in the range from 0 up to the lowest exponent not yet double-checked (currently about 38M) has been pretty impressive. Additional trial factoring, P-1 factoring and ECM are substantially decreasing the number of exponents without known factors.

NBtarheel_33 2016-12-17 09:58

[QUOTE=petrw1;445923][URL]http://www.gpu72.com/reports/workers/[/URL]

More than 50 Million GhzDays Work Done
More than 20 Million GhzDays Work Saved
- This is pretty close to the total LL done in the last 365 days.
- Though saved can be P-1, DC and LL[/QUOTE]

The idea being, of course, that the 50 million GHz-days did not really cost 50 million GHz-days, due to the speed differential between GPUs and CPUs. IIRC, the formula to compare apples to apples, so to speak, was to effectively divide the number of GPU GHz-days by 100/3 = ~33.3 to get an idea of the equivalent CPU GHz-days.

Thus GPU72 has saved more than 20 million GHz-days of work at an equivalent cost of roughly 50/33.3 = 1.5 million GHz-days. Now *that* is an impressive return on investment!
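The estimate above can be checked with a quick back-of-the-envelope sketch (the 100/3 conversion ratio is the recollection quoted here, not an official figure):

```python
# Rough ROI check for GPU72's trial-factoring work, using the
# figures and the assumed GPU-to-CPU ratio from this thread.

GPU_GHZ_DAYS_DONE = 50e6     # TF work performed on GPUs
GHZ_DAYS_SAVED = 20e6        # LL/DC/P-1 work avoided thanks to found factors
GPU_TO_CPU_RATIO = 100 / 3   # ~33.3 GPU GHz-days per CPU-equivalent GHz-day (assumed)

cpu_equivalent_cost = GPU_GHZ_DAYS_DONE / GPU_TO_CPU_RATIO
print(f"Equivalent CPU cost: {cpu_equivalent_cost / 1e6:.1f}M GHz-days")
print(f"Return on investment: {GHZ_DAYS_SAVED / cpu_equivalent_cost:.1f}x")
```

With those inputs the equivalent cost comes out to 1.5 million GHz-days, i.e. a better than 13x return.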

axn 2016-12-17 13:11

[QUOTE=NBtarheel_33;449339]IIRC, the formula to compare apples to apples, so to speak, was to effectively divide the number of GPU GHz-days by 100/3 = ~33.3 to get an idea of the equivalent CPU GHz-days.

Thus GPU72 has saved more than 20 million GHz-days of work at an equivalent cost of roughly 50/33.3 = 1.5 million GHz-days. Now *that* is an impressive return on investment![/QUOTE]

Looking at GPU productivity figures for TF vs LL, the top-of-the-line GPUs show a ratio of TF Gd/d to LL Gd/d of somewhere between 13 and 18.5, so an average ratio of 16 might be more accurate. That would still mean only 50/16 ~= 3 million expended for a gain of 20 million - impressive, as you say. However, this might be skewed if there are lots of shallow-but-wide searches, as those inflate the ROI.

srow7 2016-12-17 20:23

70M milestone
 
[url]www.mersenne.org/assignments?exp_lo=67000000&exp_hi=70000000&execm=1&exdchk=1[/url]

70M milestone being held up by 2 abandoned assignments.
Are we really going to wait 26 days for them to expire?

Mark Rose 2016-12-17 23:17

[QUOTE=srow7;449382][url]www.mersenne.org/assignments?exp_lo=67000000&exp_hi=70000000&execm=1&exdchk=1[/url]

70M milestone being held up by 2 abandoned assignments.
Are we really going to wait 26 days for it to expire ?[/QUOTE]

No doubt Anonymous will poach them; a few forumers have already done checks, and they are not prime.

rudy235 2016-12-18 00:29

Why do they have to be poached? It should have become apparent a few weeks ago that they were in the high nineties, percentage-wise, and going nowhere.

When an assignment has had a negative ETA for more than a full week, it should be recycled.

IMHO

retina 2016-12-18 02:30

[QUOTE=srow7;449382]Are we really going to wait 26 days for it to expire ?[/QUOTE]Sure, why not? Why so impatient?[QUOTE=rudy235;449394]Why do they have to be poached? It should have become apparent a few weeks ago that they were in the high nineties of advance and going nowhere.

When there is a negative ETA for more than a full week they should be recycled.[/QUOTE]Then primenet breaks its promise of time granted to complete. If you make a promise then keep it, else you look like an a:censored:e. We have no idea of the status of the systems running those numbers.

proxy2222 2016-12-18 15:32

[QUOTE=srow7;449382][URL="http://www.mersenne.org/assignments?exp_lo=67000000&exp_hi=70000000&execm=1&exdchk=1"]www.mersenne.org/assignments?exp_lo=67000000&exp_hi=70000000&execm=1&exdchk=1[/URL]

70M milestone being held up by 2 abandoned assignments.
Are we really going to wait 26 days for it to expire ?[/QUOTE]

The "Expires (days)" column is incorrect. They will expire next Tuesday (2016-12-20) because they have not reported in for 30 days, and they will get reassigned to somebody else.

rudy235 2016-12-18 18:13

[QUOTE=proxy2222;449435]The "Expires (days)" column is incorrect. They will expire next tuesday (2016-12-20) because they have not reported in for 30 days and will get reassigned to somebody else.[/QUOTE]

Thanks!
I think that promises should be kept. I do feel however that there is an implied promise on the part of those who take an assignment. That one needs to be kept too.

petrw1 2016-12-19 15:20

[QUOTE=lycorn;445257]Another milestone I'm looking forward to is "all numbers with exponents less than 1B trial factored to at least 63 bits". We are getting there, slowly but surely.[/QUOTE]

And it's done....as of December 19

petrw1 2016-12-19 21:14

[QUOTE=proxy2222;449435]The "Expires (days)" column is incorrect. They will expire next tuesday (2016-12-20) because they have not reported in for 30 days and will get reassigned to somebody else.[/QUOTE]

From the Assignments Rules page:

[QUOTE]Must be completed in 30 days
Assignments are recycled if the assignment is not started within 7 days or when the assignment is more than 30 days old.[/QUOTE]

These are both over 60 days old.

proxy2222 2016-12-19 21:55

[QUOTE=petrw1;449556]From the Assignments Rules page:

These are both over 60 days old.[/QUOTE]

Yes, but the exponent category limits change daily, so you have to look at the assignment rules for the date the exponents were assigned. Have a look at:

[URL="http://www.mersenne.org/M69827357"]M69827357[/URL] [URL="http://www.mersenne.org/thresholds/?dt=2016-10-05"]Assignment rules 2016-10-05[/URL]
[URL="http://www.mersenne.org/M69946859"]M69946859 [/URL][URL="http://www.mersenne.org/thresholds/?dt=2016-10-14"]Assignment rules 2016-10-14[/URL]

You will notice that they were both Cat 1 in October and, at that time, had 90 days to complete.

Madpoo 2016-12-19 23:16

[QUOTE=proxy2222;449559]Yes but the exponent category limits changes daily so you have to look at the assignment rules for the date the exponents were assigned. So have a look at:

[URL="http://www.mersenne.org/M69827357"]M69827357[/URL] [URL="http://www.mersenne.org/thresholds/?dt=2016-10-05"]Assignment rules 2016-10-05[/URL]
[URL="http://www.mersenne.org/M69946859"]M69946859 [/URL][URL="http://www.mersenne.org/thresholds/?dt=2016-10-14"]Assignment rules 2016-10-14[/URL]

You will notice that they were both Cat 1 in october and - at that time - had 90 days to complete.[/QUOTE]

Yeah, as far as I know, the "expires in" column on that report should have the correct #. It hasn't been wrong yet.

I looked at the actual expiration process (and also the rules spelled out on the thresholds page) to reverse engineer and figure out when an exponent would expire, based on the current status (whether it had started or not).

As you point out, it also depends on what category it was when first assigned, to a degree.

A negative "time to complete" just means it hasn't updated progress since the last check-in, and its previous ETA has come and gone.

I'd been eyeing those two slowpokes, thinking about poaching them myself, but, meh... someone else might anyway.

Plus, if they are allowed to expire, it means those machines will be blocked from getting any future low category assignments for a while. If someone poaches them, they convert to double-checks and haven't technically expired.

Of course their lack of progress means they wouldn't get low category work anyway, but I thought I'd mention that aspect of it.

proxy2222 2016-12-21 09:10

[QUOTE=Madpoo;449566]Yeah, as far as I know, the "expires in" column on that report should have the correct #. It hasn't been wrong yet.

I looked at the actual expiration process (and also the rules spelled out on the thresholds page) to reverse engineer and figure out when an exponent would expire, based on the current status (whether it had started or not).
[/QUOTE]

OK, then please explain to me why they expired yesterday while they still showed 15 and 24 days before expiration :smile:

I still believe it is a bug, because yesterday it should have displayed 0 days instead of 15 and 24 days.

I find it more logical to display the days to normal expiration, or, if it has not reported in for more than 2 days, the number of days left before expiration due to not reporting in. In other words, display min(daysToNormalExpiry, daysToExpiryBecauseNotReportedIn).

Something like this (I'm leaving out the "not started within xx days" rule for simplicity):

[CODE]
const int DAYS_BEFORE_EXPIRY = 90;
const int DAYS_TO_REPORT_IN  = 30;

// Example: 2016-12-20 - 2016-10-24 = 58 days -> 90 - 58 = 32 days
var daysToNormalExpiry = DAYS_BEFORE_EXPIRY - (DateTime.Now - beginDate).TotalDays;

var daysToExpiryBecauseNotReportedIn = daysToNormalExpiry;
// Only start the not-reporting-in countdown if the user has not reported in for at least 2 days.
if ((DateTime.Now - dateLastUpdate).TotalDays > 2)
{
    // Example: 2016-12-20 - 2016-11-20 = 30 days -> 30 - 30 = 0 days
    daysToExpiryBecauseNotReportedIn = DAYS_TO_REPORT_IN - (DateTime.Now - dateLastUpdate).TotalDays;
}

return Math.Min(daysToNormalExpiry, daysToExpiryBecauseNotReportedIn);[/CODE]

I don't know how everything works, but if I understand correctly, the calculation on the web page and the calculation in the actual expiration process are not the same. Why not:

1) Add an "expirationdate" field to the exponents table
2) Write a batch program that updates the expirationdate field once a day
3) Perform the actual expiration based on the expirationdate field (select * from exponents where expirationdate <= getdate())
4) Display the expirationdate field in the "Expires (days)" column

This way, the business logic is in one place, and it will improve the performance of your database server, as I suppose the current calculation is done in stored procedures.
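That scheme might look something like this in T-SQL (the table and column names here are hypothetical, since the actual Primenet schema is unknown; note that the expiry job wants rows whose precomputed date has already passed):

```sql
-- One-time schema change: persist the computed expiration date.
ALTER TABLE exponents ADD expirationdate DATETIME NULL;

-- Daily batch: recompute every assignment's expiration in one set-based pass,
-- taking the earlier of the age-based and the not-reporting-in deadlines.
UPDATE exponents
SET expirationdate = (SELECT MIN(d) FROM (VALUES
        (DATEADD(day, 90, assigned_date)),   -- normal Cat 1 limit
        (DATEADD(day, 30, last_update))      -- must report in every 30 days
    ) AS deadlines(d));

-- The expiry job and the "Expires (days)" column then read the same value.
SELECT * FROM exponents WHERE expirationdate <= GETDATE();
```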

lycorn 2016-12-21 15:41

[QUOTE=petrw1;449529]And it's done....as of December 19[/QUOTE]

Yep. I noticed the long queue of numbers that extends to the left in the first line of [URL="http://www.mersenne.ca/status/"]This Report[/URL] has decreased by one unit of length :smile:.
Well done! Only 1125 left to clear the 63-bit level... :whistle:

petrw1 2016-12-21 17:19

[QUOTE=lycorn;449675]Yep. I noticed the long queue of numbers that extends to the left in the first line of [URL="http://www.mersenne.ca/status/"]This Report[/URL] has decreased by one unit of length :smile:.
Well done! Only 1125 left to clear the 63-bit level... :whistle:[/QUOTE]

Your turn?

:smile:

Madpoo 2016-12-21 19:03

[QUOTE=proxy2222;449659]Ok, then please explain me why they expired yesterday while they still showed 15 and 24 days before expiration :smile:[/QUOTE]

Ah crud. I guess my figurin' needs some figurin'.

[QUOTE]I find it more logically to display the days to normal expiration or if it has not reported in for more than 2 days...[/QUOTE]

I think what I'm missing is that at some point since I added the prediction for "will expire in" info, the rules changed some (thresholds were changed a bit, etc). I think I missed the part about expiring if they haven't reported in for XX days if the assignment is > YY days old. I need to add that part in, otherwise I think I'm good.

The deal is, the expiration of assignments is a job that runs and looks at things as they exist at that moment. There's nothing in place that predicts when assignments will expire. And although that is deterministic based on current state, I still had to figure it out and work it into a SQL function.

Anyway... maybe something to work on over my Christmas vacation. And now the other two of the 4 remaining exponents have stalled out... hopefully the person running both of them will still report in... they are SO CLOSE to being done.

lycorn 2016-12-21 19:22

[QUOTE=petrw1;449696]Your turn?

:smile:[/QUOTE]

Unfortunately, I don't think I'll have the means to join any TF effort in the near future...:sad:
I will do my best in the ECM front, but even that won't be much.

Madpoo 2016-12-21 20:03

[QUOTE=Madpoo;449707]I think what I'm missing is that at some point since I added the prediction for "will expire in" info, the rules changed some (thresholds were changed a bit, etc). I think I missed the part about expiring if they haven't reported in for XX days if the assignment is > YY days old. I need to add that part in, otherwise I think I'm good.[/QUOTE]

I'm having some trouble figuring out exactly how I should go about adding a predicted expiration time to account for machines that haven't reported in.

Most machines will update their work daily... considering a first-time Cat 1 assignment has 90 days to finish, I can't just "predict" that the expiration will always be 29 days out, since that's 30 - (days since last update, which is usually 1).

My best guess right now is that at some point, maybe after they haven't checked in for 3 weeks, then I could start an expiration countdown based on that fact alone?

Right now for instance, the days-to-expire for new assignments that haven't started yet have a much lower # since they only get XX days before they *must* start some work. But once they do start, the days-to-expire is extended out to reflect their new status.

This is kind of a reverse situation where the days-to-expire will be longer, and then at some point, when they haven't checked in for a while, all of a sudden the days-to-expire would plummet to however many days.

It's a pickle... I won't do anything for now and will wait for feedback. Dunno if there's some option I haven't thought of. I don't want to have two different days-to-expire columns on that report, have to add something explaining each, and have to totally muck with the function to return an array of results or something... it has to perform well and look nice. :smile:

If I had to pick right now, I'd just pick an arbitrary value, like they haven't reported in for 20 days or whatever, and *then* start counting down using that clause. :judge:
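That countdown-after-silence idea can be sketched as follows (Python for brevity; the 90/30-day limits and the 20-day silence threshold are just the figures being floated in this thread, not Primenet's actual rules, and the function name is my own):

```python
from datetime import date

def days_to_expire(assigned, last_update, today,
                   cat_days=90, silence_limit=30, silence_threshold=20):
    """Predict days until an assignment expires.

    Combines the normal age-based expiry with the "hasn't reported in"
    rule, which only kicks in after silence_threshold days of silence.
    """
    normal = cat_days - (today - assigned).days
    silent_for = (today - last_update).days
    if silent_for >= silence_threshold:
        # Long silence: start the not-reporting-in countdown too,
        # and show whichever deadline comes first.
        return min(normal, silence_limit - silent_for)
    return normal

# A machine assigned 58 days ago that last checked in 25 days ago:
print(days_to_expire(date(2016, 10, 24), date(2016, 11, 26), date(2016, 12, 21)))  # prints 5
```

A machine that checked in yesterday never triggers the silence clause, so it shows its normal age-based expiry instead of a misleading "29 days".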

proxy2222 2016-12-21 20:53

[QUOTE=Madpoo;449713]

....

Right now for instance, the days-to-expire for new assignments that haven't started yet have a much lower # since they only get XX days before they *must* start some work. But once they do start, the days-to-expire is extended out to reflect their new status.

This is kind of a reverse situation where the days-to-expire will be longer, and then at some point, when they haven't checked in for a while, all of a sudden the days-to-expire would plummet to however many days.

It's a pickle... I won't do anything for now and wait for feedback. Dunno if there's some option I haven't thought of. I don't want to have two different days-to-expire on that report and have to add something to explain each and have to totally muck with the function to return an array of results or something... it has to perform well and look nice. :smile:

If I had to pick right now, I'd just pick an arbitrary value, like they haven't reported in for 20 days or whatever, and *then* start counting down using that clause. :judge:[/QUOTE]

Two different dates would be very confusing. I'd say start the countdown halfway, at 15 days. The user will be notified earlier and will still have another 15 days to report in.

As you already pointed out, the consequence will be that the number of days can skyrocket (when they report in again) or plummet (not reporting in) but this is already the case when a new exponent is reporting in for the first time.

Also, do not forget to always check the normal expiration date. If the exponent is 80 days old (Cat 1) and the user has not reported in for 15 days, the number of days shown should be 10 and decreasing, regardless of whether he reports back in or not.

chalsall 2016-12-21 20:58

[QUOTE=Madpoo;449713]I'm having some trouble figuring out exactly how I should go about adding a predicted expiration time to account for machines that haven't reported in.[/QUOTE]

What about a simple min() function across all the clauses?

[URL="http://www.mersenne.org/assignments/?exp_lo=70247057&exp_hi=&execm=1&exdchk=1&exp1=1&extf=1"]70247057 is a good example.[/URL] It was probably a Cat 1 when assigned -- thus it has to report in at least once every 30 days.

I understand that this is a "front end vs. back end" issue. The right hand doesn't know what the left hand is doing. But it confuses the users.

Madpoo 2016-12-22 01:23

[QUOTE=chalsall;449720]What about a simple min() function across all the clauses?[/QUOTE]

Maybe... right now it's a CASE/WHEN and it's pretty straightforward.

To use min() I'd need to reorganize it, plus it wouldn't perform as well because it would have to figure out the expiration in *all* cases instead of doing them in order with a fall-through condition.

Right now I can't just add another when/then to the CASE because the expiration for "hasn't checked in for XX days" is going to be different than the expiration for other reasons.

Not insurmountable, I just need to re-think how it's being done and make sure however it's done is still performant. In the end, I might just have to do as you say, figure out the expiration for each case and pick whichever one would be first. In the case of the "hasn't reported in for xx days" I'd also make sure it only returns a value at all if it's been more than 15-20 days or whatever, otherwise something that checked in the day before will say "29 days to expire" based on that clause alone. :smile:

Actually it may be easier to use my existing code and then also figure out the expiration based on "last update if it's been >= 20 days" and figure out which one is smaller just between those two things.

Ugh... it'd be ugly, but it should still work fast enough to show days-to-expire for lists of many assignments at once. It's one of those things where you think "it's fast enough", but when you iterate it hundreds of times you realize just how bad it is. :smile:

axn 2016-12-22 03:24

[QUOTE=Madpoo;449728]In the case of the "hasn't reported in for xx days" I'd also make sure it only returns a value at all if it's been more than 15-20 days or whatever, otherwise something that checked in the day before will say "29 days to expire" based on that clause alone. :smile:[/QUOTE]

I'd suggest something smaller like 10 days, which would be a good indicator that something has f'ed up, and therefore we should start the count down.

Madpoo 2016-12-22 04:03

[QUOTE=axn;449733]I'd suggest something smaller like 10 days, which would be a good indicator that something has f'ed up, and therefore we should start the count down.[/QUOTE]

Meh... I mocked it up and went with 20 days as the threshold for using that expiration (if lower than any other clause).

You can see it in action here:
[URL="http://www.mersenne.org/assignments/?exp_lo=70312159"]Old version[/URL]
[URL="http://www.mersenne.org/assignments/default.mock.php?exp_lo=70312159"]New version[/URL]

chalsall 2016-12-22 17:51

[QUOTE=Madpoo;449736]You can see it in action here:
[URL="http://www.mersenne.org/assignments/?exp_lo=70312159"]Old version[/URL]
[URL="http://www.mersenne.org/assignments/default.mock.php?exp_lo=70312159"]New version[/URL][/QUOTE]

Looks good.

And not trying to tell you how to chew gum, but this is a perfect situation for caching calculations / knowledge. To the best of my understanding Primenet's back-end only does these calculations once per day (somewhere around midnight UTC) during the expiry process.

Leverage on that to update a table which your front-end can use. Separately, when a client reports in update the related records, or mark the records as "dirty" and update them using a low priority process.

Thus your SELECT statement would only be a min() across previously calculated values. Should scale well for large queries.

petrw1 2016-12-22 20:31

[QUOTE=Madpoo;449566]
I'd been eyeing those two slowpokes, thinking about poaching them myself, but, meh... someone else might anyway.
[/QUOTE]

Could a requirement for getting Cat 0/1 assignments be that you need to provide an (assumed valid) e-mail address, so that when they stop reporting (especially so close to the end) they could be NUDGED?

No response within a reasonably long time frame might be a hint.

But more usefully, you might get a response that says "Oops, sorry. I moved. PC died. I quit." etc. Then you would know whether they are truly abandoned.

chalsall 2016-12-23 00:11

[QUOTE=petrw1;449761]But more accurately you might get a response that says "OOPS Sorry, I moved. PC Died. I quit.,, etc." Then you would know if they are truly abandoned.[/QUOTE]

Never send a human to do a machine's job.

Madpoo 2016-12-23 15:58

[QUOTE=chalsall;449754]Looks good.

And not trying to tell you how to chew gum, but this is a perfect situation for caching calculations / knowledge. To the best of my understanding Primenet's back-end only does these calculations once per day (somewhere around midnight UTC) during the expiry process.

Leverage on that to update a table which your front-end can use. Separately, when a client reports in update the related records, or mark the records as "dirty" and update them using a low priority process.

Thus your SELECT statement would only be a min() across previously calculated values. Should scale well for large queries.[/QUOTE]

Personally, I like having the real-time "days to expire" especially for the case when a machine checked in for the first time after getting the assignment.

There's no nightly task that stores the expire-time for each assignment; the job just looks at each one and makes the decision right there: "keep or toss", a binary decision.

Pre-computing the stats could be done if it needed to be, and maybe something more often than daily, but we'll see how it goes. I just timed the current function with my new one that includes the extra step of checking the "last checked in" clause, and for 1000 rows it takes an extra 2 seconds. Oh, 2 seconds you say... that's not bad. Well, it's the difference between 5 seconds and 7 seconds total, so... 40% slower. It also made me think my original function with CASE statements is still slow. (EDIT: It takes 1 second to return that many rows without including any expiry info at all).

In fact, using a function over a lot of rows has apparently always been a performance issue, at least in MSSQL. I didn't get all the technical details, but something about having poor execution plans in that case, or something.

Well, I may take an extra gander at the whole thing and see if I can get this running any faster; otherwise I may go with plan B of precomputing the whole thing in a sproc, where I can use inline T-SQL instead of a function for faster runs over a large number of rows.

chalsall 2016-12-23 19:35

[QUOTE=Madpoo;449802]There's no nightly task that stores the expire-time for each assignment, it just looks at each one and makes the decision right there "keep or toss", just a binary decision.[/QUOTE]

A perfect opportunity to cache knowledge. An INSERT on duplicate UPDATE statement once a day into a table is probably less expensive than all the queries asking about the lowest "n" candidates throughout the day.

And I hear you about wanting "real-time" results. This could be achieved by updating the same table's related records and fields every time a machine checks in.

Entirely up to you of course. :smile:

Madpoo 2016-12-26 20:19

Those slow-moving exponents in the 69M range (that will expire imminently)... I just looked back at their full histories (both by the same user).

It's weird... sometimes that user will check in a status showing 4.5% increase from the previous day, but then it could go weeks (checking in daily) with zero progress. Then it'll update and show a marginal (0.1 %) increase from the day before, then weeks again of nothing, and another 4.5% jump out of nowhere.

Very strange. It must not be running full time, or there are some other CPU-hungry things running (a LOT) that keep Prime95 from doing any work, while it's still running enough to check in and say "nothing changed".

Anyway, as of right now, those two from that user (69110411 and 69110441) are [I]saying[/I] they'll be done in 0-1 days, and they'll expire in 0-3 days. But when I look at their average daily progress, they're projected to finish in 10.6 and 1.8 days (respectively).

In other words, both of them will expire before they're projected to finish, but as I mentioned, their progress has been so sporadic that their average daily rate is misleading when it comes to the final stretch. They might or might not have a burst of energy and finish in time.

It's sometimes strange to peek at the daily progress and see things like that. I don't know what's happening that it's still checking in daily for weeks (sometimes months!) at a time without the needle moving a single blip, but it happens, followed by odd bursts of "getting stuff done".

rudy235 2016-12-26 22:08

[QUOTE=Madpoo;449957]Those slow-moving exponents in the 69M range (that will expire imminently)... I just looked back at their full histories (both by the same user).

It's weird... sometimes that user will check in a status showing 4.5% increase from the previous day, but then it could go weeks (checking in daily) with zero progress. Then it'll update and show a marginal (0.1 %) increase from the day before, then weeks again of nothing, and another 4.5 %...

[stuff deleted]

It's sometimes strange to peek at the daily progress and see things like that. I don't know what's happening that it's still checking in daily for weeks (sometimes months!) at a time without the needle moving a single blip, but it happens, followed by odd bursts of "getting stuff done".[/QUOTE]

It seems clear to me that if "0 days to expire" means before 00:00 on 2016-12-27, then 69110441 will expire. The other, 69110411, might finish in time, but that does not seem probable. It is indeed a pity that they fail to produce while being so close to 100%, but hopefully they might qualify as the 2nd LL verification.

proxy2222 2016-12-26 23:51

[QUOTE=rudy235;449960]Seems clear to me that if by 0 days to expire it means before 00 hours 2016 12 27 that 691110441 will expire. The other 691110411 might finish in time but does not seem probable. It is indeed a pity that being so close to 100% they fail to produce, but hopefully they might qualify for the 2nd LL verification.[/QUOTE]

FYI: I'm poaching 69110411, ETA: 35 hr. 69110441 just expired so if it gets assigned to a fast machine we should reach 70M milestone before 2017 :-)

retina 2016-12-27 00:10

[QUOTE=proxy2222;449963]FYI: I'm poaching 69110411 ...[/QUOTE]So impatient.

rudy235 2016-12-27 02:42

[QUOTE=retina;449964]So impatient.[/QUOTE]

...let me remind you also that moderation in the pursuit of justice is no virtue.:smile::smile:

Barry Goldwater.

ATH is taking care of 69110411

proxy2222 2016-12-29 09:13

70M done
 
[QUOTE=proxy2222;449963]FYI: I'm poaching 69110411, ETA: 35 hr. 69110441 just expired so if it gets assigned to a fast machine we should reach 70M milestone before 2017 :-)[/QUOTE]

70M is done! :banana:

NBtarheel_33 2016-12-29 11:03

[QUOTE=proxy2222;450088]70M is done! :banana:[/QUOTE]

A mere 369 days after reaching the milestone of first-time testing all exponents below 60M, by far the fastest time between "decennial" milestones.

The new assignment rules are working extremely well, I'd say. :smile:

NBtarheel_33 2016-12-29 11:08

Suggest that we add to the milestones report a countdown to completing the first objective of "classical" GIMPS (per the [URL=http://www.mersenne.org/report_classic]"colorful stats report"[/URL]): first-time testing all exponents below 79,300,000.

Madpoo 2016-12-29 23:31

[QUOTE=proxy2222;450088]70M is done![/QUOTE]

Milestone page updated.

petrw1 2016-12-30 22:51

[QUOTE=NBtarheel_33;450094]Suggest that we add to the milestones report a countdown to completing the first objective of "classical" GIMPS (per the [URL=http://www.mersenne.org/report_classic]"colorful stats report"[/URL]): first-time testing all exponents below 79,300,000.[/QUOTE]

[CODE]
Oct 25, 1999

Year  Pr.#   Factored   Two LL   One LL    Unknown  Expect    P-90 Yrs
1999    38  1,622,684  102,184   95,493  2,810,514    6.08  43,144,037

Today

2016    49  3,004,289  910,033  641,519     75,033    0.16   2,599,832
[/CODE]

rudy235 2017-02-03 23:52

Number 70284937
 
Hello: I see that exponent 70284937 has expired ([URL="https://www.mersenne.org/report_exponent/?exp_lo=70284937&full=1"]https://www.mersenne.org/report_exponent/?exp_lo=70284937&full=1[/URL]) even though it was already more than 90% complete. This exponent took 90 days to get to 90%.

If that work is not kept as a backup (I am assuming that work on the number completely ends when it expires), then the 90% already done, which could [I]probably[/I] be finished in 5 to 10 more days, is lost.

So, if a new user is assigned and, let's assume, gets it done in about 6 days, why is the expired work not continued, which would give us an instant double check?
If I am stating things correctly, it would seem that the 90% of work already done will go to waste.

Would somebody elaborate?

ric 2017-02-04 11:30

[QUOTE=rudy235;452233]So a new user is assigned and also assuming it gets it done in about 6 days why is not the expired work continued and then we have an instant double check?[/QUOTE]

That's the norm. You might have noticed that, despite being expired on Feb 3rd, the original assignee updated his progress on Feb 4th. Whoever comes in first scores the LL result; the second puts in a DC, so no effort is wasted, unless either one quits testing.

This has happened to me a few times, doing cat0/cat1 exponents: sometimes I went first, sometimes not.

And the juicier point "what if a number turns out to be prime" has already been addressed in some former discussion around here (IIRC, whoever comes first).

rudy235 2017-02-04 13:22

[QUOTE=ric;452251]You might have noticed that - despite being expired on Feb the 3rd, the original assignee updated his progress on Feb 4th. [/QUOTE]

Actually (thanks for bringing that to my attention!), I had not noticed that. So we're now at 96.6% completion, and ideally it will get done soon enough.

proxy2222 2017-03-04 09:13

First milestone of 2017
 
All exponents below 39,000,000 double-checked.:max:

Madpoo 2017-03-05 23:08

[QUOTE=proxy2222;454227]All exponents below 39,000,000 double-checked.:max:[/QUOTE]

Ah, I just checked. I figured it was happening this weekend but forgot to look yesterday.

Milestone page is now updated.

rudy235 2017-03-15 18:08

Only one more to go for the 71M milestone
 
[SIZE="3"]However, that exponent [URL="https://www.mersenne.org/report_exponent/?exp_lo=70723879&full=1"]70723879[/URL] has not advanced much in its first week.

It might take between 16 and 17 days to finish. We'll have to wait and see if he picks up the pace.[/SIZE]

ATH 2017-03-15 20:42

Yeah, I tried to grab that one when it expired a week ago, but I missed it and that guy got it. Otherwise it would have been done by now.

GAPa 2017-03-15 22:04

[QUOTE=rudy235;454934][SIZE=3]However, that exponent [URL="https://www.mersenne.org/report_exponent/?exp_lo=70723879&full=1"]70723879[/URL] has not advanced much in its first week.

It might take between 16 and 17 days to finish. We'll have to wait and see if he picks up the pace.[/SIZE][/QUOTE]I am the one who has received that exponent, but I'm not quite sure why it happened. I must have received it as a category 0 exponent, but if I understand [URL]https://www.mersenne.org/thresholds/[/URL] correctly, I only ought to receive category 1 exponents (except for DC tasks). There shouldn't be anything in my previous results indicating that I would complete this exponent in 15 days.

ATH 2017-03-15 22:20

[QUOTE=GAPa;454950]I am the one who has received that exponent, but I'm not quite sure why it happened. I must have received it as a category 0 exponent, but if I understand [URL]https://www.mersenne.org/thresholds/[/URL] correctly, I only ought to receive category 1 exponents (except for DC tasks). There shouldn't be anything in my previous results indicating that I would complete this exponent in 15 days.[/QUOTE]

You have 30 days for a category 0, so you have 23 days more, and since you did 30% in 7 days it should be fine:

[url]https://www.mersenne.org/assignments/?exp_lo=70723879&exp_hi=70723879[/url]

[url]https://www.mersenne.org/thresholds/[/url]
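
A quick sketch of that arithmetic, assuming the first week's rate (30% in 7 days) stays constant:

```python
# Linear projection for M70723879's remaining run time (illustrative only).
done, elapsed, window = 0.30, 7, 30   # 30% done in 7 days; 30-day cat-0 window

rate = done / elapsed                 # fraction completed per day, ~0.043
days_left = (1 - done) / rate         # ~16.3 more days needed
finishes_in_time = elapsed + days_left < window   # ~23.3 total days, inside 30
```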



@Madpoo: Is the assignment page using the updates from default.mock.php now?

[url]https://www.mersenne.org/assignments/[/url]
[url]https://www.mersenne.org/assignments/default.mock.php[/url]

GAPa 2017-03-15 22:23

[QUOTE=ATH;454951]You have 30 days for a category 0, so you have 23 days more, and since you did 30% in 7 days it should be fine:

[URL]https://www.mersenne.org/assignments/?exp_lo=70723879&exp_hi=70723879[/URL]

[URL]https://www.mersenne.org/thresholds/[/URL][/QUOTE]Yes, I will be able to finish it before it expires, but I don't understand why I received it in the first place, since I don't think I fulfil the requirement 'Computer must have enough LL and DC GHz-days over the last 120 days to indicate the assignment will be completed in 15 days'.

chalsall 2017-03-15 22:35

[QUOTE=GAPa;454952]Yes, I will be able to finish it before it expires, but I don't understand why I received it in the first place, since I don't think I fulfil the requirement 'Computer must have enough LL and DC GHz-days over the last 120 days to indicate the assignment will be completed in 15 days'.[/QUOTE]

Maybe there's a bug in the Primenet code. This wouldn't be the first time.

Just finish the assignment as soon as you can. Meanwhile, the "back room guys" will try to figure out the mistake they made, and will hopefully learn from it.

Thankfully no lives were at risk in this case; some software has to be "Human Rated".

Prime95 2017-03-15 23:07

[QUOTE=GAPa;454952]Yes, I will be able to finish it before it expires, but I don't understand why I received it in the first place, since I don't think I fulfil the requirement 'Computer must have enough LL and DC GHz-days over the last 120 days to indicate the assignment will be completed in 15 days'.[/QUOTE]

The server thinks the computer has produced 24 LL results averaging 30 GHz-days/day over the last 120 days.

The only anomaly I see is that the server thinks there is only one worker window, but I'm seeing LL results reported about a day apart.

chalsall 2017-03-15 23:41

[QUOTE=Prime95;454957]The only anomaly I see is that the server thinks there is only one worker window, but I'm seeing LL results reported about a day apart.[/QUOTE]

And there is your bug. :smile:

GAPa 2017-03-16 05:19

[QUOTE=Prime95;454957]The only anomaly I see is that the server thinks there is only one worker window, but I'm seeing LL results reported about a day apart.[/QUOTE]Ah, I see. I have four workers, so that seems to explain it.

Dubslow 2017-03-16 08:27

[QUOTE=chalsall;454958]And there is your bug. :smile:[/QUOTE]

I'm inclined to agree. The rules clearly state per worker, and that's what the intended design was.

George, could you examine the details on my computer? GUID = 929FEE33E0FD3CC86AA18BBAB7314AC0 (that's not sensitive is it?)

Prime95 2017-03-16 15:02

[QUOTE=Dubslow;454974]I'm inclined to agree. The rules clearly state per worker, and that's what the intended design was.

George, could you examine the details on my computer? GUID = 929FEE33E0FD3CC86AA18BBAB7314AC0 (that's not sensitive is it?)[/QUOTE]


Four workers.

rudy235 2017-03-20 17:07

NEW MILESTONE REACHED
 
All exponents under 71,000,000 have been checked once.

chalsall 2017-03-20 17:23

[QUOTE=rudy235;455186]All exponents under 71,000,000 have been checked once.[/QUOTE]

Bummer... [URL="https://www.mersenne.org/report_exponent/?exp_lo=70723879&full=1"]"The Prime Minister" poached it from "Albert Pettersson"[/URL] who was legitimately assigned it and was making reasonable progress.

George et al... Perhaps this will motivate you to implement the assignment rules as they were intended, to avoid this kind of thing in the future.

ric 2017-03-20 17:47

[QUOTE=rudy235;455186]All exponents under 71,000,000 have been checked once.[/QUOTE]

aka, the latest occurrence of [I]premature <el><el>-calculation[/I]... ah, emotions!

Not much in favour of strict rules, however I'd be grabbing popcorn, should one of these "emotions" turn out a success...

rudy235 2017-03-20 17:48

[QUOTE=chalsall;455188]Bummer... [URL="https://www.mersenne.org/report_exponent/?exp_lo=70723879&full=1"]"The Prime Minister" poached it from "Albert Pettersson"[/URL] who was legitimately assigned it and was making reasonable progress.
[/QUOTE]

Yes, but if Albert Pettersson perseveres he'll make the double-check. There is someone else who is more advanced (83.3%) but seems to have abandoned the task!

chalsall 2017-03-20 18:04

[QUOTE=rudy235;455191]Yes, but if Albert Pettersson perseveres he'll make the double-check. There is someone else who is more advanced (83.3%) but seems to have abandoned the task![/QUOTE]

Yes... But...

Mr. Pettersson (AKA "GAPa") didn't understand why he was given this assignment. Based on all of our understanding of the assignment rules, he shouldn't have -- exactly for the reason which has just been empirically demonstrated.

Computers are harsh mistresses; they do exactly what you tell them to do (currently).

Madpoo 2017-03-20 19:49

[QUOTE=rudy235;455186]All exponents under 71,000,000 have been checked once.[/QUOTE]

Huh... noted and updated. Yeah, I wasn't expecting it yet and the poaching caught me off guard. :smile:

rudy235 2017-03-20 21:59

[QUOTE=Madpoo;455198] I wasn't expecting it ... poaching caught me off guard. :smile:[/QUOTE]


That _IS_ the definition of poaching! :bangheadonwall:

GAPa 2017-03-21 17:34

[QUOTE=chalsall;455188]George et al... Perhaps this will motivate you to implement the assignment rules as they were intended, to avoid this kind of thing in the future.[/QUOTE]Today, I received a category 1 assignment ([url]https://www.mersenne.org/M71737427[/url]), so perhaps the issue has been resolved now?

chalsall 2017-03-21 18:40

[QUOTE=GAPa;455241]Today, I received a category 1 assignment ([url]https://www.mersenne.org/M71737427[/url]), so perhaps the issue has been resolved now?[/QUOTE]

Possibly. Possibly not...

Cat 0's (the lowest 200 candidates) only become available occasionally. You just happened to get "lucky" with the assignment, and demonstrated a rarely executed flaw in the coded logic.

Within the software development industry this is sometimes referred to as a "Once a Month Bug". Absolutely no reference to some of our partners' moods... :smile:

ric 2017-03-21 23:02

[QUOTE=GAPa;455241]Today, I received a category 1 assignment.. so perhaps the issue has been resolved now?[/QUOTE]

In addition to the reply above, you might want to take control and increase your "days of work to queue" setting (menu item Options/Preferences) according to [URL="https://www.mersenne.org/thresholds/"]this page[/URL].

TL;DR: set it to 4 to be consistently served Cat 1 assignments, 7 to be served Cat 2s, or leave it at 3 or below to continue as is.

rudy235 2017-04-01 03:40

The 71,000,000 milestone
 
In a way, the issue of the 71,000,000 milestone is not resolved.

The Prime Minister's residue is 42A93C867B552E__,
while Albert Pettersson's is 9C157058B13DB9__.

Thus one of the two is not correct (or both :yucky: )

So I hope someone is able to take care of that by doing a triple check:

[URL="https://www.mersenne.org/report_exponent/?exp_lo=70723879&full=1"]https://www.mersenne.org/report_exponent/?exp_lo=70723879&full=1[/URL]

flashjh 2017-04-01 04:03

[QUOTE=rudy235;455944]So I hope someone is able to take care of that by doing a triple check
[/QUOTE]

Running now, ETC 4 days

ATH 2017-04-01 04:12

[QUOTE=rudy235;455944]In a way, the issue of the 71,000,000 milestone is not resolved.

The Prime Minister's residue is 42A93C867B552E__,
while Albert Pettersson's is 9C157058B13DB9__.

Thus one of the two is not correct (or both :yucky: )[/QUOTE]

It is not the double-check milestone that reached 71M :-) There are many, many more exponents below 71M with two residues that do not match.

rudy235 2017-04-01 04:39

[QUOTE=ATH;455946] There are many, many more exponents below 71M with two residues that do not match.[/QUOTE]

I sort of suspected that. But in this specific case it is [U]the very last[/U] number that came in, the one where the residues do not match.:ick:

Madpoo 2017-04-02 07:29

[QUOTE=rudy235;455949]I sort of suspected that. But in this specific case it is [U]the very last [/U]number that came in, the one where the residues do not match.:ick:[/QUOTE]

Statistically speaking, I would expect several thousand tests below 71M to be wrong once the double checking gets to that point.

rudy235 2017-04-03 00:20

[QUOTE=Madpoo;456013]Statistically speaking, I would expect several thousand tests below 71M to be wrong once the double checking gets to that point.[/QUOTE]

That is consistent with an approximately 1% failure rate.

science_man_88 2017-04-03 00:25

[QUOTE=rudy235;456072]That is consistent with an approximately 1% failure rate.[/QUOTE]

Debatable. 7000./primepi(71000000) gives back 0.0016..., which is under 0.2%. It would take roughly 42000./primepi(71000000) to give back just over 1% (using pari/gp), so roughly 42,000 prime exponents would have to be wrong for that to occur. Edit: at least prior to thinking about which can be easily factored, etc.; taking out the ones with 2p+1 as a factor of 2^p-1 leaves about 40,000 needed.
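
Those pari/gp figures can be re-derived without pari; here is a sketch in Python, with a plain sieve standing in for primepi() (the bounds in the comments are approximate):

```python
# Re-checking the estimate above: how many of the roughly 4.2M prime
# exponents below 71M would have to be wrong to reach a 1% failure rate.

def primepi(n):
    """Count primes <= n with a byte-array sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0] = sieve[1] = 0
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            # cross off multiples of p, starting at p*p
            sieve[p * p :: p] = bytearray(len(range(p * p, n + 1, p)))
    return sieve.count(1)

candidates = primepi(71_000_000)         # prime exponents below 71M
rate_if_7000_bad = 7000 / candidates     # ~0.0017, i.e. under 0.2%
bad_for_one_percent = 0.01 * candidates  # roughly 42,000 exponents
```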

rudy235 2017-04-03 00:35

[QUOTE=science_man_88;456073]Debatable. 7000./primepi(71000000) gives back 0.0016..., which is under 0.2%. It would take roughly 42000./primepi(71000000) to give back just over 1% (using pari/gp), so roughly 42,000 prime exponents would have to be wrong for that to occur. Edit: at least prior to thinking about which can be easily factored, etc.; taking out the ones with 2p+1 as a factor of 2^p-1 leaves about 40,000 needed.[/QUOTE]

Ok, what I did was work backwards. There are roughly 500,000-540,000 exponents below 71M that have not been verified or that already have two different residues.
[U]Several[/U] means more than two, but not many. So let's say "several thousand" means a range from 3,000 to 5,000.

Then, for that to be true, you need a failure rate of between 0.6% and 1.0%.
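
Spelled out, using the nominal 500,000 pool (the 540,000 upper bound lowers the rates only slightly):

```python
# The reverse estimate above: failure rate = mismatched tests / unverified pool
pool = 500_000                      # exponents below 71M with one unverified test
several_thousand = (3_000, 5_000)   # one reading of "several thousand"

rates = [bad / pool for bad in several_thousand]  # 0.6% to 1.0%
```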

Madpoo 2017-04-03 03:03

[QUOTE=rudy235;456076]Ok, what I did was work backwards. There are roughly 500,000-540,000 exponents below 71M that have not been verified or that already have two different residues.
[U]Several[/U] means more than two, but not many. So let's say "several thousand" means a range from 3,000 to 5,000.

Then, for that to be true, you need a failure rate of between 0.6% and 1.0%.[/QUOTE]

I didn't work out the math, but now in hindsight I should have gone with my gut instinct to say "tens of thousands" :smile:

rudy235 2017-04-03 03:56

[QUOTE=Madpoo;456081]I didn't work out the math, but now in hindsight I should have gone with my gut instinct to say "tens of thousands" :smile:[/QUOTE]

That's the good thing about having a forum. We get closer to the truth, one post at a time.

So what is the percentage of failure? (Meaning that the first and the second result do not match.)

ric 2017-04-03 08:55

[QUOTE=rudy235;456088]So what is the percentage of failure? (Meaning that the first and the second result do not match.)[/QUOTE]

On my (quite limited) sample of Madpoo's Strategic Double Checks, which is naturally a very biased sample, over 60%. YMMV.

VictordeHolland 2017-04-03 10:45

[QUOTE=rudy235;456088]So what is the percentage of failure? (Meaning that the first and the second result do not match.)[/QUOTE]
Historically, 3-4% of the tests are bad, with a few spikes in ranges with >10% failure. These spikes are usually at FFT crossover points and around the first 10,000,000-digit tests (M33,000,000+):
[URL]http://mersenneforum.org/showpost.php?p=449908&postcount=102[/URL]

Dubslow 2017-04-04 00:37

[QUOTE=rudy235;456088]So what is the percentage of failure? (Meaning that the first and the second result do not match.)[/QUOTE]

This thread is devoted to the topic: [url]http://mersenneforum.org/showthread.php?p=449883#post449883[/url]

GP2 2017-04-04 07:53

[QUOTE=VictordeHolland;456106]Historically 3-4% of the tests are bad[/QUOTE]

It's worth mentioning that the 3–4% error rate is just an average over all machines. Many machines have a perfect or near-perfect record, while a few have very high error rates.

[QUOTE=VictordeHolland;456106]with a few spikes in ranges with >10% failure.[/QUOTE]

Actually, from the graph, the most salient feature is the set of regular [I]downward[/I] spikes, starting at around 20M and repeating four more times at even intervals just under 5M apart.

A bit of data mining might shed some light on this, but it's more fun to speculate idly. If this was a user or group of users with highly reliable machines doing LL tests ahead of time in order to test the software ahead of the wavefront, then you'd expect the intervals to be exactly 5M apart and exactly aligned with human-friendly thresholds like 20M, 25M, 30M, 35M and 40M; instead, they're just a bit off. But you wouldn't really expect these little islands of hyper-reliability to be an artifact of the algorithms either, so it seems likely to be some kind of selection effect.

VictordeHolland 2017-04-04 12:48

[QUOTE=GP2;456159]
Actually, from the graph, the most salient feature is the set of regular [I]downward[/I] spikes, starting at around 20M and repeating four more times at even intervals just under 5M apart.

A bit of data mining might shed some light on this, but it's more fun to speculate idly. If this was a user or group of users with highly reliable machines doing LL tests ahead of time in order to test the software ahead of the wavefront, then you'd expect the intervals to be exactly 5M apart and exactly aligned with human-friendly thresholds like 20M, 25M, 30M, 35M and 40M; instead, they're just a bit off. But you wouldn't really expect these little islands of hyper-reliability to be an artifact of the algorithms either, so it seems likely to be some kind of selection effect.[/QUOTE]
They roughly correspond with the underlined?
range - FFT
[U]29.69M[/U]-34.56M - 1792K
[U]34.56M[/U]-39.50M - 2048K
[U]39.50M[/U]-49.10M - 2560K
49.10M-58.52M - 3072K

GP2 2017-04-04 16:13

[QUOTE=VictordeHolland;456171]They roughly correspond with the underlined?
range - FFT
[U]29.69M[/U]-34.56M - 1792K
[U]34.56M[/U]-39.50M - 2048K
[U]39.50M[/U]-49.10M - 2560K
49.10M-58.52M - 3072K[/QUOTE]

Hmmm, mprime does some extra checking around crossover points, but I think this is mostly to see if it needs to go to a higher FFT size. Can it really be that this extra checking has the side effect of turning unreliable machines (overheated, overclocked, cheap parts, etc) into nearly perfectly reliable machines? If this were true, then the server and software should cooperate to turn on this extra checking all the time for machines that have a track record of being unreliable. A priori it would seem more likely that George or others do exponents near the crossover points before the wavefront reaches them, as a way of doing quality assurance of the software.

flashjh 2017-04-05 04:42

[QUOTE=rudy235;455944]In a way, the issue of the 71,000,000 milestone is not resolved.

The Prime Minister's residue is 42A93C867B552E__,
while Albert Pettersson's is 9C157058B13DB9__.

Thus one of the two is not correct (or both :yucky: )

So I hope someone is able to take care of that by doing a triple check:

[URL="https://www.mersenne.org/report_exponent/?exp_lo=70723879&full=1"]https://www.mersenne.org/report_exponent/?exp_lo=70723879&full=1[/URL][/QUOTE]

Verified C from Albert Pettersson's run

rudy235 2017-04-05 21:37

[QUOTE=flashjh;456208]Verified C from Albert Pettersson's run[/QUOTE]

That is good. The poaching was 100% not worth it.

LaurV 2017-04-06 02:03

[QUOTE=rudy235;456088]That's the good thing about having a forum. We get closer to the truth, one post at a time.[/QUOTE]
I love this line! :bow:

Madpoo 2017-04-07 03:39

[QUOTE=rudy235;456088]That's the good thing about having a forum. We get closer to the truth, one post at a time.[/QUOTE]

True dat.

I had a bizarre experience at another forum (IPCamTalk), with one of its admins freaking out at me. I wish I'd paid more attention to the mood of the place before signing up; some of them (including the guy who runs the place) are borderline insane, as it turns out.

I wound up getting banned, but it seems I'm in good company... I think he just bans everyone he disagrees with (and then brags about his user base of 30K people). :smile: It was amusing to be accused of not knowing anything about networking. Entire threads exist where nearly everyone who posted has a big "Banned" note.

On the bright side, it made me appreciate the people here. Yeah, we have some characters (ahem... RD Silverman... ahem...) but even they're mostly harmless and give the place character. When new folks show up and ask the same questions, Mike doesn't tear them a new one if they're confused about anything or ask a question that's in a FAQ somewhere.

Point being, we're better than them. LOL

rudy235 2017-04-07 16:37

[QUOTE=Madpoo;456322]

On the bright side, it made me appreciate the people here. Yeah, we have some characters (ahem... RD Silverman... ahem...) but even they're mostly harmless and give the place character. When new folks show up and ask the same questions, Mike doesn't tear them a new one if they're confused about anything or ask a question that's in a FAQ somewhere.

Point being, we're better than them. LOL[/QUOTE]

Yes, I do love this forum. To make things clear: I am no mathematician and I no longer contribute to GIMPS, but I am very enthusiastic about very large prime numbers. I find it amazing that with methods like ECPP we can prove the primality of a random number of any form under 30,000 digits. We now have about 120,000 known primes over 1,000 digits, up from about 170 only 33 years ago. At that time the largest known Mersenne prime (now M30) had fewer than 40,000 digits.

What will the future bring? The first Sierpinski prime over 10,000,000 digits should be coming soon, as well as the first Mersenne over the Classic limit of 79.3 million digits.:whee::cry:

Yes, everyone here is very helpful, and I for one am not afraid of asking stupid questions, although of course I try to make them sound intelligent.

rudy235 2017-04-07 18:26

...I meant to say with an exponent over 79.3 million.


All times are UTC.

Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.