mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Data (https://www.mersenneforum.org/forumdisplay.php?f=21)
-   -   Newer milestone thread (https://www.mersenneforum.org/showthread.php?t=13871)

kracker 2014-11-12 01:14

[QUOTE=retina;387446]Allowing everyone else to run arbitrary unsanctioned code on your computer. How can that be considered a good thing? Even though we trust madpoo and George - servers can be compromised, connections can be hijacked, ISPs can (and do) insert their own things, etc.[/QUOTE]

Then why are you using an OS?

Madpoo 2014-11-12 02:43

[QUOTE=retina;387446]Allowing everyone else to run arbitrary unsanctioned code on your computer. How can that be considered a good thing? Even though we trust madpoo and George - servers can be compromised, connections can be hijacked, ISPs can (and do) insert their own things, etc.[/QUOTE]

You trust me? Oh you poor fool. LOL You must not be aware of my nefarious past. :smile:

On the flip side, a long while back my bro and I had a funny and not really serious idea to make a javascript version of Prime95... wouldn't it be funny to put that on the mersenne.org website so every visitor runs an ECM curve or trial factors a smidgen, or does an iteration or two of LL? Yeah, it was a goofy thought, but it would have been funny.
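For reference, the core of such a toy in-browser tester would be the Lucas-Lehmer iteration itself. A minimal sketch (in Python rather than JavaScript, purely to illustrate what "an iteration or two of LL" means; a real client would use FFT-based squaring):

```python
def lucas_lehmer(p):
    """Lucas-Lehmer test for M(p) = 2^p - 1, p an odd prime:
    s_0 = 4, s_{k+1} = s_k^2 - 2 (mod M(p)); M(p) is prime iff s_{p-2} == 0."""
    m = (1 << p) - 1          # the Mersenne number 2^p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m   # one LL iteration
    return s == 0

# M(7) = 127 is prime; M(11) = 2047 = 23 * 89 is not
print(lucas_lehmer(7), lucas_lehmer(11))  # True False
```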

retina 2014-11-12 02:56

[QUOTE=Madpoo;387452]You trust me? Oh you poor fool. LOL You must not be aware of my nefarious past. :smile:[/QUOTE]Anyone without a stain in their past record is not to be trusted for anything, they'll sell you out in an instant.[QUOTE=Madpoo;387452]On the flip side, a long while back my bro and I had a funny and not really serious idea to make a javascript version of Prime95... wouldn't it be funny to put that on the mersenne.org website so every visitor runs an ECM curve or trial factors a smidgen, or does an iteration or two of LL? Yeah, it was a goofy thought, but it would have been funny.[/QUOTE]Hehe indeed. Kind of says a lot about the trustworthiness of JS. There were (perhaps there still are) people mining bitcoins using JS in a webpage in an effort to get free resources from unsuspecting visitors.

Xyzzy 2014-11-12 02:57

How crippled is the forum without JS? (Just curious!)

:mike:

retina 2014-11-12 03:07

[QUOTE=Xyzzy;387454]How crippled is the forum without JS? (Just curious!)[/QUOTE]Works for me.

Uncwilly 2014-11-12 03:53

[QUOTE=retina;387369]How about we make a deal. If you get rid of the internal table horiz scrollbar on the recent cleared/results reports then I won't complain about the estimated completion date. :deadhorse::spinner::whistle:[/QUOTE]
:direction:
Again.

chris2be8 2014-11-12 17:25

[QUOTE=Xyzzy;387454]How crippled is the forum without JS? (Just curious!)

:mike:[/QUOTE]

In some ways it's better. In particular it's easier to tab from one link to the next with JS disabled (I like to read one thread, alt-leftarrow back to the forum, tab to next thread, etc, as much as possible). This comes from starting my career before mice were commonplace, so I resent having to do things with a mouse when I've already learnt to do it on the keyboard faster.

Chris

Madpoo 2014-11-12 23:56

[QUOTE=Uncwilly;387458]
Again.[/QUOTE]

Sigh...okay.

Back OT: Notice how the milestone page now has the dates and events aligned? I just stuck them all in a table with zero for a border.

Speaking of the page, so I guess the next minor milestone we'll cross is that double-check of all Mp under 10M digits. Any thoughts on what could or should be next?

We're still at a spot where not all of the exponents below M(57885161) have been assigned for a first-time check, so you can't really get a report of those or do even a wild guess on an ETA.

And speaking of the 10M milestone... I found myself going back and forth on the nomenclature. What's the proper, or just best, way of phrasing it? "All exponents below 10M" isn't right, because it's not the exponent, it's the 2^(exponent)-1 that's 10M digits. I think I've used that phrasing as shorthand here and there although I know it's not correct. Is it just like "M(P)", or is there some generally accepted shorthand?

As you see, I stuck with the very literal format of "all 2^p-1 below 10M digits", it just doesn't roll off the tongue. Perhaps "all Mersenne #'s less than 10M digits" ?

Uncwilly 2014-11-13 01:10

[QUOTE=Madpoo;387504]Speaking of the page, so I guess the next minor milestone we'll cross is that double-check of all Mp under 10M digits. Any thoughts on what could or should be next?.[/QUOTE]I noticed some progress on the first time LL number. I think that there is room for a [FONT="Book Antiqua"][B]Responsible Party[/B][/FONT]™ to do 3 "early double checks", that would move the 1LL number up past 2 xx,000,000 milestones.
[url]http://www.mersenne.org/assignments/?exp_lo=51000000&exp_hi=53000000&execm=1&exdchk=1&B1=Get+Assignments[/url]
Maybe someone can monitor these for a while. If there is low progress, maybe Chris can be summoned to perform a GIMPS "Christmas Miracle".

[URL="http://www.mersenne.org/assignments/?exp_lo=53000000&exp_hi=54000000&execm=1&exdchk=1&B1=Get+Assignments"]The next xx,000,000 milestone[/URL] after that would likely take until at least the end of March.

Prime95 2014-11-13 01:25

[QUOTE=Uncwilly;387511]I noticed some progress on the first time LL number.[/QUOTE]

Two points:

1) I think we need to rename the "active assignments" page to the "poach me" page.

2) For years I've wanted to eliminate the "all exponents tested once" milestones. To me, the milestone is pointless as it is a virtual certainty that there are many bad LL results. Thus some exponents are in reality untested and possibly prime. My opinion, however, is in the minority and many users feel such events are important.

flagrantflowers 2014-11-13 02:50

[QUOTE=Prime95;387514]
…we need to rename the "active assignments" page to the "poach me" page.[/QUOTE]

I agree. I don't see why they shouldn't be left alone at least until the older assignment rules no longer apply.

If you really want to see those "once tested" milestones pass just think about saving up X of them and completing X in a short time period.

Uncwilly 2014-11-13 05:09

[QUOTE=Prime95;387514]Two points:

1) I think we need to rename the "active assignments" page to the "poach me" page.[/QUOTE]Just for the record, I personally have not done an LL in at least 6 years (unless one of my borgs did some unsupervised), maybe 8. And the last LL's that I did do were DC assignments handed out by the server.

Primeinator 2014-11-13 06:43

[QUOTE=Madpoo;387504]

And speaking of the 10M milestone... I found myself going back and forth on the nomenclature. What's the proper, or just best, way of phrasing it? "All exponents below 10M" isn't right, because it's not the exponent, it's the 2^(exponent)-1 that's 10M digits. I think I've used that phrasing as shorthand here and there although I know it's not correct. Is it just like "M(P)", or is there some generally accepted shorthand?

As you see, I stuck with the very literal format of "all 2^p-1 below 10M digits", it just doesn't roll off the tongue. Perhaps "all Mersenne #'s less than 10M digits" ?[/QUOTE]


What about "Mersenne candidates below 10 million decimal digits"?

[QUOTE=Prime95;387514]
2) For years I've wanted to eliminate the "all exponents tested once" milestones. To me, the milestone is pointless as it is a virtual certainty that there are many bad LL results. Thus some exponents are in reality untested and possibly prime. My opinion, however, is in the minority and many users feel such events are important.[/QUOTE]

I kind of like this milestone marker even though I know some of the LLs that have been 'tested' are faulty. It provides a moderately good estimate of the progress of the project as a whole (even though it is nowhere near the LL wave front).

Brian-E 2014-11-13 09:56

[QUOTE=Prime95;387514]2) For years I've wanted to eliminate the "all exponents tested once" milestones. To me, the milestone is pointless as it is a virtual certainty that there are many bad LL results. Thus some exponents are in reality untested and possibly prime. My opinion, however, is in the minority and many users feel such events are important.[/QUOTE]

[QUOTE=Primeinator;387529]I kind of like this milestone marker even though I know some of the LLs that have been 'tested' are faulty. It provides a moderately good estimate of the progress of the project as a whole (even though it is nowhere near the LL wave front).[/QUOTE]
Perhaps the solution is to rename this particular milestone so that casual readers see the "progress" aspect of it rather than the erroneous suggestion of completion which the word "tested" can conjure up.

Not easy to word it though. Something like "all exponents below XXXXXXXX have undergone their initial unconfirmed LL run" perhaps? Can anyone make it sound less clumsy?

ATH 2014-11-13 14:40

[QUOTE=Prime95;387514]1) I think we need to rename the "active assignments" page to the "poach me" page.[/QUOTE]

The 3 exponents he linked should have been recycled by now according to the rules?

[QUOTE]Recycle rules (top 1500 exponents):
1) If expected completion date is not updated for 60 days.
2) If assignment made [I][B]before 2014-02-01[/B][/I] and:
2a) assignment is 12 months old and < 50% complete it is recycled.
2b) [I][B]assignment is 15 months old it is recycled.[/B][/I]
3) If assignment made after 2014-02-01 and:
3a) assignment is 6 months old and not started it is recycled.
3b) assignment is 9 months old it is recycled.[/QUOTE]

Prime95 2014-11-13 15:44

The actual SQL code is below. As you can see there is no 15 month rule. I'm about to head out the door, perhaps you can calculate when these three might expire.

[CODE]
((dt_when_assigned < '2014-03-01' AND -- Grandfathered assignment
exponent < @exp1 AND -- exponent is in the most critical category
dt_when_assigned < DATEADD (DAY, -365, GETDATE()) AND -- and assignment is over a year old
percent_done < 10 + (DATEDIFF (DAY, dt_when_assigned, GETDATE()) - 365) / 3.33) OR -- plus a grace period if close to finished
[/CODE]
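Paraphrased in Python: a sketch of just this grandfathered clause, assuming percent_done is a 0-100 figure and leaving out the @exp1 category check.

```python
from datetime import date

def should_recycle(assigned, percent_done, today):
    """Grandfathered clause only: recycle once the assignment is over a year
    old and progress is below 10% plus ~0.3%/day earned after the first year."""
    if assigned >= date(2014, 3, 1):
        return False  # not grandfathered; other rules apply
    days_old = (today - assigned).days
    return days_old > 365 and percent_done < 10 + (days_old - 365) / 3.33

# 51907363: assigned 2013-05-27, 63.1% done, checked mid-November 2014
print(should_recycle(date(2013, 5, 27), 63.1, date(2014, 11, 14)))  # False (not yet)
```

At 536 days old the grace threshold is about 61.4%, so a 63.1%-done assignment survives for a few more days.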

Madpoo 2014-11-14 01:24

[QUOTE=Prime95;387555]The actual SQL code is below. As you can see there is no 15 month rule. I'm about to head out the door, perhaps you can calculate when these three might expire.

[CODE]
((dt_when_assigned < '2014-03-01' AND -- Grandfathered assignment
exponent < @exp1 AND -- exponent is in the most critical category
dt_when_assigned < DATEADD (DAY, -365, GETDATE()) AND -- and assignment is over a year old
percent_done < 10 + (DATEDIFF (DAY, dt_when_assigned, GETDATE()) - 365) / 3.33) OR -- plus a grace period if close to finished
[/CODE][/QUOTE]

In the case of those 3 exponents, they're not moving fast, but they are moving, and being updated every so often. I wouldn't think the server would reassign exponents as long as the original assignee was making progress on it, no matter how slowly.

I know that from a human point of view, milestones are interesting... those nice round numbers like 52 or 53 million, and having all exponents first time LL checked. But of course that's just a mental thing and in the grand scheme of things it doesn't matter much.

I can see both sides of the argument though... yeah, it doesn't really mean much... the work will get done when it gets done, but on the other hand if people are able to track some artificial measure of progress, it gives that warm fuzzy that "things are happening" and hopefully keeps them engaged in the project as a whole.

If all we ever did was crunch numbers and never saw how that's moving the bar ever higher, it could give you that feeling that your effort doesn't matter and you don't bother keeping it going. I gather that's probably behind the whole idea of being able to form teams, or tracking and ranking individual work effort in the first place, because you can get that idea of "I'm doing something" or "we're doing something" :smile:

ATH 2014-11-14 05:57

[QUOTE=Prime95;387555]The actual SQL code is below. As you can see there is no 15 month rule. I'm about to head out the door, perhaps you can calculate when these three might expire.

[CODE]
((dt_when_assigned < '2014-03-01' AND -- Grandfathered assignment
exponent < @exp1 AND -- exponent is in the most critical category
dt_when_assigned < DATEADD (DAY, -365, GETDATE()) AND -- and assignment is over a year old
percent_done < 10 + (DATEDIFF (DAY, dt_when_assigned, GETDATE()) - 365) / 3.33) OR -- plus a grace period if close to finished
[/CODE][/QUOTE]

I just quoted your rules from the "Proposed LL assignment and recycle rules" thread. I guess they changed along the way; I did not read all 97 posts in that thread.

It was not a demand or a statement, but a question about why they were not recycled; by which I meant: what are the rules now? And I got my answer.

From the code it seems they get 3.33 days extra beyond one year for every percent complete above 10%. That means they are very close to being recycled, unless they fall into the unspecified grace period:
1st one: 63.10% done, so will be recycled after: (63.1-10)*3.33 + 365 days ~ 542 days (currently at 536)
2nd and 3rd ~ 81% done, so will be recycled after: (81-10)*3.33 + 365 days ~ 601 days (currently at 576)
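The same arithmetic can be turned around to give the calendar date an old-rules assignment expires (a sketch; it assumes the fractional grace days simply truncate):

```python
from datetime import date, timedelta

def expiry_date(assigned, percent_done):
    """An old-rules assignment lives 365 days, plus 3.33 days of grace
    for every percent complete above 10%."""
    grace = max(0.0, (percent_done - 10) * 3.33)
    return assigned + timedelta(days=int(365 + grace))

# 63.10% done, assigned 2013-05-27: survives 541 full days
print(expiry_date(date(2013, 5, 27), 63.1))  # 2014-11-19
```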

lycorn 2014-11-14 14:06

[QUOTE=Madpoo;387585]

If all we ever did was crunch numbers and never saw how that's moving the bar ever higher, it could give you that feeling that your effort doesn't matter and you don't bother keeping it going. I gather that's probably behind the whole idea of being able to form teams, or tracking and ranking individual work effort in the first place, because you can get that idea of "I'm doing something" or "we're doing something" :smile:[/QUOTE]

Very good point, nicely put. I guess it applies to many of us around here.

That said, I think that now that new and stricter rules are in place to reassign exponents, we should also do something to prevent poaching in a more effective way. In fact, in the past we could say that the server wasn't "doing its job properly" in that the supposed reassignments, under the old rules, were not taking place and milestones were systematically blocked. On the other hand, the server would (and still will) accept and credit any result, regardless of the assignment status of the exponent. I propose that [U]from now on the server will not accept results for an exponent that is reserved by someone else[/U]. That would be the end of poaching. Period.

Any thoughts?

Prime95 2014-11-14 14:15

[QUOTE=lycorn;387614]On the other hand, the server would (and still will) accept and credit any result, regardless of the assignment status of the exponent. I propose that [U]from now on the server will not accept results for an exponent that is reserved by someone else[/U]. That would be the end of poaching. Period.[/QUOTE]

Since the beginning, I've accepted results no matter what because I felt the math research was more important than the personal credit.

My big hope is that once the grandfathered assignments are out of the way, the desire to poach will be greatly reduced.

As for ideas, one would be for the server to block reporting on the lowest 50 assignments for both DC and LL. This would make it more difficult to find the candidates that are holding up a milestone.

tha 2014-11-14 14:38

[QUOTE=Prime95;387615]My big hope is that once the grandfathered assignments are out of the way, the desire to poach will be greatly reduced.[/QUOTE]

Once the new rules were in place I gave up minding the milestone blockers. I only gather some data just to see if the rules should be adjusted, but it will take about another year before enough data has been gathered.

VictordeHolland 2014-11-14 15:28

To me poaching LL assignments is different from poaching DC assignments.
With poaching LLs you're denying somebody their (small) chance of finding a prime.
People who do DC work know that the candidates they are doing are already checked and not prime (incredibly tiny chance of incorrect residue AND it being prime).

chalsall 2014-11-14 16:08

[QUOTE=Prime95;387615]As for ideas, one would be for the server to block reporting on the lowest 50 assignments for both DC and LL. This would make it more difficult to find the candidates that are holding up a milestone.[/QUOTE]

Another idea (proposed many times) would be that any credit given for a "poached" assignment be given to the original assignee. Including (the /very/ unlikely) newly found Mersenne Prime.

This might satisfy slow "credit whores", and those who simply want to see the waves advance.

Brian-E 2014-11-14 17:11

[QUOTE=lycorn;387614]Very good point, nicely put. I guess it applies to many of us around here.

That said, I think that now that new and stricter rules are in place to reassign exponents, we should also do something to prevent poaching in a more effective way. In fact, in the past we could say that the server wasn't "doing its job properly" in that the supposed reassignments, under the old rules, were not taking place and milestones were systematically blocked. On the other hand, the server would (and still will) accept and credit any result, regardless of the assignment status of the exponent. I propose that [U]from now on the server will not accept results for an exponent that is reserved by someone else[/U]. That would be the end of poaching. Period.

Any thoughts?[/QUOTE]

[QUOTE=Prime95;387615]Since the beginning, I've accepted results no matter what because I felt the math research was more important than the personal credit.

My big hope is that once the grandfathered assignments are out of the way, the desire to poach will be greatly reduced.

As for ideas, one would be for the server to block reporting on the lowest 50 assignments for both DC and LL. This would make it more difficult to find the candidates that are holding up a milestone.[/QUOTE]
How about simply blocking the reporting of the results of poached assignments (until the original assignee finishes it or the assignment expires)? Store them, but don't show them.

[QUOTE=VictordeHolland;387621]To me poaching LL assignments is different from poaching DC assignments.
With poaching LLs you're denying somebody their (small) chance of finding a prime.
People who do DC work know that the candidates they are doing are already checked and not prime (incredibly tiny chance of incorrect residue AND it being prime).[/QUOTE]
I must respectfully disagree with any pecking order of degrees of poaching. It's all the same to me. Incidentally I do DCs exclusively nowadays for the simple reason that my machine is too slow to contemplate first time tests in the regions where these are now being handed out. I still hold my breath when my latest DC is about to finish to see if it turns out prime.

chalsall 2014-11-14 17:28

[QUOTE=Brian-E;387629]I must respectfully disagree with any pecking order of degrees of poaching. It's all the same to me. Incidentally I do DCs exclusively nowadays for the simple reason that my machine is too slow to contemplate first time tests in the regions where these are now being handed out. I still hold my breath when my latest DC is about to finish to see if it turns out prime.[/QUOTE]

I respectfully disagree with your disagreement.

Those who "take the piss" (read: take YEARS to complete an assignment) should expect to be poached.

"It's not fair!". "Life isn't fair. Deal with it." (Sound advice from a close relative.)

Brian-E 2014-11-14 18:26

[QUOTE=chalsall;387630]I respectfully disagree with your disagreement.

Those who "take the piss" (read: take YEARS to complete an assignment) should expect to be poached.

"It's not fair!". "Life isn't fair. Deal with it." (Sound advice from a close relative.)[/QUOTE]
Well, you're not really responding to what I wrote now, are you?:smile:

Primeinator 2014-11-14 19:38

Is it possible to impose a penalty for poaching?

I like what George said about the math being more important than individual credit. I think the server should still accept results BUT in the case of poached exponents the credit should be given to the person that originally had the exponent assigned (as chalsall said) AND that much credit should be deducted from the individual who did the poaching.

Edit: Or if you wanted to be really harsh to the poacher then you could have half of their credit deducted for the next x assignments completed or for all assignments completed in the next y span of time.

Edit 2: I am now hungry for poached eggs.

chalsall 2014-11-14 19:54

[QUOTE=Primeinator;387641]Edit 2: I am now hungry for poached eggs.[/QUOTE]

Mmmm... Poached eggs, with smoked salmon, on a bagel. Yummm! :smile:

Primeinator 2014-11-14 20:01

[QUOTE=chalsall;387642]Mmmm... Poached eggs, with smoked salmon, on a bagel. Yummm! :smile:[/QUOTE]

I've never had this combination... and it sounds utterly life-changing :w00t:

Chuck 2014-11-14 20:25

I support strict law enforcement...
 
When the grandfathered assignments run out, I'd like to see the new rules we all agreed to strictly enforced. If it is a category 1 DC, and 60 days have elapsed, whether it is 20% or 99.5% complete, it gets recycled. Period.

Madpoo 2014-11-14 20:37

[QUOTE=chalsall;387625]Another idea (proposed many times) would be that any credit given for a "poached" assignment be given to the original assignee. Including (the /very/ unlikely) newly found Mersenne Prime.

This might satisfy slow "credit whores", and those who simply want to see the waves advance.[/QUOTE]

That's not a bad compromise... Basically someone who just can't stand having that one outstanding assignment that keeps some milestone from being reached, yeah, they can go ahead and "donate" their own computer time to that other person who actually has the assignment, giving that original assignee the credit.

The only thing is, it kind of sucks if you poach someone's double-check... I mean, if you poach a first time, then at least when the original computer checks in, hey, it's a double-check, which is cool and all. But if you poach a double-check, then we'd assume the original assignee would check it in at some point, which would usually be a useless triple-check. It would have been better if the poacher just got a regular double-check assignment and let that straggler come in on its own.

This all assumes work is being done on the assignment and the recycling rules don't have anything that lets things fall through the cracks. I'm with George though in thinking that once the grandfathered work finally peters out, it'll improve things in general.

wombatman 2014-11-14 20:46

How about a sliding scale corresponding to the length of time it's been reserved? That is, if you poach one that's been reserved for 10 days, most of the credit goes to the original reserver. If it's been 300 days, the poacher gets most of the credit, but some goes to the original person. You don't want to give all credit to the original reserver so as to discourage people from reserving and then doing nothing, but you also don't want to have people snatching recently reserved exponents.
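One hypothetical way to cut that sliding scale (the 365-day full term and the linear ramp are invented for illustration, not anything the server actually does):

```python
def credit_split(days_held, full_term=365):
    """Hypothetical sliding scale: the longer the original holder has sat on
    the assignment, the larger the share of credit a poacher would earn.
    Returns (poacher_share, original_reserver_share)."""
    poacher_share = min(days_held / full_term, 1.0)
    return poacher_share, 1.0 - poacher_share

# Poaching at 10 days: almost all credit goes to the original reserver
print(credit_split(10))
```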

chalsall 2014-11-14 21:54

[QUOTE=Primeinator;387643]I've never had this combination... and it sounds utterly life-changing :w00t:[/QUOTE]

It is. If you haven't, do.

TheMawn 2014-11-14 23:21

I agree that the "Assignments below X tested once" milestone is important for showing progress, and this is the reason we even have the milestones to begin with.


Regarding the poaching stuff: Why does a rational person poach assignments? Nothing we do is going to change the behaviour of irrational folk anyway, so let us ignore those.

A rational person will poach an assignment to steal the chance of finding the prime or to make progress on a milestone. Credits are irrelevant because short of knowing the residue beforehand, you had to put in the GHz-Days to get the result anyway.

Giving all credit to the poach[B]ed[/B] party is a good start. Denying that the poach[B]er[/B] ever did anything will be a deterrent to anyone trying to steal the prime discovery. "I found it! I found it!" "Nope. Our records do not show that you did." It will also be a deterrent to anyone who is concerned with getting as many credits as possible and does not wish to give away "free CPU work".

Anyone still not intimidated by these downsides will probably be submitting Anonymously anyway, so the credit penalty will likely not scare anyone.

The only rational people left, then, are the ones purely seeking milestone progress. The rules we put in are going to kick in fully in three months, and their effect is intended to ensure that all 1,500 (is that correct?) of the assignments most relevant to a particular milestone will be finished within a short time (2 and 6 months, I believe, for DC and LL respectively).

The only way to deter anyone who is STILL too impatient is to refuse the submission of poached work (a BIG no-no) altogether, or as George suggested, to hide the details of the front X assignments so that nobody knows which 33 (for example) assignments are holding up the queue. I want to say that this is going to be unnecessary with the new rules at full power but perhaps it is worth withholding data from the public.

tha 2014-11-14 23:21

Let us all wait now till the assignments under the old rules expire, which will be soon anyway. The revolt that led to the new rules was started because of more than two years of backlog piled up in huge amounts. The current amount of backlog is only a tiny fraction of that. Time is on 'our' side now, so the best thing to do is wait and spend our resources elsewhere. I don't think we need any new rules now.

lycorn 2014-11-14 23:45

I agree with you. From Feb/2015 on, the "excuse" for poaching will be virtually none, provided the rules are really enforced. For me that's a very important point. We definitely need strict enforcement of the agreed rules. And, if and when they are enforced, I reckon George's argument about the math will no longer really matter. The assignment will be finished sooner rather than later, if not by Mr. X (the poacher, whose result was refused), then by Mr. Y (the one who gets the reassigned exponent in a reasonable time, due to the new rules). That is to say that I insist that poached assignments should not be accepted by the server, once the grace period is over and the new rules are in full swing.

ATH 2014-11-15 00:58

@Madpoo: Btw this is a good option for another milestone: "Number of Grandfathered assignments left" :smile: Or to be more diplomatic: "Number of exponents assigned prior to 2014-03-01 (or March 1st, 2014)".


[QUOTE=lycorn;387670]I agree with you. From Feb/2015 on, the "excuse" for poaching will be virtually none, providing the rules will be really enforced. [/QUOTE]

Actually the last grandfathered exponent will last much longer. According to George's code an exponent assigned 2014-02-28 and say 90% done will live for 365 + (90-10)*3.33 = 631 days (Nov 21st 2015) (and one 90% done will probably fall under the unspecified grace period because it is almost done).

Uncwilly 2014-11-15 01:12

[QUOTE=tha;387665]Let us all wait now till the assignments under the old rules expire, which will be soon anyway. The revolt that lead to the new rules was started because of more than two years of backlog piled up in huge amounts.[/QUOTE]
About my original post about the 3 exponents in question:
I mentioned monitoring them to see if they really will complete in a reasonable amount of time, before acting. Most seem to agree that waiting and watching these is good.
A prior poster pointed out that all three [I]should[/I] be finished before the end of the year. That is the time frame that I mentioned.
These are first timers, thus any work turned in by someone other than Anon and the other could be viewed as an early DC, with no loss to those that hold the assignment.
I suggested a particular person of good repute and good machines as the designated actor. Firstly, they could watch the progress of the original assignee and hold off their report. Secondly, they are not a credit whore, so no issue there. Thirdly, if on the tiny-tiny chance that one of the numbers is the next MP, they could communicate that to George, privately, letting the assignee be the one to report it to PrimeNet. I feel they are of such character that they would 'do the right thing' in the circumstance. If one person is semi-sanctioned to do this, then everyone else should not be tempted and no undue worry or extra effort will occur.
The only reason I made the suggestion was to make it all nice and tidy at the end of the year.

The 10M digit DC milestone is nice, but that and the future ones will be taken care of soon enough by the new assignment rules.

chalsall 2014-11-15 15:44

[QUOTE=Uncwilly;387676]I suggested a particular person of good repute and good machines as the designated actor. ... If one person is semi-sanctioned to do this, then everyone else should not be tempted and no undue worry or extra effort will occur.[/QUOTE]

I don't know if you were thinking of me as the "designated actor", but if no one complains, I'd like to take on this role. My rationale is:

1. 51907363 should have already been recycled according to the above SQL.

1.1. Beren has only progressed 0.3% since 2014-05-01.

2. 52957519 and 52983583 should be recycled around 2014-12-10, and they have only progressed ~10% since 2014-05-17.

Objections? I could process all three in just over three days on machines which have a 100% success rate DC'ing.

ATH 2014-11-15 16:29

[QUOTE=chalsall;387721]1. 51907363 should have already been recycled according to the above SQL.[/QUOTE]

I do not mind or complain, just want to point out it should not be recycled until Nov 19th or 20th:
2013-05-27 + 365 + (63.1-10)*3.33 ~ 2013-05-27 + 541 = 2014-11-19

chalsall 2014-11-15 16:36

[QUOTE=ATH;387723]I do not mind or complain, just want to point out it should not be recycled until Nov 19th or 20th:
2013-05-27 + 365 + (63.1-10)*3.33 ~ 2013-05-27 + 541 = 2014-11-19[/QUOTE]

Whoops -- you're correct. I erroneously calculated based on the assignment date being 2013-05-07 rather than 2013-05-27.

But, regardless, considering its current rate of progress, it is definitely going to be recycled before it's completed.

Madpoo 2014-11-15 18:31

[QUOTE=ATH;387675]@Madpoo: Btw this is a good option for another milestone: "Number of Grandfathered assignments left" :smile: Or to be more diplomatic: "Number of exponents assigned prior to 2014-03-01 (or March 1st, 2014)"...[/QUOTE]

I was curious about that myself last night and had already crunched the #'s:

29598 TF assignments
1 P-1 assignment
31 ECM assignments
4354 first time LL
155 double check LL

These are all assignments made prior to 2014-03-01 that have not yet expired.

The large # of TF assignments was surprising, and from what I can see it appears the vast majority haven't even started yet. Only 66 of those TF assignments have reported any "percent done".

Over 16,000 of those have exponents in the 33M-37M range. About 300 are in the 65M-110M range, and the remainder are all 332M+ exponents with due dates in the year 2031, a mere 17 years from now.

On the LL/DC side of things, whew, there are some dedicated people out there.

The absolute oldest assignment is from 2008-11-15 (6 years ago!) on exponent 332210191. But the assignee has been testing it, slowly but surely. It's at 64.1% done, and it's being updated regularly on its progress.

Of the 155 DC assignments, 137 of them have reported some progress on theirs... not all of them recent though. I think I could safely say some of them are clearly abandoned, where they've reported some work done but haven't updated themselves in quite a while... back in April for some that did some work, and some haven't checked in at all since it was first assigned back in 2013. But still, a good majority of them (over 100) have checked in recently, in the past couple weeks.

For the 4354 LL assignments, 1645 of them have reported some progress... again, not all of it recently. And for some of those, the progress reported so far is still in the TF phase of the test; they haven't even begun the LL test.

The rest of them haven't reported any progress at all since being assigned.

But I think there are some overly optimistic assignments... this one amused me: [URL="http://www.mersenne.org/assignments/?exp_lo=64653773"]http://www.mersenne.org/assignments/?exp_lo=64653773[/URL]

Expected completion of April 1, 2078. Is this just a really, super-early April Fools' Day joke? :smile: But hey, it's 1.2% done despite being assigned nearly a year ago... and whoever this is assigned to (an anonymous user) has been checking in regularly... just earlier today, in fact.

Out of the 2709 exponents where 0% progress has been reported, again, there are quite a few that, to my human eyes, seem clearly abandoned. They haven't checked in since being assigned, or they were last updated quite a while ago. A good # are exponents > 332M, so they got themselves a record-breaking assignment but gave up.

A breakdown of those 1st time LL with no progress at all:
[CODE]Exponent range   Count
< 100M           145
100M - 200M      100
200M - 300M      101
300M+            2402
[/CODE]

Madpoo 2014-11-15 19:02

[QUOTE=Uncwilly;387511]I noticed some progress on the first time LL number. I think that there is room for a [FONT="Book Antiqua"][B]Responsible Party[/B][/FONT]™ to do 3 "early double checks"; that would move the 1LL number up past 2 xx,000,000 milestones.
[url]http://www.mersenne.org/assignments/?exp_lo=51000000&exp_hi=53000000&execm=1&exdchk=1&B1=Get+Assignments[/url]
Maybe someone can monitor these for a while. If there is low progress, maybe Chris can be summoned to perform a GIMPS "Christmas Miracle".

[URL="http://www.mersenne.org/assignments/?exp_lo=53000000&exp_hi=54000000&execm=1&exdchk=1&B1=Get+Assignments"]The next xx,000,000 milestone[/URL] after that would likely take until at least the end of March.[/QUOTE]

By the way, I checked the assignments just to make sure, and all 1st time LL exponents below 54M have indeed been assigned, so this report here will in fact show the exponents we're waiting on to get all the way up to "all exponents below 54M have been checked once":
[URL="http://www.mersenne.org/assignments/?exp_lo=2&exp_hi=54000000&execm=1&exdchk=1&exp1=1&extf=1"]Remaining exponents below 54M[/URL]

When it comes to unassigned exponents, we have to look in the 56M+ range before we start finding some that haven't been handed out to anyone. There are 4920 unassigned exponents below M(57885161).

Here's the smallest unassigned 1st time LL check if anyone cares: [URL="http://www.mersenne.org/M56155111"]http://www.mersenne.org/M56155111[/URL]

ATH 2014-11-15 19:06

[QUOTE=Madpoo;387726]For the 4354 LL assignments, 1645 of them have reported some progress... again, not all of it recently. And for some of those, the progress reported so far is still in the TF phase of the test; they haven't even begun the LL test.[/QUOTE]

Even for the grandfathered exponents under, say, 60M or maybe 70M, shouldn't we have some threshold of 90 or 120 days since the last report before they are recycled?

Btw, if you add this milestone to the milestone page, maybe just add the LL and DC? Skip the TF, P-1 and ECM; those are not that interesting.



[QUOTE=Madpoo;387729]When it comes to unassigned exponents, we have to look in the 56M+ range before we start finding some that haven't been handed out to anyone. There are 4920 unassigned exponents below M(57885161).[/QUOTE]

I believe my computers are in Cat 1; at least back in March or earlier I was getting <50M exponents, and now I'm getting 57.5M exponents.

Prime95 2014-11-15 19:41

[QUOTE=chalsall;387721]Objections? I could process all three in just over three days on machines which have 100% success rate DC'ing.[/QUOTE]

If the exponent comes up for recycling while you are processing it, then it will be reassigned to someone else -- resulting in a triple-check.

Madpoo 2014-11-15 19:44

New milestone added
 
I just added a new entry to the milestone page... a "countdown" to the 1st time LL checks on exponents below 56M

It wasn't worth adding milestones for the 52M/53M/54M/55M milestones because there really aren't that many... just a few in each of those ranges from those grandfathered slowpokes.

Enjoy! Or, as George might say, "poaching season is now open!" :smile: (of course anyone could have run that report on the assignment page at any time)

Prime95 2014-11-15 19:48

[QUOTE=Madpoo;387726]
A breakdown of those 1st time LL with no progress at all:
[CODE]Exponent range   Count
< 100M           145
100M - 200M      100
200M - 300M      101
300M+            2402
[/CODE][/QUOTE]


I believe the server does not recycle exponents above 100M.

Prime95 2014-11-15 19:51

[QUOTE=ATH;387730]I believe my computers are in Cat 1; at least back in March or earlier I was getting <50M exponents, and now I'm getting 57.5M exponents.[/QUOTE]

57.5M means you are in Cat 2

chalsall 2014-11-15 21:56

[QUOTE=Prime95;387738]57.5M means you are in Cat 2[/QUOTE]

Just for the record, to paraphrase Paul Hogan, "The Shrimp are on the barbie".

To quote the amazing Robin Williams, "Pack a lunch; stay for the day".

Luis 2014-11-16 08:19

[QUOTE=Madpoo;387726]Expected completion of April 1, 2078. Is this just a really, super early April Fools Day joke? :smile: But hey, it's 1.2% done despite being assigned nearly a year ago... and whoever this is assigned to (an anonymous user) has been checking in regularly... just earlier today in fact.[/QUOTE]
I did something like this a couple of years ago. I had an AMD Turion @ 2000MHz and a P3 @ 533MHz, so I started some LLs on the P3 while the Turion was testing other exponents. Then I passed the P3's exponents to the Turion and completed them there. I don't know if my "technique" was correct (and I'd like to know if it was), but it allowed me to use an old CPU for the latest LL tests instead of TF'ing. Anyway, I had no internet on the old PC; otherwise you could have seen strange expected completion dates too, especially if I had used a CPU even slower than the P3.

garo 2014-11-16 12:52

Here is an idea: if you are really interested in monitoring the stragglers, why not create an SQL table that records every check-in with % complete? Graph that like GP2 used to do a decade ago and you'll know what is abandoned.

Primenet is supposed to release exponents that have not reported for 60 days, but that doesn't always work.
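A minimal sketch of what I mean, in Python for illustration; all names and numbers here are made up, the real check-in history lives in the Primenet database:

```python
from datetime import date

# Hypothetical check-in log per exponent: (date, percent_complete) pairs.
checkins = {
    "M53042401": [(date(2014, 9, 1), 60.0), (date(2014, 11, 18), 71.0)],
    "M60721279": [(date(2014, 4, 10), 3.1)],
}

def looks_abandoned(log, today, stale_days=60):
    """Flag an assignment whose most recent check-in is older than the cutoff."""
    last_checkin, _ = max(log)  # tuples sort by date first
    return (today - last_checkin).days > stale_days

today = date(2014, 11, 20)
flagged = [exp for exp, log in checkins.items() if looks_abandoned(log, today)]
print(flagged)  # ['M60721279']
```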

TheMawn 2014-11-16 16:19

What happened to LLs which were assigned before GPUs came along and changed the optimal factoring levels?

Madpoo 2014-11-16 20:22

[QUOTE=TheMawn;387789]What happened to LLs which were assigned before GPUs came along and changed the optimal factoring levels?[/QUOTE]

As I understand it, if an exponent is assigned for LL, it won't be re-assigned for TF even if the optimal TF depth has changed since that original assignment.

That's not to say that some folks haven't (apparently) gone through and found some of those and done some extra TF work anyway... There are active LL assignments where a factor was reported in by someone else in the interim.

Technically it looks like the server marks an LL assignment as "expired" when someone else reports a factor for it, but the client doesn't know that and will keep plugging away. I don't know, maybe there's something in the process that alerts a client that a factor was found the next time it checks in, so it won't waste its time on the LL check... I would hope so, anyway.

I'm not sure though... for example, there are 59 LL (first or DC) assignments in the database where the client last checked in *after* the assignment expired, and a factor has been found.

Consider this example:
[URL="http://www.mersenne.org/report_exponent/?exp_lo=60721279&exp_hi=&full=1"]http://www.mersenne.org/report_exponent/?exp_lo=60721279&exp_hi=&full=1[/URL]

A factor was found on July 14, and the original LL assignment made in January was expired, but the client continues to check in, working on it (it's 9.1% done as of today, a few hours ago when it last reported in).

Now, if you go to the active assignments page, you won't see it, because that LL assignment is indeed marked as expired, but like I mentioned, the client apparently doesn't know that:
[URL="http://www.mersenne.org/assignments/?exp_lo=60721279"]http://www.mersenne.org/assignments/?exp_lo=60721279[/URL]

The factor was found in the 73-74 bit range by a user running mfaktc... unassigned (poached?). It had also previously been TF'd from 70 to 73 bits by another GPU user, also unassigned.

Should we name and shame the TF "poachers"? LOL

(PS - apologies in advance to UncWilly... I'm expecting an "off topic" post any moment now on this...) :smile:

Mark Rose 2014-11-16 20:57

[QUOTE=Madpoo;387814]Now, if you go to the active assignments page, you won't see it, because that LL assignment is indeed marked as expired, but like I mentioned, the client apparently doesn't know that:
[URL="http://www.mersenne.org/assignments/?exp_lo=60721279"]http://www.mersenne.org/assignments/?exp_lo=60721279[/URL]

The factor was found in the 73-74 bit range by a user running mfaktc...unassigned (poached?). It had also been previously TF'd from 70-73 bit depth by another GPU user, unassigned.

Should we name and shame the TF "poachers"? LOL[/QUOTE]

The user in question is a new and [url=http://www.gpu72.com/reports/worker/4baa87071ecc32e0e91a44923d4abae0/]heavy throughput[/url] user at GPU72. I don't think it was a case of intentional poaching.

srow7 2014-11-16 21:37

[QUOTE]The user in question is a new and heavy throughput user at GPU72. I don't think it was a case of intentional poaching.[/QUOTE]
I did not poach.
I work only on what misfit/gpu72.com gives me.
My setup is 100% automated.

chalsall 2014-11-16 21:58

[QUOTE=Madpoo;387814]Should we name and shame the TF "poachers"? LOL[/QUOTE]

I know this was meant in good humour, but just for the record... 60721279 was legitimately owned by GPU72, and given to srow7 for TF'ing. srow7 didn't poach, and GPU72 didn't make a mistake.

The records shown by Primenet on that particular report don't indicate that Qwerty's assignment of 60721279, the LL assignment which timed out, was converted into a DC assignment.

It is the latter (DC) assignment which fully expired after srow7 found the factor. Unfortunately Prime95/mprime don't appear to understand "bad assignment key" messages once the processing of the assignment is underway.

[CODE]60721279 LL 0 54 2014-02-28 2014-01-06 2014-01-05 2014-01-05 qwertyfly QwertyBox
60721279 LL LL, 3.10% 13 44 2014-03-03 2014-01-09 2014-01-08 2014-01-05 qwertyfly QwertyBox
60721279 LL LL, 3.10% 56 1 2014-03-03 2014-01-09 2014-01-08 2014-01-05 qwertyfly QwertyBox
60721279 LL 2 2014-08-26 2014-03-09 2014-03-09 wabbit Manual testing
60721279 LL 69 2014-08-26 2014-03-09 2014-03-09 wabbit Manual testing
60721279 LL 2 2014-11-09 2014-05-23 2014-05-23 wabbit Manual testing
60721279 LL 40 2014-11-09 2014-05-23 2014-05-23 wabbit Manual testing[/CODE]

Madpoo 2014-11-16 22:18

[QUOTE=srow7;387825]I did not poach
I work only on what misfit/gpu72.com gives me
my setup is 100% automated.[/QUOTE]

Sorry, no offense meant. I realized later that the work being done was probably automatically generated from some list or other, maybe GPU72 handing them out.

kladner 2014-11-16 22:45

[QUOTE=Mark Rose;387823]The user in question is a new and [URL="http://www.gpu72.com/reports/worker/4baa87071ecc32e0e91a44923d4abae0/"]heavy throughput[/URL] user at GPU72.....[/QUOTE]

Wow! No kidding! ~1600 GHzD/D is nothing to sneeze at.

Thanks, srow7, for bringing that kind of power to the project. :tu:

Madpoo 2014-11-16 22:55

[QUOTE=chalsall;387827]I know this was meant in good humour, but just for the record... 60721279 was legitimately owned by GPU72, and given to srow7 for TF'ing. srow7 didn't poach, and GPU72 didn't make a mistake.[/QUOTE]

Correct, it was meant to be a joke... I'm the *last* person who should throw stones... believe me.

I do wonder how GPU72 handed out a TF assignment for an exponent that was assigned to someone else for LL. I'm assuming something didn't go right because I have a feeling GPU72 respects existing assignments... doesn't it use Primenet itself to get the work anyway? So it could have been something on the mersenne.org side that handed that particular one out again for some reason. I don't know...

There are other examples though, so maybe there's some common thread to them all that would explain how it happened and keep it from happening again.

It's kind of nice that factors are found, and if only the client doing the LL test were aware of it, it would *still* save some CPU time, because the LL assignment could stop and move on to another one, albeit at the cost of losing any "credit" for the work done so far.

If there were some consensus that exponents with an active LL assignment shouldn't be handed out to someone else for deeper TF work, then we'd just want to see what it takes to keep that from happening.

I don't know if it matters or not, but the one example I gave was for a "grandfathered" assignment from before the new recycling rules. Might be irrelevant to this, but just in case it matters.

**** Back on topic ****

I noticed there's just one more 1st time check for an exponent below 52M. I guess that means 2 of those last 3 were either done by the original assignee or someone else snagged 'em. Either way, down to 1 more for that mini milestone.

Mark Rose 2014-11-17 01:02

[QUOTE=kladner;387831]Wow! No kidding! ~1600 GHzD/D is nothing to sneeze at.

Thanks, srow7, for bringing that kind of power to the project. :tu:[/QUOTE]

Makes me want to go out and buy another couple GTX 580's and a box to put them in :D

Madpoo 2014-11-17 03:37

[QUOTE=Madpoo;387832]I noticed there's just one more 1st time check for an exponent below 52M. I guess that means 2 of those last 3 were either done by the original assignee or someone else snagged 'em. Either way, down to 1 more for that mini milestone.[/QUOTE]

Oh, you know, I think we were looking at exponents in the 52M-54M range, not just to 53M. So there are still those 3 left.

I think I need a timeout. :) Or just stick my nose back into tweaking some reports or getting XML dumps of the daily results. :smile:

retina 2014-11-17 04:56

[QUOTE=Madpoo;387849]Oh, you know, I think we were looking at exponents in the 52M-54M range, not just to 53M. So there are still those 3 left.[/QUOTE]I was wondering what you were looking at. Those magic mushrooms will do that every time. :cmd:

Primeinator 2014-11-17 05:06

This
[QUOTE]Countdown to double-checking all 2P-1 smaller than 10M digits: 63 (Estimated completion : 2015-02-20)[/QUOTE]

seems to be falling quite rapidly all of a sudden.

Edit: A lot of the ones remaining are by the same person, GrunwalderGIMP, and are all quite a bit overdue even though the assignments are not that old.

chalsall 2014-11-17 13:47

[QUOTE=Madpoo;387832]So it could have been something on the mersenne.org side that handed that particular one out again for some reason. I don't know...[/QUOTE]

I tried to explain what happened.. The assignment was "recycled" by Primenet, and the candidate was then available for LL assignment; but, importantly, the original assignment was still in the Primenet database, but as a DC assignment. GPU72 then (legitimately) picked it up, and gave it to srow7 to TF.

When srow7 reported the factor, Primenet then canceled the DC assignment.

Madpoo 2014-11-17 17:32

[QUOTE=chalsall;387874]I tried to explain what happened.. The assignment was "recycled" by Primenet, and the candidate was then available for LL assignment; but, importantly, the original assignment was still in the Primenet database, but as a DC assignment. GPU72 then (legitimately) picked it up, and gave it to srow7 to TF.

When srow7 reported the factor, Primenet then canceled the DC assignment.[/QUOTE]

Well, I won't pretend I know what happened with any earlier assignments... the Primenet DB doesn't necessarily show all historical assignments. Once an assignment reports in as usual, the assignment is cleared; the only "historical" assignment records are the ones where it expired, for whatever reason, before completion. My only point was that it was assigned to someone for LL work (first-time or DC didn't seem to matter), and during the period when it was assigned and being worked on, a TF result was checked in. The LL assignment was made in January, so it should have been operating under the grandfathered rules, and in fact the server hasn't expired it for any reason other than that a factor was checked in by someone else... it wasn't expired due to the recycling rules.

If you're saying Primenet assigned that TF work out even though there was an active LL/DC assignment, I believe you, I really do. :smile: I just didn't think the server would assign the same exponent twice, even if one was for LL/DC work and one was for TF/ECM/P-1. The *only* time an exponent can be assigned to multiple people is for ECM work, as far as I can tell.

My main point is that it seems counter-productive, no matter how it happened, and we would probably want to make sure those kinds of things don't crop up too often?

In my defense, the server did in fact mark that LL assignment as "poached" using its own internal flagging codes... assignments can be expired for 4 different reasons:
Poached (someone else checked in a result that supersedes this assignment...either another double-check or a factor-found)
No Contact (expired because the client didn't check in in the timeframe required, 60 days, *and* it's 10 days past due)
Assignment Rules (expired due to the recycling rules in effect)
Manually Extended (this is kind of a special case... the assignment was manually extended so it's in a kind of limbo for an extra xx days)
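As a sketch of those four states (the names and numeric values below are my own invention; the real Primenet internal codes are not public and surely differ):

```python
from enum import Enum

# Hypothetical labels for the four expiration reasons described above.
class ExpireReason(Enum):
    POACHED = 1            # someone else checked in a superseding result
    NO_CONTACT = 2         # no check-in within 60 days, and 10+ days past due
    ASSIGNMENT_RULES = 3   # recycled under the category rules in effect
    MANUALLY_EXTENDED = 4  # special case: given extra days, in a kind of limbo

print(ExpireReason.POACHED.name)
```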

The "poached" definition is only saying that Primenet had it assigned to someone, and someone else checked in a result... that's all. I suspect there must have been some confusion going on if Primenet handed that assignment out to someone else while it was still assigned, and it was definitely a first-time LL assignment (nobody else has checked in an LL result for it), so I'm not sure why you say it got converted to a DC at some point? That would only be the case if a first-time LL assignment was out there and someone else checked in a result first. That original LL assignment is "converted" to a DC just by definition, although a peek in the database would still show it was originally assigned as a first-time check.

Sorry to go into such detail... maybe the extra info will be helpful in peeking under the covers at how the machine works. :)

Prime95 2014-11-17 19:13

I think Chris is saying the original LL was recycled. Since the exponent is grandfathered and not in the critical range, I think the only reason it would expire is due to no contact for 60 days. The server would then downgrade the assignment to a DC marked with the internal expired due to "no contact" code.

Chris/GPU72 then grabbed the LL assignment and sent it out for more factoring. When a factor was found, the server then overwrote the internal "no contact" code with the "poached" code. I suppose we could add a new internal code, "expired and then rendered unnecessary", but that seems like a lot of bother to me.

If the above is what actually happened, then everything took place as expected.

chalsall 2014-11-17 19:41

[QUOTE=Prime95;387895]If the above is what actually happened, then everything took place as expected.[/QUOTE]

The above is what happened.

Madpoo 2014-11-17 21:55

[QUOTE=chalsall;387898]The above is what happened.[/QUOTE]

I surrender. :smile: I will freely admit I was wrong to even suspect GPU72 or srow7 of doing anything wrong.

I think this is just one of those crazy cases where an exponent was thought to be abandoned and thus recycled, but then the user started work on it anyway at some later point.

I was curious how often that happens, and it seems to happen with some frequency.

For example, I just queried how many assignments (LL and DC) are out there where:
a) The assignment is expired
b) The user has updated the assignment 30+ days after the expiration
c) That update from the user was at some point after Oct. 1 2014 (this year)

5042 such assignments.
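In code form, the filter was roughly this (a sketch over made-up toy records with made-up field names; the real data lives in the Primenet DB):

```python
from datetime import date, timedelta

# Toy assignment records; every field name and value here is an assumption.
assignments = [
    {"type": "LL", "expired": date(2014, 8, 1), "last_update": date(2014, 10, 15)},
    {"type": "DC", "expired": date(2014, 9, 1), "last_update": date(2014, 9, 20)},
    {"type": "LL", "expired": None, "last_update": date(2014, 11, 1)},
]

stale_but_active = [
    a for a in assignments
    if a["type"] in ("LL", "DC")
    and a["expired"] is not None                                # (a) expired
    and a["last_update"] - a["expired"] >= timedelta(days=30)   # (b) updated 30+ days later
    and a["last_update"] > date(2014, 10, 1)                    # (c) update after Oct 1, 2014
]
print(len(stale_but_active))  # 1
```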

Of those 5042 that are still curiously chugging away on their work, 28 of them have had factors found.

Those 28 are the odd ones: whatever the reason for their original expiration (I assume it was no-contact or the general recycling rules), once a factor was found by a subsequent assignment, like the one we were talking about, that original expiration reason was updated to show it was "poached" (but not really).

Even if I expand my search criteria to exponents that were updated *60* or more days since expiration, to account for machines that are just really out of touch or don't get online that often, there are still 4355 of them and 22 with a known factor.

I guess I shouldn't worry about it, but then I think of all those clients out there chugging away on an LL assignment while, unbeknownst to them, the server already knows the number is composite.

At least with the other ones where no factor has been found, once they do (if ever) report in, the result will still be useful as a first, second or sometimes third LL test.

As I understand it, Primenet will see these "expired" assignments check in, and I think as long as work has already started on it, there's nothing built in to tell the client to stop even if a factor is known.

I'm pretty sure that's by design... I remember back when I started poaching some exponents (long ago, when it was probably a lot more controversial and I was more of a doofus): I would start work on a poached assignment "offline", let it run a few iterations, and then transfer it to an "online" Primenet-connected machine. At that point Primenet would no longer unassign poached exponents and remove them from my worktodo, like it would if work hadn't started on them yet. :smile:

It'll probably happen some more... some of those "active but also expired" exponents in the 40M-50M range have only been TF'd to 69 bits, so I imagine they'll get processed by GPU72, maybe, for extra TF work (sorry, I don't know what the GPU72 TF cutoffs are for different ranges).

And again, apologies to Chris and srow7 ... mea culpa.

chalsall 2014-11-17 22:03

[QUOTE=Madpoo;387914]And again, apologies to Chris and srow7 ... mea culpa.[/QUOTE]

Hey no problem.

We're all rather intense around here; frankly it's nice being able to be intense in a space without immediately being accused of being mean. :smile:

Madpoo 2014-11-17 22:22

[QUOTE=chalsall;387919]Hey no problem.

We're all rather intense around here; frankly it's nice being able to be intense in a space without immediately being accused of being mean. :smile:[/QUOTE]

Whatever, jerkface. :smile: (yes, I had to reach WAY down deep into my inner 7-year old for that gem of an insult... and by the way, I'm rubber and you're glue...)

EDIT: in case anyone is confused, YES, I'm kidding around :)

LaurV 2014-11-18 02:35

[QUOTE=Madpoo;387914]I was wrong to even suspect GPU72 or srow7 of doing anything wrong.[/QUOTE]
This time...
:razz:

ATH 2014-11-18 05:31

[QUOTE=Madpoo;387914]I guess I shouldn't worry about it, but then I think of all those clients out there chugging away on an LL assignment while, unbeknownst to them, the server already knows the number is composite.[/QUOTE]

If they did not report in for 30 or 60 days and then start working (very slowly, I guess) on the exponent, then apparently they do not really care about it. Most of them do not even know they are still running Prime95; the running applications are hidden by that damn little white arrow nowadays, unless you turn it off, which most "standard" users do not know how to do, or even that they need to.

Think of all the performance and RAM lost to all the "shit" every application leaves running in memory, and all the damn browser toolbars. I have been helping so many "standard" users with their computers over the years and have seen it all.

Madpoo 2014-11-18 18:39

[QUOTE=ATH;387958]If they did not report in for 30 or 60 days and then start working (very slowly, I guess) on the exponent, then apparently they do not really care about it. Most of them do not even know they are still running Prime95; the running applications are hidden by that damn little white arrow nowadays, unless you turn it off, which most "standard" users do not know how to do, or even that they need to.[/QUOTE]

Who knows... it's weird though.

I was experimenting yesterday with modifying the assignments page to show expired assignments as well... it's interesting to see just how many there are in certain ranges, but overall not that useful. And I can see in the table there the same thing I was mentioning, where an assignment is expired but the client keeps checking in results.

Oh well, though... like I said, most of that work will still be useful, assuming they eventually finish. It may be reassigned to someone else for a "first time check" and then it's just a race over which one is first and which one is the double-check.

I guess if one of these happened to be the next Mersenne Prime there could be some discussion there... whoever tested it first would get the credit, but the original assignee might be upset. But hey, it is what it is, the rules are what they are.

An example is this one: [URL="http://www.mersenne.org/M53042401"]http://www.mersenne.org/M53042401[/URL]

Assignment expired on Oct. 22 and was reassigned to someone else, but the original assignee has continued to check in results on a daily basis, as recently as this AM. The original user is at 71% done, but the new assignee is already at 55% and will probably finish first.

Don't know what to say, though... that original user got the assignment back in April of 2013, 593 days ago. It could well be argued that if you can't complete an LL assignment in the 53M range in over a year, you're not trying hard enough, and the recycling makes sense. This user was given a year and a half before it was recycled, so I don't feel bad for them. At best, the assignment gets a double-check farther ahead of the current DC assignments. At worst (for that user) it's the next prime and they missed their shot at glory. :)

Prime95 2014-11-18 18:48

[QUOTE=Madpoo;387997]
I was experimenting yesterday with modifying the assignments page to show expired assignments as well... it's interesting to see just how many there are in certain ranges, but overall not that useful. [/QUOTE]

The database has only kept the expired-assignment info since the new assignment rules went into effect 9 months ago. I would expect to see most of the "expired but still checking in" exponents in the Cat 1 DC and Cat 1 LL areas, i.e. the low 30Ms and low 50Ms. But as you pointed out, they had well over a year to reach a result. I'd bet most of these people don't even know Prime95 is running on their computer.

Luis 2014-11-18 21:28

[QUOTE=Madpoo;387997]I guess if one of these happened to be the next Mersenne Prime there could be some discussion there... whoever tested it first would get the credit, but [B]the original assignee might be upset[/B]. But hey, it is what it is, the rules are what they are.[/QUOTE]
Speaking of upset, I'm the one who will never manually unreserve an exponent. :smile:


What's the probability that there is a Mersenne prime between 2^(51,907,363)-1 and 2^(57,885,161)-2 (I'm excluding the current M48)? Looking at [URL="http://www.mersenne.org/primes/"]the known Mersenne prime distribution[/URL] it doesn't seem impossible at all.
Poaching would reward $3000 this time. :rolleyes:

TheMawn 2014-11-18 23:04

The distribution through time is kind of silly. The discovery date is irrelevant and almost even detrimental to any (futile as it may be) insight into the likelihood of finding another.

LaurV 2014-11-19 05:58

[QUOTE=Luis;388011]What's the probability that there is a Mersenne prime between 2^(51,907,363)-1 and 2^(57,885,161)-[COLOR=Red]2[/COLOR] (I'm excluding the current M48)?
[/QUOTE]
Huh?? :shock: how did you get that, by excluding some number in the middle of the ocean?

[QUOTE]
Poaching would reward 3000$ this time. :rolleyes:[/QUOTE]
You may be shocked to learn that in this case the money may still go to the original (legal) assignee, or not go anywhere at all. This is primarily to discourage poaching. Read the former discussions around here, and GIMPS' disclaimer about the money (also, no money until the EFF pays its slice, which may take 10 years or so, until a 100M-decimal-digit prime is found). If you are here to make money you will be very disappointed. :smile:

Luis 2014-11-19 14:25

I'm not a poacher, and I have a respectable position in the Top Producers. I think being the discoverer is priceless too. :wink:

About my question: there is no mathematical evidence. I'm just imagining that the 'hole' in the distribution, with many first-time LL tests still to go, could hide a Mersenne prime. Anyway: not impossible != probable.
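For what it's worth, there is a standard back-of-envelope heuristic, the Lenstra-Pomerance-Wagstaff conjecture, which suggests the expected number of Mersenne-prime exponents in (a, b] is roughly (e^gamma / ln 2) * ln(b/a). It's a conjecture, not a theorem, so treat the numbers loosely:

```python
import math

# Heuristic estimate only; not a proof of anything.
a, b = 51907363, 57885161  # exponents of M48 and the current record
euler_gamma = 0.5772156649015329
expected = (math.exp(euler_gamma) / math.log(2)) * math.log(b / a)
p_at_least_one = 1 - math.exp(-expected)  # Poisson approximation
print(f"expected ~{expected:.2f}, P(at least one) ~{p_at_least_one:.2f}")
```

So a bit over a 1-in-4 chance of another prime hiding in that range, if the heuristic is to be believed.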

NBtarheel_33 2014-11-19 15:09

November 19, 2014. All exponents below [B]52[/B] million have been tested at least once.

Madpoo 2014-11-19 18:10

[QUOTE=NBtarheel_33;388052]November 19, 2014. All exponents below [B]52[/B] million have been tested at least once.[/QUOTE]

Cool, we'll have to update that page now. In fact, all under 53M have been checked (those last 3 stragglers).

Umm... it bears mentioning that all 3 of those were poached.

One of them, 51907363, was making steady progress and being updated daily with an ETA of Dec 1. The other 2 in the 52M range had last checked in 6 days ago, and were 81% done, with ETAs of Dec 10 and 11, so they weren't really abandoned either.

None were prime, and when the original assignees check in their results, hopefully the residues match and they'll be good double-checks, but anyway... there it is. :smile:

chalsall 2014-11-19 18:26

[QUOTE=Madpoo;388063]Cool, we'll have to update that page now. All under 53M have been checked in fact (those last 3 stragglers).[/QUOTE]

Yup.

[QUOTE=Madpoo;388063]Umm... it bears mentioning that all 3 of those were poached.[/QUOTE]

Yup. By me. Personally. As previously announced and then (sorta) generally agreed upon.

[QUOTE=Madpoo;388063]One of them, 51907363, was making steady progress and being updated daily with an ETA of Dec 1. The other 2 in the 52M range had last checked in 6 days ago, and were 81% done, with ETAs of Dec 10 and 11, so they weren't really abandoned either.[/QUOTE]

Cool.

Then they'll get the appropriate credit for the DC (or, maybe, the TC) residue they finally submit in a few years (unless, of course, a factor is found).

lycorn 2014-11-19 19:34

Although I've never been affected by poaching (nor have I ever done it), it's something that really kind of bugs me.
I appreciate that the rules had to be changed, because as they stood, many exponents were just "begging to be poached".
Seeing the progress of milestones systematically blocked by stragglers, many of which should have been released a long time ago, was more than many of us could stand.
But now that the new rules are in place, I no longer see an "excuse" for poaching. What do we gain by clearing a milestone a couple of days or weeks earlier, knowing that it will be cleared sooner rather than later under the new rules? Gone are the times when we could not say if or when they would eventually be cleared.
I was quite displeased by the poaching of these 3 exponents, which were making steady progress and approaching completion at a regular pace. I really don't understand the motivation for doing this in such circumstances, apart from unjustified impatience or some desire to be noticed.
It's because of this kind of thing that I still advocate that poached results should simply be refused by the server.

Luis 2014-11-19 19:55

[QUOTE=TheMawn;388016]The distribution through time is kind of silly. The discovery date is irrelevant and almost even detrimental to any (futile as it may be) insight into the likelihood of finding another.[/QUOTE]
Noooo! I didn't mean the distribution through time. I meant the distance between the exponents, if I may call it that. Between M38 and M39 it's 6,494,324, between M39 and M40 it's 7,529,094, then 8 Mersenne primes with an average distance of 3,159,514, and then between M47 and M48 it's 14,772,552! Suspicious, but nothing more than a (non-mathematical) observation; maybe stupid, but I'm just curious.
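Luis's distances can be checked directly against the published exponent list; a quick sketch (exponent values are the known Mersenne primes M38 through M48, ordered by size as of 2014 — the "8 Mersenne primes" are M40..M47, which span 7 consecutive gaps):

```python
# Exponents of the known Mersenne primes M38..M48 (ordered by size, as of 2014).
exponents = [6972593, 13466917, 20996011, 24036583, 25964951, 30402457,
             32582657, 37156667, 42643801, 43112609, 57885161]

gaps = [b - a for a, b in zip(exponents, exponents[1:])]
print(gaps[0])              # M38 -> M39: 6494324
print(gaps[1])              # M39 -> M40: 7529094
print(gaps[-1])             # M47 -> M48: 14772552
# The 8 primes M40..M47 give 7 consecutive gaps:
print(sum(gaps[2:9]) // 7)  # average: 3159514
```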

chalsall 2014-11-19 20:02

[QUOTE=lycorn;388070]But now that the new rules are in place, I no longer see an "excuse" for poaching. What can we gain from clearing a milestone a couple of days/weeks earlier, knowing that it will be cleared sooner rather than later under the new rules? Gone are the times when we could not say if/when they would eventually be cleared.

I was quite displeased by the poaching of these 3 exponents, which were making steady progress and approaching completion at a regular pace. I really don't understand the motivation for doing this in such circumstances, apart from an unjustified impatience, or some desire to be noticed.[/QUOTE]

As the "poacher", please let me defend myself...

1. From the "instantaneous" view from Primenet, these three candidates /appeared/ to be making progress.

2. From a more temporally spread view of the same report (which I have access to because of my spiders), it was clear that these three candidates would take /much/ more time to actually complete than allowed under the current (implemented) Primenet recycling rules.

3. So, then, I gave notice of my intent to poach, waited for a strong objection, and then loaded them up.

[QUOTE=lycorn;388070]That´s due to this kind of things that I still advocate that poached results should be simply refused by the server.[/QUOTE]

Personally, I'd be very happy with that. I said before that if the straggler turns out to be a MP, the "poached" should be credited.

Edit: Sorry, I misread you. I wouldn't be happy with the result being rejected. I'd be happy with the "poached" being given the credit for the work, even if a MP.

Madpoo 2014-11-19 20:34

[QUOTE=chalsall;388064]
Then they'll get the appropriate credit for the DC (or, maybe, the TC) residue they finally submit in a few years (unless, of course, a factor is found).[/QUOTE]

I don't really care, personally.

If you like, you could check the other 324 exponents that are below 56M for single-checks. :) Well, except 6 of those which, by some odd luck, are assigned to me. And yes, they're being worked on at a good pace. :smile:

manfred4 2014-11-19 21:28

When every assignment made under the old rules has finally expired, nobody will need to poach those small exponents anymore. But until then, please don't poach the current smallest; I got that one unintentionally when it was released 10 days ago :)
The current lowest number will therefore rise again soon...

chalsall 2014-11-19 21:43

[QUOTE=manfred4;388079]But until then please don't poach the current smallest, without intention I got that one when it was released back 10 days ago :)[/QUOTE]

Are you "NR" by any chance? If so, 53014301 is still yours. I have no intention to "eat it, thinking it were a carrot...".

I also don't generally agree with "poaching", and take such actions carefully, and when possible with general consensus.

Primeinator 2014-11-19 23:01

[QUOTE=chalsall;388080] I have no intention to "eat it, thinking it were a carrot...".

[/QUOTE]

What if it came with red pepper roasted hummus?

Brian-E 2014-11-19 23:42

[QUOTE=chalsall;388074]As the "poacher", please let me defend myself...

1. From the "instantaneous" view from Primenet, these three candidates /appeared/ to be making progress.

2. From a more temporally spread view of the same report (which I have access to because of my spiders), it was clear that these three candidates would take /much/ more time to actually complete than allowed under the current (implemented) Primenet recycling rules.

3. So, then, I gave notice of my intent to poach, waited for a strong objection, and then loaded them up.



Personally, I'd be very happy with that. I said before that if the straggler turns out to be a MP, the "poached" should be credited.

Edit: Sorry, I misread you. I wouldn't be happy with the result being rejected. I'd be happy with the "poached" being given the credit for the work, even if a MP.[/QUOTE]
As regards your step number 3, there will be various reasons why those of us who strongly object did not actually speak up when you wrote that. In my case it boils down to a lack of self-confidence related to the fact that I am an extremely tiny contributor to the project, plus a reluctance to spark another tedious argument rehashing lines of discussion which have been repeated ad nauseam.

My favoured approach to your poaching of the last three assignments would have been for PrimeNet to have placed your results on standby without showing them under database enquiries, to have waited for the assignees to finish the work or for the assignments to have expired, and only then to have released your results for general viewing (as DC results in the case of the original assignees finishing the work).

[QUOTE=chalsall;388080][...]I also don't generally agree with "poaching" and take such actions carefully, and when possible with general concensus.[/QUOTE]
General consensus, unfortunately, is never possible.

chalsall 2014-11-20 00:15

[QUOTE=Brian-E;388091]General consensus, unfortunately, is never possible.[/QUOTE]

Agreed.

This is why some take actions while being comfortable within their own skin.

ATH 2014-11-20 05:15

[QUOTE=Madpoo;388063]Umm... it bears mentioning that all 3 of those were poached.

One of them, 51907363, was making steady progress and being updated daily with an ETA of Dec 1. The other 2 in the 52M range had last checked in 6 days ago, and were 81% done, with ETAs of Dec 10 and 11, so they weren't really abandoned either.
[/QUOTE]

[QUOTE=lycorn;388070]I was quite displeased by the poaching of these 3 exponents, which were making steady progress and approaching completion at a regular pace. I really don't understand the motivation for doing this in such circumstances, apart from an unjustified impatience, or some desire to be noticed.[/QUOTE]

Did you both miss the discussion we had 5-7 days ago in this thread about those 3 exponents?

I do not understand why Primenet and you would think that an exponent which has taken 576 days to reach 81% (0.140625% per day) would suddenly complete the last 19% in just ~22 days (0.863636% per day). Those ETAs make no sense. George posted the SQL code for recycling, and 51907363 was due for recycling Nov 19th or 20th, with the 2 others around ~Dec 9th. Also, Chris had his own statistics showing that all 3 would not be complete for ages:
[URL="http://www.mersenneforum.org/showpost.php?p=387721&postcount=1494"]http://www.mersenneforum.org/showpost.php?p=387721&postcount=1494[/URL]

lycorn 2014-11-20 07:50

I missed those details, indeed.
Although I appreciate that the matter was pondered, so it wasn't just an "impulse", I still advocate that we should abide by the rules we agreed on as a community regarding the recycling of exponents, even though sometimes it really seems like poaching does no harm.
That said, I owe chalsall an apology: in light of the post quoted by ATH, I acknowledge that "[I]an unjustified impatience, or some desire of being noticed[/I]" was a bit too strong. No hard feelings?

ATH 2014-11-20 09:23

Can the ETA calculation for those very slow exponents be improved? Maybe based on the average speed during the last, say, 3 reports to the server:

ETA date = date of last report to server + (100% - last report progress)/("amount of progress since 3rd-last report"/days since 3rd last report)



For example, 52957519 and 52983583 had, according to Chris, ~10% progress since 2014-05-17, so let's say that was their 3rd-last report, and the last report was roughly Nov 13th at 81%:

ETA date = 2014-11-13 + (100%-81%) / (10%/180 days) = 2015 Oct 21st.
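ATH's proposed formula can be sketched as a small function (names are illustrative, not anything PrimeNet actually implements):

```python
from datetime import date, timedelta

def eta(last_report_date, last_progress, third_last_progress, days_between):
    """ETA based on the average rate over the last three server reports.

    Rate = progress gained between the 3rd-last and last report,
    divided by the days elapsed between those reports.
    """
    rate = (last_progress - third_last_progress) / days_between  # % per day
    remaining_days = (100.0 - last_progress) / rate
    return last_report_date + timedelta(days=remaining_days)

# ATH's worked example: last report 2014-11-13 at 81%, with ~10%
# gained over the preceding 180 days -> 19% left at 10/180 % per day.
print(eta(date(2014, 11, 13), 81.0, 71.0, 180))  # 2015-10-21
```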

chalsall 2014-11-20 13:31

[QUOTE=lycorn;388107]No hard feelings?[/QUOTE]

None at all; no problem. :smile:

Madpoo 2014-11-21 05:27

[QUOTE=ATH;388101]I do not understand why Primenet and you would think that an exponent which has taken 576 days to reach 81% (0.140625% per day) would suddenly complete the last 19% in just ~22 days (0.863636% per day). Those ETAs make no sense.[/QUOTE]

They may have been assigned, but behind some other work on the same machine. I forget what the default and max settings are for how much work can be queued up in the worktodo file.

George had also mentioned that there is a limit on the rolling average of "500" (50% of the base 1000) or whatever... well, I may have mangled that, but I understood that essentially there's a low end value on that where it assumes a machine will at least be on 12 hours a day running at peak efficiency. If a machine is only turned on and running P95 maybe 2-4 hours a day, its estimates are going to be wildly off because it's still assuming 12 hours of work per day.

Or maybe some folks could be running P95 on a heavily burdened system, and P95 is only getting minimal CPU cycles, so the same lower threshold being reached and breached is resulting in those inaccurate estimates.

The end result, I guess, is that each time it reports in (which is daily in many cases), it reports very small progress from the day before, and keeps moving the ETA further and further out. At first glance it would seem weird and suspicious, like... "what is that person trying to pull?" but I think it's nothing more than a machine running at < 50% efficiency... sometimes probably by quite a bit.

A machine whose rolling average is more like 25% is going to take twice as long as its original estimate... and since the "days of work to queue" logic also bases its decision about returning work that is out too far on that rolling average, it could very well result in more work being queued up than there really ought to be.
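The scaling described here is just a ratio: if the estimate assumes some minimum duty cycle but the machine actually runs at less, the real completion time stretches by the ratio of assumed to actual. A minimal sketch, assuming a 50% duty-cycle floor as discussed above (the function name is illustrative):

```python
def actual_days(estimated_days, assumed_duty, actual_duty):
    """Real wall-clock days when the estimate assumed a higher duty cycle."""
    return estimated_days * assumed_duty / actual_duty

# An estimate of 100 days, made assuming a 50% duty-cycle floor:
print(actual_days(100, 0.50, 0.25))   # 200.0 -- a 25% machine takes twice as long
print(actual_days(100, 0.50, 0.125))  # 400.0 -- at 12.5%, four times as long
```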

So... that's my speculation on how/why it happens.

The grandfathered exponents will run their course before too long though. It's more pronounced with those assignments made prior to 2014-03-01. I think the assignments made after are less tolerant of that kind of accidental tomfoolery. :smile: The lack of progress after so many days from being assigned will recycle it MUCH sooner. Clients that don't pay attention could still start working on it at some point, but that's kind of their problem, ya know? Or maybe the client will see an expired assignment when it checks in, and if work hasn't begun it gets removed automatically?

Madpoo 2014-11-21 06:12

Reddit on the M44 announcement
 
I ran across this link when looking at traffic to the site recently:
[URL="http://www.reddit.com/r/math/comments/2lumlm/m32582657_confirmed_to_be_the_44th_mersenne_prime/"]http://www.reddit.com/r/math/comments/2lumlm/m32582657_confirmed_to_be_the_44th_mersenne_prime/[/URL]

I thought it was mildly interesting (and was a little amused at some of the misunderstandings about what the announcement was really saying).

tha 2014-11-21 10:07

[QUOTE=chalsall;388074]As the "poacher", please let me defend myself...

2. From a more temporally spread view of the same report (which I have access to because of my spiders), it was clear that these three candidates would take /much/ more time to actually complete than allowed under the current (implemented) Primenet recycling rules.
[/QUOTE]

I believe Chris is treading carefully, but I just couldn't help thinking about this joke:
[URL="https://m.youtube.com/watch?v=FIrYci5TZiU"]Ronald Reagan on the Soviet Union[/URL]


All times are UTC. The time now is 21:12.

Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.