[QUOTE=tha;388153]... but I just couldn't help thinking about this joke:
[URL="https://m.youtube.com/watch?v=FIrYci5TZiU"]Ronald Reagan on the Soviet Union[/URL][/QUOTE] That was a joke? |
[QUOTE=chalsall;388196]That was a joke?[/QUOTE]
Let us hope so, else he shot his friend for real. In these days of Putin we may recycle these jokes, but he doesn't seem like the guy with the best sense of humor. |
A soonish milestone:
[code]Countdown to testing all exponents below M(57885161) once:[I] 5,624[/I][/code]Is it likely a new MP will be found before this one is "one-time" proved? |
[QUOTE=davar55;388236]Is it likely a new MP will be found before this one is "one-time" proved?[/QUOTE]
I would say that the answer is yes. Based on our past history, we are past the average time to discover the next MP: the average gap is 455 days, and we are now at 666 days since the last one. If you take our average gap and add the average deviation from it (not the standard deviation), it is less than 3 months away. It will be interesting to see what effect George's change to the assignment rules has on the progress of the wavefront. |
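The "average gap plus average deviation (not the standard deviation)" estimate can be sketched as below; the gap values here are hypothetical placeholders, not the actual GIMPS discovery history:

```python
def predicted_window(gaps):
    """Mean gap plus mean absolute deviation (not the standard
    deviation), as in the back-of-the-envelope estimate above."""
    mean = sum(gaps) / len(gaps)
    mad = sum(abs(g - mean) for g in gaps) / len(gaps)
    return mean + mad

# Hypothetical gaps (in days) between successive discoveries:
gaps = [300, 400, 500, 600]
print(predicted_window(gaps))  # mean 450 + MAD 100 = 550
```

With the thread's figures (455-day average gap), the same arithmetic puts the "expected by" point within a few months of the 666 days already elapsed.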
[QUOTE=davar55;388236]A soonish milestone:
[code]Countdown to testing all exponents below M(57885161) once:[I] 5,624[/I][/code]Is it likely a new MP will be found before this one is "one-time" proved?[/QUOTE] There's even some chance the next one found will be smaller than that one. :) Double-checking is still years away from proving M48 and not all the smaller exponents have even been first-time checked. What's more, of the ones that *have* been first-time checked, I just looked and 82 of those results were marked "suspect" due to some error codes during the LL check. 21 of those 82 'suspect' results are assigned to someone else currently, but that's another 61 exponents where the first check was suspect, and *probably* bad (they're all in the 56M-57M range). So, hey... who knows. :smile: Maybe the dry spell is because wherever that little gem is, it's still hiding because it got passed over during some bad run, and it's waiting for double/triple checking. I think George might have it set up to hand out exponents again as a "new" first-time check if the original run was suspect... I don't know how else to explain the 21 of those that are already re-assigned. Which makes sense... I'd hate to wait around for double-checking to realize that one of those was actually prime, which could be years. |
[QUOTE=davar55;388236]Is it likely a new MP will be found before this one is "one-time" proved?[/QUOTE]
I would say probably not. Davieddy was good at bringing us monthly updates on the expected time to the next prime. Back then, the estimate was consistently in the 4-year area. That means at our current rate we expect to find one prime over the next four years. Could be more primes, but could be fewer. Note that the fact that we haven't found one in the last year and a half is irrelevant: we still expect 1 in the next four years. If we don't find a prime before 2018, we will still expect one prime in the following 4 years (2019-2022). All the above assumes computer power and the number of users keep up with the demands (and lower probabilities) of larger exponents. |
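The "time already waited is irrelevant" point is the memorylessness of the Poisson model. A minimal sketch, assuming a rate of exactly 1 prime per 4 years (the figure quoted above):

```python
import math

RATE = 1 / (4 * 365)  # assumed: 1 discovery per 4 years, in primes/day

def expected_in_next(days, rate=RATE):
    """Expected number of discoveries in the next `days` days under a
    Poisson process. Note it does not depend on how long we've already
    waited -- that is the memorylessness the post describes."""
    return rate * days

def p_none_in_next(days, rate=RATE):
    """Probability of zero discoveries in the next `days` days."""
    return math.exp(-rate * days)

print(expected_in_next(4 * 365))          # 1.0, whether we've waited 0 or 666 days
print(round(p_none_in_next(4 * 365), 3))  # exp(-1), about 0.368
```

So even with one expected prime per 4-year window, there is still a ~37% chance any given 4-year window comes up empty.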
Is there any way to immediately send out work that has been returned with an error code?
|
[QUOTE=Xyzzy;388252]Is there any way to immediately send out work that has been returned with an error code?[/QUOTE]
The server already does that, as far as I understand it. I have three cores working on CAT 1 double checks. Quite a few of the assignments are for triple checks even though the error codes are 0. |
[QUOTE=Xyzzy;388252]Is there any way to immediately send out work that has been returned with an error code?[/QUOTE]
My gut tells me it handles that by basically ignoring a result, for assignment purposes, if its error code isn't zero. If the next check-in happens to match residues, then it's a verified LL test, no harm no foul. More often than not, though, the one with errors ends up being the loser in a triple check. The reason my gut tells me that is that exponents seem to be reassigned for LL or DC work well ahead of when they would be if that original "suspect" result were counted as just a regular result. Also, a couple of weeks back I saw a handful of old false positives... just to be sure I re-ran them all. George will do some manual interventions with suspicious-seeming results (especially false positives), so we can rest assured, he's on the case. Just in case, I found 3 more of those false positives earlier today when looking at some #'s, so I'm re-running those just to cross the t's and dot the i's. They're older ones too and I'm sure George already checked into them, but I didn't see any verification results. Call it a mini case of OCD on my part that says "you know, let's just run those again, in case". :smile: |
[QUOTE]Countdown to double-checking all 2^P-1 smaller than 10M digits: 49[/QUOTE]
Under 50 to go! |
[QUOTE=cuBerBruce;388523]Under 50 to go![/QUOTE]
And about 3 months to go. Have patience grasshopper. |
[QUOTE=Madpoo;388257]...I found 3 more of those false positives earlier today when looking at some #'s so I'm re-running those just to cross the t's and dot the i's. They're older ones too and I'm sure George already checked into them but I didn't see any verification results..[/QUOTE]
Finished checking all 3... none were prime (of course), but now I feel better. :smile: |
So it appears that the most impatient one is "For Research" currently clearing all the low exponents.
It is also of note that GrunwalderGIMP, who had many of their exponents "poached", eventually did check in good results a few days later, which are listed as triple checks. So all those impatient people are doing a lot of duplicate work. Cheesehead's 100.1 mph is now more like 99.9 mph. |
Yes, I think that listing these more minor steps on the Milestones page does motivate certain poachers. I would prefer to either not list them at all, or else in a less prominent place. Maybe after 10M double-checks are done and all exponents below 56M are first-time checked would be a good time to drop it. Sure, I expect a few poachers when we get close to first time checks below M57885161 being done, but at least that sort of a milestone doesn't come up very often.
|
You fail to mention whether GrunwalderGIMP submitted these assignments within the year he was generously given. Chris went to great lengths to first check that there was only a slight chance the assignments would be finished on time. You can't possibly deny the fact that 5% progress in 11 months (or whatever it was) looks less-than-promising.
An even bigger failure is your lack of protest when he publicly declared that he was going to poach these specific ones because they met his criteria for marooned assignments (EDIT: I should emphasize he did ask for objections). Hindsight is 20/20, but what about your foresight? Clearly not great, as you would have said something otherwise, one would hope. For each assignment that would have finished, there are many more which would still be sitting at less than 10%. Besides, for anyone actually following the fairly lengthy discussions we've had on trying to increase productivity at the low end, it should be clear that this won't be an issue much longer. |
[QUOTE=TheMawn;390241]You fail to mention if GrunwalderGIMP submitted these assignments within the year he was generously given. [/QUOTE]
These were new assignments that were recycled; they failed to communicate with the server for 60 days. The ForResearch account doesn't poach anything or the relevant exponents would not show up as assigned. |
[QUOTE=philmoore;390235]Yes, I think that listing these more minor steps on the Milestones page does motivate certain poachers. I would prefer to either not list them at all, or else in a less prominent place. Maybe after 10M double-checks are done and all exponents below 56M are first-time checked would be a good time to drop it. Sure, I expect a few poachers when we get close to first time checks below M57885161 being done, but at least that sort of a milestone doesn't come up very often.[/QUOTE]
Maybe when the 10M double-checks are done, I won't bother putting up another double-check milestone unless George sees some value. For the first time LL checks, I'm not personally concerned about poaching... if the original assignment eventually checks in, then it's still a good double-check and that's cool too. I guess there's still that one in a bazigagillion chance that someone poaches an assignment and it turns out to be prime... the person who poached it would rightly get the credit for discovery, I guess, but the original assignee, assuming they were actively working on it and it wasn't just abandoned, could be miffed about that. I'd compare it to some supermarket chain giving a prize to the 1 millionth customer. You're in line with your stuff and some guy cuts in front of you and he wins. Yeah, you'd be miffed. It'd be different if you were in line and then left to go do whatever else, and the guy who took your place in line won. You'd probably still be miffed but it's different since it was your fault for leaving. :smile: Anyway, long story short, I won't plan on adding another double-check milestone after this one. :) |
[QUOTE=Madpoo;390350][...]Anyway, long story short, I won't plan on adding another double-check milestone after this one. :)[/QUOTE]
Your distinction between poaching of first time tests and poaching of double-checks is valid from the point of view of wasted computation (provided that only one impatient person poaches the first-time test!), but it is by no means always true that someone who has their first time test poached and has the status of their work changed to a double check will feel fine with that. As a tiny contributor with a single-core, very slow, and part-time running machine, I have done just four first-time tests in my time here, and they each took about six months to complete. All four gave me the satisfaction of having my machine perform a calculation for the project which had never been done before. I would have been very annoyed and quite upset if they had been poached. That could easily have demotivated me to the extent that I stopped participating. Do you and George see any possible merit in an idea suggested a while back that poached results should be stored on PrimeNet when they are turned in but not made public until the assignee completes the test or the assignment expires, and in the case of first time tests the original assignee being recorded as the first-time tester and the poacher as the double checker? Milestones, too, could be published as if the poached results had not been received (again until the assignment gets completed or expires). I think this might demotivate poachers after a while when they see that their impatient stealing of other people's work has no effect on the milestones. |
<5,000
Countdown to testing all exponents below M(57885161) once: 4,994
|
I still defend the idea that poached results should be simply refused by the server, with an error code stating something like: "Number currently assigned to other user. Result not accepted", or whatever is seen fit by the project leaders. There should be a warning somewhere in the Primenet pages about this, to deter people from crunching exponents assigned to others.
My point is that now we have an agreed (and implemented) criterion for recycling exponents, so if we have recognized its merit, we should now stick to it. This includes not accepting the impatience of some members, as the duration of the assignments is now perfectly established and limited. Having a result refused is in my opinion much more demotivating for a poacher than having it changed from 1st time to DC (or even from DC to TC). I understand GW's idea of accepting any result that may be of value for the project, but let's not forget that, in the long run, accepting poached results may do more harm than good, by driving away some regular participants. My 2 cents... :smile: |
[QUOTE=lycorn;390371]My point is that now we have an agreed (and implemented) criterion for recycling exponents, so if we have recognized its merit, we should now stick to it. This includes not accepting the impatience of some members, as the duration of the assignments is now perfectly established and limited.[/QUOTE]
+1 |
The fact remains that any assignment that is poached in order to quickly meet a guideline is shortly going to be an assignment that has been handed to the fastest computers and will be finished in 60 days.
Once that happens, a lot of the poaching will disappear. The cases where our own "trusted" users poach will disappear entirely because, for one, it will be well known that progress is expected to be swift, and for two, simple inspection of the progression will reveal that the assignments are well on course to finish. With that being said, I like the idea of the poached results being held in limbo until the due date is up and then processed. If the poached assignment was handed in normally by its rightful owner, then the poacher is credited as a double or triple check accordingly. If the assignment is NOT handed in on time, the poacher is credited with the first or double check accordingly. Straight-up denying the result seems like a bit of a waste, too. Honestly, in the case where someone poaches an assignment and it is not handed in by when [B][I]the user promised they would[/I][/B] then I would personally be quite pleased that someone poached it. With the simple rules I stated above, poaching CANNOT harm anyone. |
[QUOTE=retina;390220]So it appears that the most impatient one is "For Research" currently clearing all the low exponents.[/QUOTE]
If I may, a few facts:
1. I, personally, do occasionally "poach" candidates (as defined by some; read: currently assigned to another).
1.1. I do this very carefully, and only for those candidates which have very little to almost no chance of completing before being recycled, and are very close to being recycled.
2. Because of the "spiders" I have developed over the last two years, I perhaps have better "situational awareness" than most.
2.1. As stated before, I have no privileged access to Primenet.
3. While the "For Research" account sometimes grabs recycled candidates for TF processing by GPU72 participants, GPU72 will never give out a candidate assigned to someone else to a GPU72 worker.
4. I have advocated (many times) before that "poached" work should be credited to the original assignee rather than simply rejected. I, personally, don't care about "credit".
5. In another few months, this issue will go away.
5.1. Until that time, those of us who understand George's SQL, and have some cycles, may take matters into their own hands. Deal with it. |
[QUOTE=petrw1;390363]Countdown to testing all exponents below M(57885161) once: 4,994[/QUOTE]
[QUOTE]First 5000 assignments: Exponents below 58073190[/QUOTE] The Cat1/Cat2 boundary has leaped past M48, the "WR prime". |
[QUOTE=cuBerBruce;390428]The Cat1/Cat2 boundary has leaped past M48, the "WR prime".[/QUOTE]
I think this is sort of exciting. With the new rules kicking in fairly shortly, it means that we will soon have reasonable confidence in stating that M48 is in fact M48 (or M49 if we get lucky before then :razz: maybe two Mersenne Primes generated from two consecutive prime numbers even?) |
[QUOTE=chalsall;390398]5. In another few months, this issue will go away.
[/QUOTE] I think that was pretty much George's stance the last time this was discussed. There's still a gaggle of grandfathered assignments out there that stick out like a turd in a punchbowl in terms of how long they have left, the current progress, how long past due they are, etc. I'm talking about exponents that were assigned in 2013, haven't even been started yet, but are being "checked in" regularly with updated times of completion (which can be a year or more away in some cases). Once those few assignments are out of the way, I think the hunting grounds for poaching will be much smaller and it really shouldn't come up too much. |
56M 100
Countdown to first time checking all exponents below 56M: 100
|
[QUOTE=TheMawn;390429]maybe two Mersenne Primes generated from two consecutive prime numbers even?)[/QUOTE]
The currently known chains of consecutive primes that are all Mersenne prime exponents are: { 2, 3, 5, 7 } { 13, 17, 19 } All adjacent primes to the 48 known Mersenne prime exponents have had the status of the corresponding Mersenne number proven except for 57,885,143 and 57,885,167. Those two have an LL test completed but have not been double-checked. So if there are any more such adjacent primes to be discovered, the currently known Mersenne exponents are probably not among them. |
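The two chains quoted above can be recomputed directly. This is a self-contained sketch using trial division (slow but adequate at these sizes), with the 48 Mersenne prime exponents known at the time listed inline:

```python
def is_prime(n):
    """Trial division; fine for exponents up to ~58M."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def next_prime(n):
    """Smallest prime strictly greater than n."""
    n += 1
    while not is_prime(n):
        n += 1
    return n

# The 48 Mersenne prime exponents known as of this thread:
EXPONENTS = [2, 3, 5, 7, 13, 17, 19, 31, 61, 89, 107, 127, 521, 607, 1279,
             2203, 2281, 3217, 4253, 4423, 9689, 9941, 11213, 19937, 21701,
             23209, 44497, 86243, 110503, 132049, 216091, 756839, 859433,
             1257787, 1398269, 2976221, 3021377, 6972593, 13466917, 20996011,
             24036583, 25964951, 30402457, 32582657, 37156667, 42643801,
             43112609, 57885161]

def chains(exponents):
    """Group the exponents into maximal runs of consecutive primes,
    returning only the runs of length > 1."""
    runs, run = [], [exponents[0]]
    for p in exponents[1:]:
        if next_prime(run[-1]) == p:  # p is the very next prime
            run.append(p)
        else:
            runs.append(run)
            run = [p]
    runs.append(run)
    return [r for r in runs if len(r) > 1]

print(chains(EXPONENTS))  # [[2, 3, 5, 7], [13, 17, 19]]
```

The break after 7 happens because next_prime(7) is 11, which is not a Mersenne prime exponent; likewise 23 breaks the run after 19.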
what would be good for the next milestones?
So, we are getting down to the wire (well, in terms of quantity, not so much date) for the "double check below 10M" and "first time below 56M" or whatever.
Any thoughts on future milestones, and bearing in mind that we may not want to encourage poaching (any more than people might already be doing). Besides just the progression of "we've done exponents up to this amount", are there any other interesting things, like "we've found XX factors" and then we could count down how many until we've factored another 10M or something? Or "we've done xx first time/double-check LL tests"? That way it'd be more about the total work being done, not so much arbitrary ranges and when those are completed. And then if people wanted to pass some milestone or another, it'd involve just doing more work in general (finding more factors, getting more LL tests checked in) and wouldn't tempt people to poach to get it there faster. So that's one idea I had... anything else that sounds fun and would be a neat thing to look at to get some sense of things? GHz-hours done in some period of time, or how many (active) accounts are doing work over the past month, etc. etc... Just some neat stats to look at and ooh/ahh over. |
[QUOTE=Madpoo;390622]Any thoughts on future milestones, and bearing in mind that we may not want to encourage poaching (any more than people might already be doing).[/QUOTE]Having the number of days since the last prime was found would be nice.
Some other ideas:
"All exponents from M48 to xx,xxx,xxx are fully TF'ed to the new standard bit level and to zz,zzz,zzz to the old standard."
"All exponents from M48 to yy,yyy,yyy have been sufficiently P-1'ed."
"LMH-TF is now working at the bb bit level and going through the ccc,000,000 range."
"Average TF bit level for the exponents waiting for LL in the 50,000,000 range is bb.bb"
"Average TF bit level for the exponents waiting for LL in the 60,000,000 range is bb.bb"
"Average TF bit level for the exponents waiting for LL in the 70,000,000 range is bb.bb"
"ZZ.ZZ% of all potential candidates in the 40,000,000 range have already been removed by factoring."
"ZZ.ZZ% of all potential candidates in the 50,000,000 range have already been removed by factoring."
"ZZ.ZZ% of all potential candidates in the 60,000,000 range have already been removed by factoring."
Using the last 3 months of DC throughput (not reported completion dates) and a fudge factor of 30-50%, calculate the "Estimated completion" for proving M45, etc. Make the date vanish if the estimate is less than 1 year off and replace it with "within a year!" |
[QUOTE=petrw1;390463]Countdown to first time checking all exponents below 56M: 100[/QUOTE]
Countdown to double-checking all 2[SUP]P[/SUP]-1 smaller than 10M digits: 10 |
Another suggested milestone:
"Lowest exponent with an incomplete factorization: x,xxx." and/or "Lowest exponent requiring more ECM: x,xxx." |
Now that "For Research" has completed the poaching of the last two 32M exponents the milestones page can be updated to show the date all exponents below 33M were completed.
|
[QUOTE=Uncwilly;390647]Another suggested milestone:
"Lowest exponent with an incomplete factorization: x,xxx." and/or "Lowest exponent requiring more ECM: x,xxx."[/QUOTE] Targeted milestones. Good idea! This would mean someone would always be working on these "next" exponents, covering gaps from the bottom quicker. Yes? |
[QUOTE=retina;390650]Now that "For Research" has completed the poaching of the last two 32M exponents the milestones page can be updated to show the date all exponents below 33M were completed.[/QUOTE]
Thanks: milestone page updated. |
[QUOTE=Uncwilly;390631]Having the number of days since the last prime was found would nice.[/QUOTE]
Here is a "countup": [URL="http://www.timeanddate.com/countdown/to?iso=20130125T173026&p0=405&msg=M57%2C885%2C161"]http://www.timeanddate.com/countdown/to?iso=20130125T173026&p0=405&msg=M57%2C885%2C161[/URL] We just passed 1M minutes = 60M sec today. Coming up on 700 days in 5 days. |
[QUOTE=ATH;390684]Here is a "countup":[/QUOTE]I keep track of it myself, along with various aspects related to gaps etc. I was thinking it would be good for the casual page viewer to see it.
|
[LIST=1][*]Estimated completion dates based on the last {30, 90, 365} days of throughput.[*]The all-time highest 30-day TFLOPS.[*]The highest current "mainstream" (i.e. non-hand-selected) LL/DC assignments.[*]The current bit level and range (e.g. 644M) of LMH assignments. Possibly also add ETC for the bit level.[*]Countdown to the upper end of the classical GIMPS range (79.3M).[*]Countdowns to 100M, 1B.[/LIST]
Depending on how much new information we begin to incorporate, perhaps a new "GIMPS Dashboard" page would be a good idea, so as to avoid taking the Milestones report too far afield from its intended purpose. Another idea: Extend [URL=http://www.mersenne.org/report_classic]this[/URL] report to include higher ranges, e.g. up to 100M, 332M, or even 1B. A button to convert back-and-forth between GHz-days and P90-years would *really* sex things up. |
One more idea: Davieddy's Interval, i.e. the number of days until we expect to find the next Mersenne prime, per Poisson.
|
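A sketch of what such a "Davieddy's Interval" could compute, assuming an exponential waiting time with a 4-year mean between primes (the figure quoted earlier in the thread; both the mean and the function name are assumptions for illustration):

```python
import math

MEAN_GAP_DAYS = 4 * 365  # assumed mean time between discoveries

def days_until_probability(p, mean=MEAN_GAP_DAYS):
    """Days until the cumulative probability of at least one discovery
    reaches p, under an exponential waiting time:
    solve 1 - exp(-t/mean) = p for t."""
    return -mean * math.log(1 - p)

print(round(days_until_probability(0.5)))  # median wait: ln(2) * 1460 ~ 1012 days
print(round(days_until_probability(0.9)))  # 90% confidence: ~3362 days
```

The median (about 2.8 years) is shorter than the 4-year mean because the exponential distribution is right-skewed.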
[QUOTE=NBtarheel_33;390822]One more idea: Davieddy's Interval, i.e. the number of days until we expect to find the next Mersenne prime, per Poisson.[/QUOTE]
:tu: |
The date of the 54M milestone should be January [B]5[/B], 2015, rather than the 2nd.
|
[QUOTE=NBtarheel_33;391690]The date of the 54M milestone should be January [B]5[/B], 2015, rather than the 2nd.[/QUOTE]I concur, if UTC is being upheld.
|
[QUOTE=NBtarheel_33;391690]The date of the 54M milestone should be January [B]5[/B], 2015, rather than the 2nd.[/QUOTE]Ah, I see Madpoo poached the remaining exponent on 5-1-2015.
|
[QUOTE=NBtarheel_33;391690]The date of the 54M milestone should be January [B]5[/B], 2015, rather than the 2nd.[/QUOTE]
Yeah, you're right. And what's worse, I checked in that last result so I have nobody to blame but myself. Apparently I forgot what day it was... that happens on long vacations. :smile: Corrected now. |
[QUOTE=Madpoo;391724]Yeah, you're right. And what's worse, I checked in that last result so I have nobody to blame but myself. Apparently I forgot what day it was... that happens on long vacations. :smile:
Corrected now.[/QUOTE]Have you decided to finish all the <10M exponents? The current holders seem to be prematurely expired with "Manual testing" by Madpoo. |
[QUOTE=retina;391725]Have you decided to finish all the <10M exponents? The current holders seem to be prematurely expired with "Manual testing" by Madpoo.[/QUOTE]
I did a couple where it seemed like they'd stopped checking in. There's still those last 3 that have been plugging away that seem fine though. They'll take a few more months to finish, but at least they're making progress. As of right now I actually don't have any machines running Prime95... feels kind of weird after years of having at least one system going. My system doesn't do well running P95 when it's overclocked, so I wanted to finish it up at regular speed and then set it back to overclocking for the other stuff I have it doing. I'll probably fire up an instance for a few days at a time when I'm burning in a new system, but for now I may as well focus on making sure the website stuff looks good. Maybe doing the odd test here and there when I see weird entries in the database (false positives, etc). |
[QUOTE=Madpoo;391767]I did a couple where it seemed like they'd stopped checking in. There's still those last 3 that have been plugging away that seem fine though. They'll take a few more months to finish, but at least they're making progress.[/QUOTE]So your personal criteria for deciding if they have "stopped checking in" is "it seemed like they'd stopped"?
So when arnaud returns from the Christmas break (last check-in 24-Dec) and turns on his/her computer to continue the LL test he/she will get a message saying (in effect) "Tough luck, Madpoo finished it for ya, hope you don't mind but you were tardy and we are intolerant of tardiness." |
[QUOTE=retina;391780]So your personal criteria for deciding if they have "stopped checking in" is "it seemed like they'd stopped"?
So when arnaud returns from the Christmas break (last check-in 24-Dec) and turns on his/her computer to continue the LL test he/she will get a message saying (in effect) "Tough luck, Madpoo finished it for ya, hope you don't mind but you were tardy and we are intolerant of tardiness."[/QUOTE] If that happens, I'll manually donate twice the applicable credit from my account to theirs. I did two exponents in that <10M range... one hadn't been checked in since mid November I think, and then the one you're talking about. I don't actually recall the details but you're probably right. |
[QUOTE=Madpoo;391827]If that happens, I'll manually donate twice the applicable credit from my account to theirs.[/QUOTE]
If I was on the receiving end of that, I would interpret it as: "here's some meaningless compensation to make you feel less bad about being a worthless contributor to GIMPS". |
[QUOTE=Brian-E;391831]If I was on the receiving end of that, I would interpret it as: "here's some meaningless compensation to make you feel less bad about being a worthless contributor to GIMPS".[/QUOTE]
Well, whatever the case, I will probably end up owing Arnaud a very public apology for jumping the gun. Looks like that assignment did check in again today at any rate. See, this is why madpoos shouldn't poach, they get themselves into trouble. I hope you all learned the lesson I was trying to teach (yeah, right) that poaching is something that should only be done by experts who actually look at more than just a sample day here and there to see if work is stopped. I am not a trained professional and I should not have tried that at home. |
[QUOTE=Madpoo;391834]Well, whatever the case, I will probably end up owing Arnaud a very public apology for jumping the gun. Looks like that assignment did check in again today at any rate.
See, this is why madpoos shouldn't poach, they get themselves into trouble. I hope you all learned the lesson I was trying to teach (yeah, right) that poaching is something that should only be done by experts who actually look at more than just a sample day here and there to see if work is stopped. I am not a trained professional and I should not have tried that at home.[/QUOTE] :goodposting: Your candidness and honesty are admirable. I can learn from it. Now, unfortunately not everyone has the same good conscience as you have, and there are other people who don't learn the lesson which you detail above or just don't care. We shall soon see whether the new stricter re-assignment rules for preferred exponents eliminate poaching when all remaining assignments are governed by those new rules. They certainly, in my opinion, eliminate any [U]need[/U] to poach (if there ever was any), and I feel confident that instances of poaching will be reduced significantly compared to what the problem used to be, but I also have a hunch that poaching will not be entirely eliminated. Do you, and George, think that not showing the results of poached assignments in the database, at least until the original assignment completes or expires, is a technically feasible and desirable approach? |
It shouldn't even be called poaching when the "experts" resolve a "problem."
And poaching by non-experts should (I agree) be avoided scrupulously. |
3 DC left below 10M Digits....all assigned since the new rules came into effect....shouldn't be long now.
18 old assignments left for LL to 56M ...all but 2 have very recent updates; those 2 relatively recent...might just be an Xmas break. I'm thinking 1 will be recycled in a couple of days....almost no progress in a year....the rest of the oldies look like they may finish |
Exponent 33088277 either has some missing assignment data, or the expiry date for "ANONYMOUS" is incorrect. There appears to be a large gap of 10 months without any LL activity, but it was in the active assignments list as being worked on for that 10 month period.
[url]http://www.mersenne.org/report_exponent/?exp_lo=33088277&full=1[/url] |
[QUOTE=retina;391922]Exponent 33088277 either has some missing assignment data, or the expiry date for "ANONYMOUS" is incorrect. [/QUOTE]
I haven't dug into the logfiles. Perhaps the user unreserved the exponent. |
[QUOTE=Prime95;391929]I haven't dug into the logfiles. Perhaps the user unreserved the exponent.[/QUOTE]If I had to guess I would think the code to expire underperforming assignments has set the expired date using the 60-day rule to the assigned date + 60 days rather than using the current date.
|
[QUOTE=davar55;391861]It shouldn't even be called poaching when the "experts" resolve a "problem."
And poaching by non-experts should (I agree) be avoided scrupulously.[/QUOTE] We could always remove the "active assignments" page entirely. :) But I know some folks like the data and use it responsibly. I'll let George puzzle that one out... He's talked about that particular page in the past with a sense of regret. :smile: |
[QUOTE=Madpoo;391958]We could always remove the "active assignments" page entirely. :) But I know some folks like the data and use it responsibly. I'll let George puzzle that one out... He's talked about that particular page in the past with a sense of regret. :smile:[/QUOTE]
Use a password? (like... for trusted users, you know? :P - it can be the same password for all, not so much for protection as to discourage the badass teenagers) |
[QUOTE=LaurV;392025]Use a password? (like... for trusted users, you know? :P - it can be the same password for all, not as much as for protection, but to descourage the badass teenagers)[/QUOTE]
Yep, then only give the password to people who provide a copy of their birth certificate together with a photo of their derrière, showing that they are not badass teenagers.:razz: |
[QUOTE=Brian-E;392048]provide a copy of their birth certificate together with a photo of their derrière[/QUOTE]
Well, technically, you only need to provide one or the other to disprove that you're a BaT. :razz: /my-contribution-to-the-less-than-useful-thread |
According to WolframAlpha there are [url=http://www.wolframalpha.com/input/?i=number+of+prime+numbers+below+%28log2%2810%29*10^7%29]2044287[/url] exponents below the 10M digit mark. And as of this posting there are M2 remaining to be verified to complete the set.[QUOTE=LaurV;392025]Use a password? (like... for trusted users, you know? :P - it can be the same password for all, not as much as for protection, but to descourage the badass teenagers)[/QUOTE]Perhaps you are protecting against the wrong class of potential poacher. Past records here indicate the opposite; established and trusted individuals are the main poachers. So we should be freely giving out the information to all that request it in an effort to attract more to join and replace those that get shafted by a poacher and leave for other projects.
|
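The 2044287 figure is the prime count below log2(10) * 10^7 = 33,219,280, since an exponent p gives a Mersenne number below 10M digits exactly when p < that bound. A quick sanity check, approximating the prime-counting function with a truncated asymptotic series for the logarithmic integral (accurate to a few hundred at this size; this is a sketch, not an exact count):

```python
import math

def li_asymptotic(x, terms=12):
    """Truncated asymptotic expansion of the logarithmic integral,
    li(x) ~ (x/ln x) * sum_{k>=0} k! / (ln x)^k, which closely tracks
    the prime-counting function pi(x) in this range."""
    lx = math.log(x)
    return (x / lx) * sum(math.factorial(k) / lx**k for k in range(terms))

# Exponents p with 2^p - 1 below 10M digits satisfy p < log2(10) * 10^7:
limit = int(math.log2(10) * 10**7)       # 33,219,280
print(limit, round(li_asymptotic(limit)))  # estimate within ~400 of pi(limit) = 2,044,287
```

The tiny overshoot is expected: li(x) exceeds pi(x) throughout this range.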
Mea culpa. I wasn't talking about poaching; I knew the objection (opposition) to the active assignment page was more or less related to the fact that (in George's approximate words) "it kills the server" and therefore it should not be accessible to everybody. That's no longer the case with the new server, I guess. Sorry I didn't read the discussion carefully.
The poaching we cannot stop unless the rules are enforced and the poachers see that they gain no advantage from poaching; for example, no matter who reports the result, it should be credited to the assignee (just an example, and not the best/easiest solution). But to do this in a "fair mode", again, we need to enforce the rules; good or bad, they are there to be applied. Otherwise it is a big can of worms: if someone reserves a billion exponents and never works on them, and they are not expired by the server when the time comes (as was the case with the old rules), other people do his work and he gets the credit. It was done in the past (it was called "hoarding" exponents, and even curtisc did it). But you (generic you) can't play tricks with the credits of the one who reports the results. Even so, some people don't care (I am guilty :redface:, I confess); if they want to poach, they will poach, for the sake of it, for whatever silly reason like advancing a milestone or satisfying personal pride or stupidity (ex: I used to DC my old LL work of 5-7-10 years ago without giving a sh!t if, and to whom, it was assigned - you know, what if one of my computers at that time went nuts and missed a prime? :razz:). What I wanted to stress is that the rules are not the problem, but enforcing them; most of the time when things go wrong, the "management" changes the rules, when in fact the old rules are quite OK, or even better - the real problem is how those rules are applied in practice. All in all we are making too much of a case of it. I think we are moving in the right direction day by day (new server, new rules, better terms to accommodate newer/faster CPUs, soon GPU work directly from PrimeNet, etc.), but we can't avoid kibitzing and making a big fuss of it. Let things move on, at least, or help if you can. Progress is slow, but it is progress. |
It has been almost 2.5 months, and good progress has been made on the various milestones.

All exponents below [B][COLOR="DarkGreen"]33,121,687[/COLOR][/B] have been tested and double-checked.
All exponents below [B][COLOR="Blue"]54,357,769[/COLOR][/B] have been tested at least once.
Countdown to testing all exponents below M([B][COLOR="Blue"]57885161[/COLOR][/B]) once: 3,985
Countdown to double-checking all 2[SUP]P[/SUP]-1 smaller than 10M digits: [B][COLOR="Red"]2[/COLOR][/B] (Estimated completion: [COLOR="Green"]2015-02-09[/COLOR])
Countdown to first time checking all exponents below 56M: [B][COLOR="Red"]29[/COLOR][/B] (Estimated completion: [COLOR="Green"]2015-05-16[/COLOR])
Countdown to proving M([COLOR="Green"]37156667[/COLOR]) is the [COLOR="green"]45[/COLOR]th Mersenne Prime: 52,245

[QUOTE=NBtarheel_33;390822]One more idea: Davieddy's Interval, i.e. the number of days until we expect to find the next Mersenne prime, per Poisson.[/QUOTE]Just as a note: we are now at 0.500 expected new primes in the 79.3M range. |
[QUOTE=Uncwilly;393623]It has been almost 2.5 months, good progress has been made on the various milestones.
Countdown to first time checking all exponents below 56M: [B][COLOR="Red"]29[/COLOR][/B] (Estimated completion : [COLOR="Green"]2015-05-16[/COLOR])[/QUOTE] We're also just 3 exponents away from first-time checks up to 55M. I didn't include that on the milestone page at the time because it was already a small number, but it's in there... Two were due already but haven't checked in for a few days; hopefully that user is still running them and they'll pop in. The third one is a couple more weeks out but making somewhat consistent progress. |
[QUOTE=Uncwilly;393623]Just as a note: we are now at 0.500 expected new primes in the 79.3 range.[/QUOTE]
0.497 now, so (with the standard caveat about calculations of this nature over extremely short intervals) that's a drop of 0.003 in six days, or basically 2,000 days (~5.5 years) until we expect to find a new prime. |
[QUOTE=NBtarheel_33;394119]0.497 now, so (with the standard caveat about calculations of this nature over extremely short intervals) that's a drop of 0.003 in six days, or basically 2,000 days (~5.5 years) until we expect to find a new prime.[/QUOTE]
To me that sounds a bit fishy. How did you calculate it? Are you sure you're not just estimating the time until all exponents under 79.3M have been tested? |
[QUOTE=NBtarheel_33;394119]0.497 now, so (with the standard caveat about calculations of this nature over extremely short intervals) that's a drop of 0.003 in six days, or basically 2,000 days (~5.5 years) until we expect to find a new prime.[/QUOTE]
In 2014 it dropped by 0.205. Use that figure instead. I can give you a lot more data to play with if you want. |
[QUOTE=Uncwilly;394138]In 2014 it dropped by 0.205. Use that figure instead. I can give you a lot more data to play with if you want.[/QUOTE]
That would give 365/0.205 = ~1,780 days, or just under 5 years. Back when Davieddy was stalking the forum, he was consistently getting right around 4 years. So either GIMPS is slowing down (possible but the throughput numbers seem to say differently), or (more likely) we are seeing the effects of (1) the DC tail being cut down to size and (2) the greater computational expense of LL tests in the highest part of the "classical" range. |
[QUOTE=NBtarheel_33;394211]That would give 365/0.205 = ~1,780 days, or just under 5 years.[/QUOTE]
If you mean the expected time until the discovery of a new prime, please could you justify it for the benefit of those of us who are interested but don't understand?:smile: |
[QUOTE=Brian-E;394226]If you mean the expected time until the discovery of a new prime, please could you justify it for the benefit of those of us who are interested but don't understand?:smile:[/QUOTE]
First of all, keep in mind that in these calculations we are making the [B]huge[/B] assumption that the distribution of Mersenne primes obeys a Poisson distribution. We have no proof that this is the case, but heuristics indicate that it is plausible.

Assuming a Poisson distribution, we can calculate the probability of finding a prime in a given interval of exponents (e.g. between 2 and 79,300,000, the classical GIMPS upper limit; or between 50,000,000 and 60,000,000, etc.). From this, we can calculate the number of primes that we might expect to find in such an interval. Right now, for instance, between exponents 2 and 79,300,000, we expect to find 0.496 primes, per [URL="http://www.mersenne.org/report_classic/"]this[/URL] report. (Keep in mind that this is making that all-important, nontrivial, unproven assumption that the distribution of Mersenne primes is a Poisson process!)

From the change in this expected number of primes over a time interval, we can estimate the time interval in which we would expect to find exactly one prime. This is a calculation that Davieddy would frequently make, and from which he would infer the increase (or decrease) in GIMPS throughput (whether this is a valid metric for measuring GIMPS throughput is another argument for another time).

Anyway, the logic is as follows: if the expected number of primes remaining in an interval decreases by some amount [TEX]dE[/TEX] over a time interval [TEX]dT[/TEX], then [TEX]\frac{dE}{dT}[/TEX] roughly approximates the expected number of primes found per unit of time (usually we measure [TEX]dT[/TEX] in days). Once we know the expected number of primes per unit of time, we can flip this around to ask: how long before the expected number of primes found is exactly one? To calculate this, we simply invert [TEX]\frac{dE}{dT}[/TEX] (which gives us primes per unit time) and calculate [TEX]\frac{dT}{dE}[/TEX] (which gives us time per expected prime, i.e. how long it takes the expected count to change by exactly one). This quantity tells us (assuming the Poisson distribution holds!) the length of time from right now that we should expect to wait before we find a new prime.

Note that, as Davieddy often remarked in the past, it doesn't matter how long we've *been* waiting; this figure tells us how much longer we should [B]expect[/B] to wait. So even if we haven't found a new prime in, say, twenty years, if [TEX]\frac{dT}{dE} = 20[/TEX] years (Heaven forbid!), then, according to Poisson, we should expect another twenty years of waiting.

Finally, an example, just to make things as clear as mud. As Uncwilly posted upthread, the expected number of primes dropped by 0.205 in 2014. This gives [TEX]dE = 0.205[/TEX] and [TEX]dT = 365[/TEX] (the length of the time interval is 365 days, i.e. the entire year 2014). We should therefore expect to find [TEX]\frac{dE}{dT} = \frac{0.205}{365} = 0.00056[/TEX] new primes per day. Flipping this around, we should expect to wait [TEX]\frac{dT}{dE} = \frac{1}{\frac{dE}{dT}} = \frac{1}{0.00056} \approx 1,786[/TEX] days [B]from the end of 2014[/B] (i.e. until about November 21, 2019) before we find a new prime. (Assuming, once more, that the Mersenne primes are distributed in a way that obeys the Poisson distribution!)

Hope this helps. Let me know if you still have questions. :smile: |
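The dT/dE arithmetic above fits in a few lines of Python. A minimal sketch (the function name is mine; 0.205 and 365 are the 2014 figures quoted upthread, so the result differs slightly from the ~1,786 obtained via the rounded 0.00056 rate):

```python
def expected_wait_days(dE, dT_days):
    """Invert the observed rate dE/dT: days until the expected number
    of new primes (under the Poisson assumption) reaches one."""
    primes_per_day = dE / dT_days
    return 1.0 / primes_per_day

# 2014: expected prime count in the classical range dropped by 0.205 over 365 days.
wait = expected_wait_days(0.205, 365)
print(round(wait))   # 1780 days, i.e. just under 5 years from the end of 2014
```

This is only the Poisson bookkeeping, of course; it says nothing about whether the Poisson assumption itself holds. |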
[QUOTE=NBtarheel_33;394262]Hope this helps. Let me know if you still have questions. :smile:[/QUOTE]
Thanks very much for the very detailed description. What I didn't understand, but I think I do now thanks to your explanation, is how it was justified to use the decrease in the expected number of primes in a particular arbitrary interval (up to exponent 79.3M in the example). But now I see that you use a gradient (rate of decrease) of expected number of primes over that same interval and assume that it will apply over any larger interval (Poisson distribution). Thanks for your patience with me. |
Only 44 Mersenne primes below 10M digits
And as expected the [url=http://www.mersenne.org/report_exponent/?exp_lo=33185861&full=1]last of the <10M digit exponents[/url] was indeed finished by someone other than the registered [strike]slowcoach[/strike] user. This time by "Mike Neurohr".
|
[QUOTE=retina;394304]And as expected the [url=http://www.mersenne.org/report_exponent/?exp_lo=33185861&full=1]last of the <10M digit exponents[/url] was indeed finished by someone other than the registered [strike]slowcoach[/strike] user. This time by "Mike Neurohr".[/QUOTE]
This looks like it may well have been a legitimate "recycling" by Primenet, rather than a "poaching" (my spidering is not at a high enough temporal resolution to say for sure).

[CODE]20141115 33185861 D LL, 54.90% 143 26 2014-06-25 2014-11-14 2014-11-15 2014-12-11 nranks
20141123 33185861 D LL, 54.70% 151 27 2014-06-25 2014-11-23 2014-11-24 2014-12-20 nranks
20141210 33185861 D LL, 53.50% 168 28 2014-06-25 2014-12-09 2014-12-10 2015-01-07 nranks
20150101 33185861 D LL, 58.20% 221 18 2014-06-25 2015-01-30 2015-01-31 2015-02-19 nranks[/CODE]

Assigned to "nranks" under the new recycling rules (possibly as a "Cat 2"); graciously given over 220 days to complete. Making very slow progress. |
[QUOTE=chalsall;394320]This looks like it might very likely have been a legitimate "recycling" by Primenet, rather than a "poaching" (my spidering is not at a high enough temporal resolution to be able to say for sure).[/QUOTE]I doubt that "Mike Neurohr" could get the assignment and then finish the test in 0.0 days. The Recent Cleared report was very clear about the timespan of 0.0 days.
|
[QUOTE=retina;394322]I doubt that "Mike Neurohr" could get the assignment and then finish the test in 0.0 days. The Recent Cleared report was very clear about the timespan of 0.0 days.[/QUOTE]
As I said, I don't have enough information to say for sure. But I can tell you that many of my machines can clear two DCs in less than 24 hours. |
[QUOTE=chalsall;394323]As I said, I don't have enough information to say for sure. But I can tell you that many of my machines can clear two DCs in less than 24 hours.[/QUOTE]I'm not sure about the rounding used but I suppose at the most it could be 0.099999... days, i.e. just under 2.4 hours. Is such a time possible with current technology?
|
[QUOTE=retina;394348]I'm not sure about the rounding used but I suppose at the most it could be 0.099999... days, i.e. just under 2.4 hours. Is such a time possible with current technology?[/QUOTE]
Moot point since I don't see that exponent being assigned to anyone else. I've officially removed that little item from the progress section of the milestones page. Is it worth mentioning in the "older/lower profile milestones" section, that today is the day we finished all the 10M doublechecks? It was kind of interesting to see it countdown, but how important is that really? I guess no more or less important than any other artificial milestone already listed, but you know what I mean. :smile: |
[QUOTE=Madpoo;394358]Moot point since I don't see that exponent being assigned to anyone else.
I've officially removed that little item from the progress section of the milestones page. Is it worth mentioning in the "older/lower profile milestones" section, that today is the day we finished all the 10M doublechecks? It was kind of interesting to see it countdown, but how important is that really? I guess no more or less important than any other artificial milestone already listed, but you know what I mean. :smile:[/QUOTE] I think it is worth making a note of this for posterity. We already have listed the date on which we finished double-checking everything under one million digits. We also have listed the dates of the first successful ten-million-digit test and the completion of the last first-time test under ten million digits. |
[QUOTE=chalsall;394320][CODE]20141115 33185861 D LL, 54.90% 143 26 2014-06-25 2014-11-14 2014-11-15 2014-12-11 nranks
20141123 33185861 D LL, [COLOR="Red"]54.70[/COLOR]% 151 27 2014-06-25 2014-11-23 2014-11-24 2014-12-20 nranks
20141210 33185861 D LL, [COLOR="red"]53.50[/COLOR]% 168 28 2014-06-25 2014-12-09 2014-12-10 2015-01-07 nranks
20150101 33185861 D LL, [COLOR="SeaGreen"]58.20[/COLOR]% 221 18 2014-06-25 2015-01-30 2015-01-31 2015-02-19 nranks[/CODE][/QUOTE]Looks like the machine was having problems and went backwards. |
[QUOTE=Uncwilly;394374]Looks like the machine was having problems and went backwards.[/QUOTE]
It's too bad the user didn't figure out that if one starts at the final iteration and works backward, there isn't any need to go all the way to the very beginning. |
[QUOTE=NBtarheel_33;394373]I think it is worth making a note of this for posterity. We already have listed the date on which we finished double-checking everything under one million digits. We also have listed the dates of the first successful ten-million-digit test and the completion of the last first-time test under ten million digits.[/QUOTE]
True, I guess it can't hurt, since those other things are in there. Added.

I was torn between these two versions:

All exponents below 33,219,253 (10 million digits) double-checked.

or this:

All Mersenne numbers up to 10 million digits double-checked.

I left the first one there for now... it seems more precise and less likely to confuse, but I'm looking at it from my own perspective... not sure which is more readable for the average Joe. |
[QUOTE=chalsall;394320]This looks like it might very likely have been a legitimate "recycling" by Primenet, rather than a "poaching" (my spidering is not at a high enough temporal resolution to be able to say for sure).
[CODE]20141115 33185861 D LL, 54.90% 143 26 2014-06-25 2014-11-14 2014-11-15 2014-12-11 nranks
20141123 33185861 D LL, 54.70% 151 27 2014-06-25 2014-11-23 2014-11-24 2014-12-20 nranks
20141210 33185861 D LL, 53.50% 168 28 2014-06-25 2014-12-09 2014-12-10 2015-01-07 nranks
20150101 33185861 D LL, 58.20% 221 18 2014-06-25 2015-01-30 2015-01-31 2015-02-19 nranks[/CODE]

Assigned to "nranks" under the new recycling rules (possibly as a "Cat 2"); graciously given over 220 days to complete. Making very slow progress.[/QUOTE]

I have access to this machine. It makes slow progress because it is an Intel i5 M 520 @ 2.40GHz running two double-checks, and it is only on 8 to 9 hours a day. I am not aware of any hardware issues, as it has only returned good results. What would be a more appropriate work type if it is too slow for double-checks? ECM? Some other project? Nothing? |
[QUOTE=potonono;394387]I have access to this machine. It makes slow progress due to being an Intel i5 M 520 @ 2.40GHz running two double checks, and only being on 8 to 9 hours a day. I am not aware of any hardware issues, as it has only returned good results. What would be a more appropriate work type if too slow for double checks? ECM? Some other project? Nothing?[/QUOTE]
Have a look at [url]http://www.mersenne.org/thresholds/[/url]. Are you receiving assignments from an inappropriate category? If not, your CPU is fast enough. At any rate, one thing you can do is, instead of running two doublechecks simultaneously, run 1 DC with two threads. That would reduce the time taken without (hopefully) too much loss in productivity. |
[QUOTE=TheMawn;394378]It's too bad the user didn't figure out that if one starts at the final iteration and works backward, there isn't any need to go all the way to the very beginning.[/QUOTE]
:rofl: :w00t: :tu:

[edit: joking apart, assuming one had a very fast "magic" method to extract a modular square root for a non-prime modulus, it would be worth starting from the end. For example, take the last iteration, which you assume gave x^2-2=0, and find x as sqrtmod(2,Mp) (this is trivial); then the previous iteration y^2-2=x, and find y as sqrtmod(x+2, Mp), and so on. It can be proved that you can go all the way back to the first iteration only if Mp is prime. Otherwise you are dead after the second step, i.e. sqrtmod(sqrtmod(2,Mp)+2,Mp) has no solution for a composite Mp, which would give a very fast compositeness test for Mersenne numbers.]

[edit 2: of course, if one had such a "magic" method, we wouldn't need any test at all, because factoring Mp would be easier: one would use this method to find a non-trivial square root of 1 (or of 2) mod Mp and apply the difference of squares, considering that 1 is a square, or that a trivial root of 2 is 2^((p+1)/2). For example, to factor 2047, apply the magic square-root method to find that 622 is a square root of 1 mod 2047, because 622^2=1 (mod 2047); so (622-1)(622+1)=0 (mod 2047), and we get the factors as gcd(621,2047)=23 and gcd(623,2047)=89. Or, we can find a nontrivial square root of 2, i.e. 915^2=2 (mod 2047), while we already know a trivial root of 2, i.e. 2^((11+1)/2)=2^6=64, because 64^2=2 (mod 2047); so (915-64)(915+64)=0 (mod 2047), and taking the gcds we find the two factors.] |
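The 2047 arithmetic in that post checks out, and is easy to verify in Python. A quick sketch of both routes (only the verification is scripted here; the "magic" square-root extraction itself is of course the part nobody knows how to do fast):

```python
from math import gcd

Mp = 2**11 - 1   # 2047 = 23 * 89, the classic composite Mersenne number

# Route 1: 622 is a nontrivial square root of 1 (mod 2047),
# so gcd(622 -/+ 1, 2047) splits the modulus.
assert pow(622, 2, Mp) == 1
print(gcd(622 - 1, Mp), gcd(622 + 1, Mp))    # 23 89

# Route 2: 915 is a nontrivial square root of 2 (mod 2047);
# the trivial root is 2^((11+1)/2) = 64, and gcd(915 -/+ 64, 2047) splits it too.
assert pow(915, 2, Mp) == 2
assert pow(64, 2, Mp) == 2
print(gcd(915 - 64, Mp), gcd(915 + 64, Mp))  # 23 89
```
|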
[QUOTE=NBtarheel_33;394373]We already have listed the date on which we finished double-checking everything under one million digits.[/QUOTE]
Where is that date? It is somewhere between these two, but I do not see it:

May 20, 2001 All exponents less than 4,000,000 double-checked.
May 19, 2000 Double-checking proves M(2976221) and M(3021377) are the 36th and 37th Mersenne primes.

I can narrow it down using some old status files I have:

January 28th, 2001: All exponents below 3,210,800 have been tested and double-checked.
March 25th, 2001: All exponents below 3,502,500 have been tested and double-checked. |
Checking the milestones list against some old status files:
April 15, 2002 All exponents below 9,000,000 tested at least once.

This seems a few days late:
March 31, 2002 All exponents below 8,574,000 have been tested at least once.
April 7, 2002 All exponents below 9,005,900 have been tested at least once.

The 7M double-check milestone is missing; it is approximately:
February 16, 2003 All exponents below 6,977,600 have been tested and double-checked.
February 27, 2003 All exponents below 7,060,000 have been tested and double-checked.

From August 2009 until October 2014 I archived the milestones list hourly.

Double-check milestones:
October 30, 2009 All exponents below 19,000,000 double-checked. (This milestone occurred October 28, 2009 at 12pm UTC)
21M double-check is missing: it occurred July 11, 2010, 8am UTC.
23M and 24M double-checks are missing: they occurred December 1, 2011, 3am UTC (went from 22,545,883 to 24,052,939).
26M double-check is missing: it occurred December 20, 2012, 4am UTC.

First time milestones:
28M: February 10, 2010 4pm UTC
29M: July 14, 2010 11pm UTC
30M: August 2, 2010 2pm UTC
31M: August 6, 2010 6pm UTC
32M-37M: December 25, 2010 7am UTC (went from 31,494,937 to 37,591,483)
38M: July 19, 2011 1am UTC
39M: July 22, 2012 10am UTC
40M: August 2, 2012 3am UTC
41M: August 5, 2012 4pm UTC
42M+43M: September 5, 2012 12pm UTC (went from 41,959,639 to 43,142,591)
44M: November 9, 2012 12am UTC
45M: December 12, 2013 12am UTC

Milestones graph: [URL="www.hoegge.dk/mersenne/milestones.png"]milestones.png[/URL] |
[QUOTE=ATH;394407]Where is that date? It is somewhere between these 2 but I do not see it:
May 20, 2001 All exponents less than 4,000,000 double-checked.
May 19, 2000 Double-checking proves M(2976221) and M(3021377) are the 36th and 37th Mersenne primes.

I can narrow it down using some old status files I have:
January 28th, 2001: All exponents below 3,210,800 have been tested and double-checked.
March 25th, 2001: All exponents below 3,502,500 have been tested and double-checked.[/QUOTE]

You're right. It looks like we only recorded the completion of all [B]first-time[/B] tests under a million digits:

December 26, 1998 All Mersenne numbers less than a million digits tested at least once. |
February 3, 2015 All exponents below 33,219,253 (10 million digits) double-checked.
Just because I'm feeling extra nitpicky this afternoon: 33,219,2[B]53[/B] in the above should really be 33,219,2[B]81[/B]. |
[QUOTE=ATH;394443]Checking the milestones list against some old status files:
Milestones graph: [URL="www.hoegge.dk/mersenne/milestones.png"]milestones.png[/URL][/QUOTE] It seems that you have some data points on there that I don't have (2008 and 2009 especially). Would you be willing to share what you have? I can PM you my e-mail address. |
[QUOTE=NBtarheel_33;394455]February 3, 2015 All exponents below 33,219,253 (10 million digits) double-checked.
Just because I'm feeling extra nitpicky this afternoon: 33,219,2[B]53[/B] in the above should really be 33,219,2[B]81[/B].[/QUOTE] I thought the first 2^n-1 that resulted in 10M digits is n=33219280 (which isn't prime, but doesn't matter). 2^33,219,281 - 1 would have more than 10M digits, wouldn't it? I don't remember where or how I came up with 33219280 as the magic number, but that's what I was using as a basis. I don't have the energy to double-check right now... past my bedtime. :smile: |
[QUOTE=Uncwilly;394498]It seems that you have some data points on that that I don't have (2008 and 2009 especially.) Would you be willing to share what you have? I can PM you my e-mail address.[/QUOTE]
Maybe I can look at the milestone list and double-check, add the missing n-millionth items and verify the ones listed. They're probably okay. |
[QUOTE=Madpoo;394509]I thought the first 2^n-1 that resulted in 10M digits is n=33219280 (which isn't prime, but doesn't matter). 2^33,219,281 - 1 would have more than 10M digits, wouldn't it?
I don't remember where or how I came up with 33219280 as the magic number, but that's what I was using as a basis. I don't have the energy to double-check right now... past my bedtime. :smile:[/QUOTE] All exponents [B][I][U]below[/U][/I][/B] (I bolded, underlined & italicized for added effect :smile:) |
[QUOTE=axn;394511]All exponents [B][I][U]below[/U][/I][/B] (I bolded, underlined & italicized for added effect :smile:)[/QUOTE]All exponents [color=red][B][I][U][size=5]below[/size][/U][/I][/B][/color] (I bolded, underlined, italicised coloured & enbiggened for added effect :smile:)
|
[QUOTE=Madpoo;394509]I thought the first 2^n-1 that resulted in 10M digits is n=33219280 (which isn't prime, but doesn't matter). 2^33,219,281 - 1 would have more than 10M digits, wouldn't it?
I don't remember where or how I came up with 33219280 as the magic number, but that's what I was using as a basis. I don't have the energy to double-check right now... past my bedtime. :smile:[/QUOTE]

Actually, if we are to be nitpicking... :razz:

[CODE]gp > ceil(332192[COLOR=Red][B]77[/B][/COLOR]*log(2)/log(10))
[COLOR=Magenta]time = 1 ms.[/COLOR]
%1 = 9999999
gp > ceil(332192[B][COLOR=Red]78[/COLOR][/B]*log(2)/log(10))
[COLOR=Magenta]time = 1 ms.[/COLOR]
%2 = 10000000
gp > length(Str(1<<13-1))
%3 = 4
gp > length(Str(1<<14-1))
%4 = 5
gp > length(Str(1<<332192[COLOR=Red][B]77[/B][/COLOR]-1))
[COLOR=Magenta]time = 14,928 ms.[/COLOR]
%5 = [B]9999999[/B]
gp > length(Str(1<<332192[COLOR=Red][B]78[/B][/COLOR]-1))
[COLOR=Magenta]time = 15,241 ms.[/COLOR]
%6 = [B]10000000[/B]
gp > precprime(332192[COLOR=Red][B]78[/B][/COLOR])
[COLOR=Magenta]time = 1 ms.[/COLOR]
%7 = 332192[B][COLOR=Red]53[/COLOR][/B]
gp > nextprime(332192[COLOR=Red][B]78[/B][/COLOR])
%8 = 332192[B][COLOR=Red]81[/COLOR][/B]
gp >[/CODE]

(The log/ceil stuff is sometimes tricky if there isn't enough precision in the floats; here the precision is high enough, but usually "printing it" as a decimal string will resolve all arguments, even if that is very slow, especially in pari/gp.) |
[QUOTE=LaurV;394513](the log/ceil stuff is sometimes tricky if there isn't enough precision in the floats, here the precision is high enough, but usually "printing it" in a decimal string will resolve all the arguments, even if this is very slow, especially in pari/gp)[/QUOTE]I would have thought the standard log function would suffice:

(10^7-1)/log10(2) = 33,219,277.62...

And no more than 10 digits of precision are needed. The next decade is:

(10^8-1)/log10(2) = 332,192,806.16... |
[QUOTE=retina;394512]All exponents [SIZE=4][COLOR=Red][B]ʍo|ǝq[/B][/COLOR][/SIZE] (I [STRIKE]bolded, underlined, italicised coloured & enbiggened[/STRIKE][B] uʍop ǝpısdn ʇı pǝddı|ɟ[/B] for added effect :smile:)[/QUOTE]
noʎ ɹoɟ ʇɐɥʇ pǝxıɟ... |
[QUOTE=retina;394514]I would have thought the standard log function would suffice:
(10^7-1)/log10(2) = 33,219,277.62..... And no more than 10 digits of precision needed. The next decade is: (10^8-1)/log10(2) = 332,192,806.16...[/QUOTE] Try: (10^7)/log10(2) = ? (and then start thinking what happens with the exponents in between...) :razz: |
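The whole digit-count exchange can be replayed without pari/gp. A Python sketch (the Miller-Rabin helper is mine, with a witness set that is deterministic for numbers far larger than these exponents; the digit formula is the one used throughout the thread, digits(2^n - 1) = floor(n·log10(2)) + 1):

```python
import math

def is_prime(n):
    """Deterministic Miller-Rabin for n < 3.3e24 (ample for these exponents)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

# 10^7 digits requires floor(n*log10(2)) + 1 >= 10^7, i.e. n*log10(2) >= 10^7 - 1.
n = math.ceil((10**7 - 1) / math.log10(2))
print(n)        # 33219278 -- first exponent n with 2^n - 1 reaching 10M digits
p = n
while not is_prime(p):
    p += 1
print(p)        # 33219281 -- first *prime* exponent at or above 10M digits
```

This reproduces the thread's numbers: 33219277 gives 9,999,999 digits, 33219278 gives 10,000,000, and the bracketing primes are 33219253 below and 33219281 above. |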