our first "dry" year in four years
From 2003 to 2006, GIMPS discovered at least one Mersenne prime each year. However, we didn't discover one in 2007. On the bright side, the longer we wait for something, the better that "something" will be. When we discover the 45th Mersenne prime, it will likely be [i]very[/i] large.
That doesn't mean that we won't discover it tomorrow, though. :D |
[QUOTE=ixfd64;121899]From 2003 to 2006, GIMPS discovered at least one Mersenne prime each year. However, we didn't discover one in 2007. On the bright side, the longer we wait for something, the better that "something" will be. When we discover the 45th Mersenne prime, it will likely be [i]very[/i] large.
That doesn't mean that we won't discover it tomorrow, though. :D[/QUOTE] The question is whether the first 10M-digit prime will be a Mersenne. I predict it will be of the form k*2^n+c, and be announced very soon. |
[quote=jasong;122028]The question is whether the first 10M-digit prime will be a Mersenne. I predict it will be of the form k*2^n+c, and be announced very soon.[/quote]
I doubt the first 10M-digit prime will be anything but a Mersenne. The largest known prime has been a Mersenne prime since digital computers began, and currently the list of largest primes is solid Mersennes through number 6, while the largest known k*2^n+c prime is under 4M digits. (No, correlation does not imply causation, but there is indeed a cause here, namely the extremely efficient LL test.) Technically, Mersennes do fit your k*2^n+c form, with k=1 and c=-1, but I know that's not what you meant. Is there a specific DC project you think will find it? Are they testing many 10M-digit candidates? How long does each number take? |
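The "extremely efficient LL test" mentioned above is the Lucas-Lehmer test. As a rough illustration (my own sketch, not GIMPS code — Prime95 uses heavily optimized FFT squaring, while this naive version is only practical for tiny exponents):

```python
def lucas_lehmer(p: int) -> bool:
    """Return True if M(p) = 2**p - 1 is prime, for an odd prime p."""
    m = (1 << p) - 1          # the Mersenne number 2^p - 1
    s = 4                     # standard LL starting value
    for _ in range(p - 2):    # p - 2 squarings modulo M(p)
        s = (s * s - 2) % m
    return s == 0             # a zero residue means M(p) is prime

# The small Mersenne prime exponents fall out immediately:
print([p for p in (3, 5, 7, 11, 13, 17, 19, 23) if lucas_lehmer(p)])
# -> [3, 5, 7, 13, 17, 19]
```

The final zero residue is exactly the "residue of zero" that gets reported when a new Mersenne prime turns up.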
Several trends point to the middle 1/3 of '08 as being a likely point in time.
If they run ~18 months apart and the last was at the end of '06... |
[quote=Mini-Geek;122037]The largest known prime has been a Mersenne prime since digital computers began[/quote]
With notable exceptions in 1951 and 1989 [URL]http://primes.utm.edu/notes/by_year.html#2[/URL] |
[quote=Uncwilly;122038]Several trends point to the middle 1/3 of '08 as being a likely point in time.
If they run ~18 months apart and the last was at the end of '06...[/quote] This "reasoning" is wrong. If the expected time per prime were 18 months (and currently it is nearer 4 years), it is the expected wait from "now", no matter how much time has elapsed since the last prime. Think of how many throws of a die till a six turns up. |
I've been wondering how likely we are to see a maximal gap between M44 and M45. Obviously the whole discussion has to be subject to eventual double checks, but the leading edge for first time checks now has exponents over 7½ million past M44. So naively one* would now expect M45 to be more than 40,111,751.
Richard
* this 'one' anyway |
[quote=Richard Cameron;122061]I've been wondering how likely we are to see a maximal gap between M44 and M45.[/quote]
I'm curious about what you mean by "maximal gap". As far as our knowledge to date is concerned, the gap between two consecutive Mersenne primes could be arbitrarily large. And indeed, no-one as far as I know has managed to prove that there are infinitely many Mersenne primes, or even that there are any more beyond what we have already found.

If GIMPS doesn't find any more Mersenne primes in the foreseeable future, the project is nonetheless continually extending the boundary of how far we know which Mersenne numbers are prime and which are composite. With that we are very gradually increasing our knowledge of their distribution, though this process is painfully slow. Some new breakthrough in the theory of how to test these numbers is probably the best hope for any significant progress now, I think.

That said, discovering M45 in the coming months or years would be a very nice boost for morale. |
[quote=Richard Cameron;122061]I've been wondering how likely we are to see a maximal gap between M44 and M45. Obviously the whole discussion has to be subject to eventual double checks, but the leading edge for first time checks now has exponents over 7½ million past M44. So naively one* would now expect M45 to be more than 40,111,751.
Richard
* this 'one' anyway[/quote] Very likely. But what the heuristics say is that we expect the next exponent to be ~1.5 times the last one, so the expected "arithmetical" gap grows by a factor of ~1.5 each time. Note that the biggest ratio so far between exponents is that from M127 (Lucas, 1876) to M521 (Robinson, with help from Lehmer, 1952). Let's hope we aren't due to beat this record :smile:
David |
[QUOTE=Brian-E;122064]I'm curious about what you mean with "maximal gap". [/QUOTE]
I meant the largest (discovered) arithmetic gap in exponent. I agree that since we don't know the distribution of Mersenne primes, we can't make much in the way of predictions beyond that.

Davieddy: you are right, the ratio M[sub]n[/sub]/M[sub]n-1[/sub] is probably a more meaningful comparator.

[QUOTE=Davieddy]let's hope we aren't due to beat this record[/QUOTE] nor the record 897 days between M38 and M39
Richard |
We should hope to get two records in one shot :evil:
This might sound good on a press release: "After a record 1027 days since the last discovery, GIMPS has found a new record prime ..." |
Statistically speaking ...
... albeit from a non-statistician
From the GIMPS Status page:
- expect about 2 more MP's (1.93) up to 79,300,000.
- to complete the range will require 22,004,934 P90 years.
- in 2006: 887,665 P90 years done
- in 2007: 971,861 P90 years done
- assume in the future it follows Moore's Law (take the conservative: double every 2 years)
- 2008-2009: 22,004,934 - 1,943,722 = 20,061,212 remaining
- 2010-2011: 20,061,212 - 3,887,444 = 16,173,768
- 2012-2013: 16,173,768 - 7,774,888 = 8,398,880
- 2014-2015: Finished!!!! ... sometime in 2015.

So... "statistically speaking", 7 or 8 more years to finish up to 79,300,000, during which time we will find 2 more Mersenne primes. I foresee more "dry years" on the horizon ... but don't let that discourage you. :flex: |
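The arithmetic above can be replayed in a few lines (a sketch of petrw1's own scenario, not a serious forecast — the double-every-2-years assumption is doing all the work):

```python
remaining = 22_004_934   # P90 years left to 79.3M as of end of 2007 (status page)
yearly = 971_861         # P90 years completed during 2007
year = 2008
while remaining > 0:
    remaining -= yearly
    if (year - 2008) % 2 == 1:   # Moore's-law assumption: double every 2 years
        yearly *= 2
    year += 1
print(f"range finished during {year - 1}")  # -> range finished during 2015
```

The intermediate totals match the post exactly: 20,061,212 left after 2009, 16,173,768 after 2011, 8,398,880 after 2013.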
[QUOTE=Richard Cameron;122071]I meant largest (discovered) arithmetic gap in exponent. I agree that since we don't know the distribution of mersenne primes, we can't make much in the way of predictions beyond that.[/QUOTE]
Actually, the known data and plausible heuristics [mainly by Pomerance, Lenstra and [url=http://primes.utm.edu/notes/faq/NextMersenne.html]Wagstaff[/url]] indicate a pretty clear probability distribution, with the ratio of successive prime exponents averaging ~1.5. While the ratio for any two successive M-primes may be as close to 1 as one likes[sup]*[/sup] and [likely] as large as one likes, based on the known data and the aforementioned algebraic heuristics it is also unlikely for the ratio between successive p to exceed [say] 4. Of course, given the current range and speed of the search, a factor of 4 would represent quite a long "dry spell", at least by the standards of Internet-DC-project time.

===

[sup]*[/sup][i]More precisely, as near unity as allowed by the requirement that if p is prime and M(p) is prime, then M(p+2) is the smallest next candidate [and only if p+2 is also prime, obviously][/i] |
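For the curious, the heuristic on the linked page predicts that successive Mersenne prime exponents grow geometrically with ratio about 2^(1/e^γ) ≈ 1.476. A sketch of that arithmetic (the application to M44's exponent is my own illustration, not a prediction made in the thread):

```python
import math

EULER_GAMMA = 0.5772156649015329          # Euler-Mascheroni constant
ratio = 2 ** (1 / math.exp(EULER_GAMMA))  # expected ratio of successive exponents
print(f"expected exponent ratio ~ {ratio:.4f}")  # ~1.4757

p44 = 32_582_657                          # exponent of M44 (discovered 2006)
print(f"naive central guess for M45's exponent: ~{ratio * p44:,.0f}")
```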
[QUOTE=ewmayer;122082][sup]*[/sup][i]More precisely, as near unity as allowed by the requirement that if p prime and M(p) prime, then M(p+2) is the smallest next-candidate [and only if p+2 also prime, obviously][/i][/QUOTE]And if the twin-prime conjecture holds ...
Paul |
I think petrw1's estimate of Moore's Law holding for GIMPS is far too optimistic. Indeed, the data in his post goes against it.
[QUOTE]- in 2006 887,665 P90 years done - in 2007 971,861 P90 years done[/QUOTE] This is a growth rate of less than 10 percent. Assuming exponential growth, it would take over 7.6 years for speed to double. My experience, if I remember correctly, from reading Primenet's main page is that speed has been growing much more slowly (if at all) since late 2005. By the way, where did he get the data for the total amount of work done in 2006 and 2007? |
[QUOTE=ewmayer;122082]Actually, the known data and plausible heuristics [mainly by Pomerance, Lenstra and [url=http://primes.utm.edu/notes/faq/NextMersenne.html]Wagstaff[/url]] indicate a pretty clear probability distribution, with the ratio of successive prime exponents averaging ~1.5...[/QUOTE]
Thanks for this. I had a quick look on the Prime Pages, but couldn't find this bit on the distribution of Mersenne primes. Serves me right for posting at work. |
[QUOTE=jinydu;122113]I think petrw1's estimate of Moore's Law holding for GIMPS is far too optimistic. Indeed, the data in his post goes against it.
This is a growth rate of less than 10 percent. Assuming exponential growth, it would take over 7.6 years for speed to double. My experience, if I remember correctly, from reading Primenet's main page is that speed has been growing much more slowly (if at all) since late 2005. By the way, where did he get the data for the total amount of work done in 2006 and 2007?[/QUOTE] I considered adding a second scenario based on the 2006 to 2007 increase --- yes, much below Moore's law --- but didn't, mostly because I thought one year was a pretty small sample size on which to predict the future. As well, there is a lot of buzz about the quads now, so I am somewhat optimistic that in 2008 there will be a bigger increase. As to my source ... I have been printing the status pages ( [url]http://www.mersenne.org/status.htm[/url] ) every month since October 2005, as close to month end as George's updates permit. So I can tell you that on Oct 25, 2005 there were a little over 24 million P90 years remaining; 23,864,460 as of Dec 28, 2005; 22,976,795 as of Dec 31, 2006; and 22,004,934 as of Dec 31, 2007. |
I also keep some status pages, here is my contribution:
[U]Date[/U] - [U]Yrs remaining[/U]
03FEB2002 - 27,338,916 (the oldest I have)
01JAN2003 - 26,531,251
31DEC2003 - 25,701,037
10JAN2005 - 24,601,120
01JAN2006 - 23,848,134

It looks like we are essentially maintaining our rate over the years, but I think that is due to the fact that we are searching for factors at ever-increasing bit levels, which means that much more work is needed to find them. This is particularly evident in LMH work, which accounts for the majority of the decrease in the "number of years remaining". So in fact we are investing a lot more power in the search.

As the bit levels increase, though, I am not sure whether we will be able to "keep the pace". As an example, going from 64 to 65 bits takes a long time, much more than double the time needed to go from 63 to 64 bits. Hence, when the LMHers start hitting that wall, the increase in computer power will probably not be sufficient to allow us to keep reducing the "years remaining" at the rate we currently are. |
Does the increase in factoring difficulty really make that much of a difference? I thought that the bulk of GIMPS' computing power is being devoted to LL testing.
Also, does this mean that the statistics do not accurately take into account the increased bit levels required for larger exponents? |
[QUOTE=Mini-Geek;122037]I doubt the first 10M-digit prime will be anything but a Mersenne. The largest known prime has been a Mersenne prime since digital computers began, and currently the list of largest primes is solid Mersennes through number 6, while the largest known k*2^n+c prime is under 4M digits. (No, correlation does not imply causation, but there is indeed a cause here, namely the extremely efficient LL test.)
Technically, Mersennes do fit your k*2^n+c form, with k=1 and c=-1, but I know that's not what you meant. Is there a specific DC project you think will find it? Are they testing many 10M-digit candidates? How long does each number take?[/QUOTE] I believe it's already been found, by someone in the prime community, someone who's proven themselves to be knowledgeable about this stuff. According to them the doublecheck has been running since the middle of December. When I say soon, I mean before Valentine's Day, which is February 14th. |
[QUOTE=jasong;122204]I believe it's already been found, by someone in the prime community, someone who's proven themselves to be knowledgeable about this stuff. According to them the doublecheck has been running since the middle of December.
When I say soon, I mean before Valentine's Day, which is February 14th.[/QUOTE] You have obtained inside information? I can't find anything on a new prime record from 2007. |
When GIMPS finds a residue of zero, primenet announces it
as "prime unverified" although they don't disclose the exponent. What is your source Jason? |
[QUOTE=jinydu;122187]Does the increase in factoring difficulty really make that much of a difference? I thought that the bulk of GIMPS' computing power is being devoted to LL testing.
[/QUOTE] And in fact it is. But in terms of "years remaining", the main decrease has been due to LMH, as the numbers eliminated by their work have been far larger than the ones eliminated by Primenet. Back in 2003, I was doing LMH work with a Pentium III 700 MHz, and it was going much faster than a Pentium M, or even an Athlon 64 at 2.2 GHz, can manage today. The bit levels back then were at 57 and 58! Trial factoring a number from 58 to 59 bits takes roughly 32 times less work than from 63 to 64. Even today, I test on average 140 exponents from 62 to 63 bits, in the 68M range, on my Pentium 4. I find on average 2 factors a day, which amounts to roughly 60 to 65 P-90 CPU years. That is far more than what we can expect from LL tests!

As the gap between the leading edge of Primenet first-time LL testing and the ranges worked by LMH narrows, this effect will be less apparent. But undoubtedly the increase in bit levels will hold back the throughput of LMH, and will be reflected in the "years remaining" figures, unless we get a significant increase in the resources devoted to the LMH project. |
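The 32x figure follows from trial-factoring work roughly doubling with each bit level: the candidate factors to scan between 2^b and 2^(b+1) number about as many as at all previous levels combined. A minimal sketch of that arithmetic:

```python
def tf_cost_ratio(bit_a: int, bit_b: int) -> int:
    """How much more work the bit level starting at bit_a costs vs. bit_b.

    Assumes the work for level b -> b+1 is proportional to 2**b,
    i.e. it doubles with each additional bit.
    """
    return 2 ** (bit_a - bit_b)

print(tf_cost_ratio(63, 58))  # -> 32: 63->64 costs 32x as much as 58->59
print(tf_cost_ratio(64, 63))  # -> 2: each level doubles the one before it
```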
I forgot about that. :blush:
Of course, trying to measure the actual amount of computing work done by GIMPS using the P90 CPU Years remaining figures from the status pages gives an overestimate. |
[quote=jasong;122204]I believe it's already been found, by someone in the prime community, someone who's proven themselves to be knowledgeable about this stuff. According to them the doublecheck has been running since the middle of December.
When I say soon, I mean before Valentine's Day, which is February 14th.[/quote] Although you profess to recognize bullshit when you see it, I (for one) do not intend to proliferate this rumour on the strength of your say-so. |
[quote=jasong;122204]I believe it's already been found, by someone in the prime community, someone who's proven themselves to be knowledgeable about this stuff. According to them the doublecheck has been running since the middle of December.
When I say soon, I mean before Valentine's Day, which is February 14th.[/quote] Three months for a doublecheck, which would surely be run on the fastest computer available? Just how long does each candidate normally take, and how were they able to find one before GIMPS, when we have so many resources and can routinely run a number in a month? Do they have a Blue Gene for everything except the double check? Assuming this isn't a complete lie, perhaps the double check will finish on my b-day, which is just before Valentine's Day. If it's real, why wasn't it announced anywhere that one was discovered and that a doublecheck was to be run, so that faster hardware could run it in under three months? |
[quote=jinydu;122223]I forgot about that. :blush:
Of course, trying to measure the actual amount of computing work done by GIMPS using the P90 CPU Years remaining figures from the status pages gives an overestimate.[/quote] The current throughput of 2000 P90 years per day works out at 730,000 P90 years per year. The average computer participating in GIMPS (there are ~73,000 of them) is only doing 10 times as much as a P90 working 24/7:sad: |
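The throughput arithmetic above, spelled out (numbers from the post; "P90" is the Pentium-90 benchmark unit GIMPS uses):

```python
per_day = 2_000                # GIMPS throughput in P90 years per day
per_year = per_day * 365       # 730,000 P90 years per calendar year
machines = 73_000              # approximate number of participating computers
speedup = per_year / machines  # average machine vs. one P90 running 24/7
print(per_year, speedup)       # -> 730000 10.0
```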
I've just calculated this:
Up to 40M: 63% of prime-exponent Mersennes factored
40M-50M: 59.5% factored (GIMPS has reached ~45M)
50M-80M: 57% factored

This tells us how much trial factoring will help with the P90 years remaining.
David |
[QUOTE=lycorn;122176]I also keep some status pages, here is my contribution:
[U]Date[/U] - [U]Yrs remaining[/U]
03FEB2002 - 27,338,916 (the oldest I have)
01JAN2003 - 26,531,251
31DEC2003 - 25,701,037
10JAN2005 - 24,601,120
01JAN2006 - 23,848,134

It looks like we are essentially maintaining our rate over the years, but I think that is due to the fact that we are searching for factors at ever-increasing bit levels, which means that much more work is needed to find them. This is particularly evident in LMH work, which accounts for the majority of the decrease in the "number of years remaining". So in fact we are investing a lot more power in the search. As the bit levels increase, though, I am not sure whether we will be able to "keep the pace". As an example, going from 64 to 65 bits takes a long time, much more than double the time needed to go from 63 to 64 bits. Hence, when the LMHers start hitting that wall, the increase in computer power will probably not be sufficient to allow us to keep reducing the "years remaining" at the rate we currently are.[/QUOTE] Very linear indeed!!! At this linear rate, it will be done in 2032. |
[quote=petrw1;122441]Very linear indeed!!! .... It will be done in 2032 at this linear rate.[/quote]
But as lycorn pointed out, the reason the decrease in 2002 was as big as in 2007 was the much easier factoring by LMH in the ranges beyond GIMPS. As my figures show, we can expect GIMPS factoring to reduce the "status unknown" fraction in the 50M-80M range from its current 43% down to 37%, at which point further factoring becomes more expensive than LL testing and checking. See [URL]http://mersenne.org/ips/stats.html[/URL] |
[QUOTE=davieddy;122454]But as lycorn pointed out, the reason the decrease in 2002 was as big as in 2007 was the much easier factoring by LMH in the ranges beyond GIMPS. As my figures show, we can expect GIMPS factoring to reduce the "status unknown" fraction in the 50M-80M range from its current 43% down to 37%, at which point further factoring becomes more expensive than LL testing and checking. See [URL]http://mersenne.org/ips/stats.html[/URL][/QUOTE] However, each higher exponent takes a little less time to factor to a specific limit. This suggests that at the points where exponents are factored one bit deeper than those just below them, a few more factors should be found. BUT.... are you considering this and still noting that the percentage is dropping anyway? |
[quote=petrw1;122485]However, each higher exponent takes a little less time to factor to a specific limit. This suggests that at the points where exponents are factored one bit deeper than those just below them, a few more factors should be found.
BUT.... are you considering this and still noting that the percentage is dropping anyway?[/quote] Here are the percentages of Mersennes factored by range:

0-15M: 63.54%
15M-17.5M: 62.59%
17.5M-20M: 62.87%
20M-25M: 63.04%
25M-30M: 62.67%
30M-35M: 62.64%
35M-40M: 62.41%

It is hardly over-extrapolating to guess that from 40M-80M the percentage factored before LL testing takes place will be close to 62.5% (5/8). As your observations show, it is remarkable that the GIMPS factoring limits happen to result in this nearly constant fraction factored.
David |
[quote=davieddy;122494]As your observations show, it is remarkable that GIMPS factoring
limits happen to result in this constant fraction factored. David[/quote] OTOH these figures represent a "narrow" range of exponents in a logarithmic sense: ~2^23 to 2^25. |
The interesting thing to point out about trial factoring is that although it cuts down on LL testing, it doesn't affect the expected number of primes found: there are fewer candidates to test, but each is more likely to be prime. IMHO that is.
David |