mersenneforum.org Betting on Mersennes -- update #1

2015-11-17, 16:23   #1
CRGreathouse

Aug 2006

1732₁₆ Posts
Betting on Mersennes -- update #1

Five years ago I bet axn 25¢ that GIMPS wouldn't discover 8 Mersenne primes by Jan 1 2030. In that time we've discovered one new prime and moved the wavefront from 31M to 58M.

Following a simple exponential model, this suggests we'd hit the original expectation of 675M in mid-2034 (with the implicit Moore doubling every ~27 months). But having only found one prime so far, the expectation is now 884M.
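The 675M and 884M figures are consistent with the usual Wagstaff heuristic, under which the expected number of Mersenne primes with exponent in [x, y] is about e^γ·log₂(y/x). A quick sketch, assuming that heuristic is what's behind the quoted figures (the function name is mine):

```python
import math

def expected_bound(wavefront, primes_wanted):
    """Exponent bound by which `primes_wanted` more Mersenne primes are
    expected, assuming the Wagstaff heuristic: the expected count of
    Mersenne primes with exponent in [x, y] is about e^gamma * log2(y/x).
    Inverting gives y = x * 2^(primes_wanted * e^(-gamma))."""
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    return wavefront * 2 ** (primes_wanted * math.exp(-gamma))

# 8 primes from the original ~30M wavefront (the 2010 bet):
print(round(expected_bound(30e6, 8) / 1e6))  # -> 675

# 7 remaining primes from the 58M wavefront (one prime found since):
print(round(expected_bound(58e6, 7) / 1e6))  # -> 884
```

The same inversion with other inputs reproduces the per-post figures below (e.g. 6 primes from 63,350,927 gives about 654M).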

Of course 2030 is still far off, and many technologies will be discovered between now and then. Quantum computation would be particularly disruptive: with Beauregard's circuit for Shor's algorithm, 'only' 1.8 billion qubits would be needed to factor all the Mersenne candidates up to the above bound, leaving (whp) just the primes for MM to verify. But I don't expect quantum computing to be practical by 2030 -- I'd be thrilled with ten thousand reliable, nondecohering qubits.

Quote:
 Originally Posted by axn It will drop off from that list once GIMPS has found 8 more primes.
Quote:
 Originally Posted by CRGreathouse Hmm. $8e^{-\gamma}\approx4.49$, and two to that power is about 22.5, so that's expected to occur when we're working with Mersenne exponents about 22.5 times larger. If we're around 30 million right now, that would be 675 million. Assuming (for simplicity) that the work needed to test an exponent is proportional to the square of the exponent and that all prime exponents are tested, that would require about 10,000 times the total effort of GIMPS to date. If half of that effort happened in the past two years, and GIMPS' computing power doubles every two years (by some combination of Moore's law and recruitment), this would require 2 lg(ln(2) * 20,001) or 27.5 years. Assuming instead that it doubles every two years for a decade and then holds constant, it would take about 2 * 20000/2^5 + 10 = 1260 years. So the timeline depends strongly on the assumptions made.
Quote:
 Originally Posted by axn GPUs are the game changers. They will, in fact, keep pace with Moore's law, and bump the constants. I expect 2 decades (i.e before the end of 2030) for the necessary 8 primes to pop out.
Quote:
 Originally Posted by CRGreathouse I'll bet you a (post-inflation) quarter* that we won't have the 8 by 2030. I happily cede that they're game-changing, but I'd be surprised if their performance doubled even every three years after, say, 2025. * With 2-3% inflation it would only be worth 13 to 17 cents in 2010 dollars by the time I'd be able to collect.
Quote:
 Originally Posted by axn You're on
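The two back-of-envelope timelines in the quoted exchange can be re-run directly; a minimal sketch of the arithmetic exactly as stated there (effort needed = 20,000 units, where the last two years of GIMPS count as 1 unit):

```python
import math

# Model 1: computing power keeps doubling every 2 years.
# Time to accumulate 20,000 units, per the quoted formula.
years_doubling = 2 * math.log2(math.log(2) * 20001)
print(round(years_doubling, 1))  # -> 27.5

# Model 2: doubling for one decade (5 doublings), then constant
# at 2^5 units per 2-year period.
years_plateau = 2 * 20000 / 2**5 + 10
print(years_plateau)  # -> 1260.0
```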

 2015-11-18, 03:17 #2 LaurV Romulan Interpreter     Jun 2011 Thailand 2²×7×11×29 Posts Ha! This does not consider new theoretical developments, like "LaurV found a new algorithm for squaring and found the next 10 Mersenne primes overnight". Or, why bother with squaring, say "LaurV found a heuristic to pick the exponents that lead to Mersenne primes, directly" (or "LaurV found that the number of Mersenne primes is bounded and we have already found all of them, and we are wasting our time here in vain")
 2015-11-18, 03:56 #3 axn     Jun 2003 7×683 Posts So far, CRG appears to be "winning". The DP performance of GPUs has stagnated or even regressed, especially in the NVidia camp. This doesn't appear to be a technological issue so much as a market-segmentation issue -- NVidia appears to be protecting its scientific-computing market by gutting the consumer GPUs. Something like a Titan coupled with HBM should be a killer LL GPU, but it doesn't look like anything like that will happen soon. HBM will eventually come to the NVidia camp as well, but the DP story might not change. Perhaps we should look at some NTT-based software for LL. Recently, there has been some significant development in the Generalized Fermat testing with this approach. All in all, I'm still waiting for that game-changing technology to emerge.
 2015-11-18, 05:06 #4 Batalov     "Serge" Mar 2008 Phi(4,2^7658614+1)/2 9,161 Posts If I were you, I would double down!
2015-11-18, 17:08   #5
CRGreathouse

Aug 2006

2×2,969 Posts

Quote:
 Originally Posted by axn Perhaps we should look at some NTT-based software for LL. Recently, there has been some significant development in the Generalized Fermat testing with this approach. All in all, I'm still waiting for that game-changing technology to emerge
This is a bet I would be very happy to lose! Practical quantum computers, game-changing technology, much-improved algorithms, etc. -- I'd love to see it.

2015-11-20, 23:42   #6
jasong

"Jason Goatcher"
Mar 2005

5×701 Posts

Quote:
 Originally Posted by CRGreathouse This is a bet I would be very happy to lose! Practical quantum computers, game-changing technology, much-improved algorithms, etc. -- I'd love to see it.
 I know it can be difficult to comprehend the idea of an entire society being spoiled rotten, but that's what the computing industry has done. From what I've heard, most industries that can't be improved through computation only get about 2 to 3% more "awesome" per year, and that can fluctuate drastically depending on the industry. So with the food industry over the last couple hundred years, we've got food pre-cooked in cans and cereal and all sorts of awesomeness at the supermarket, and then we added things like health codes and the previously mentioned pre-cooking, since the food's gotten cheap enough that we can afford the extra awesomeness. So food's gotten quite a bit more awesome over the centuries, but its improvement has been that of a "normal" industry.

My point is that an industry doubling in "awesomeness" every 2-3 years, or even 5-10 years, is unheard of outside of electronics. Assuming I'm correct about the 2-3% improvement in most industries, and also assuming the electronics industry has gone through ~25 doublings since the transistor was invented, I wonder what we get if we solve the equation 1.03^x = 2^25 (≈33.5 million)?

Assuming I didn't flub the math, if the computing industry behaved like other industries, then the improvement over the past 50 years would've been about 12-16%.

We truly live in fantastic times.

Edit: I'd forgotten about the genetics industry, although a lot of that simply involves automated processing, so it's at least loosely connected to the computing industry.

Last fiddled with by jasong on 2015-11-20 at 23:43

 2016-03-22, 19:01 #7 Marius Titulesc   Mar 2016 216 Posts Indeed we do. It's hard to imagine what the future will be like. I will go out on a limb and say we're not ready for it. ______________________________ Marius Last fiddled with by Batalov on 2016-03-23 at 01:39 Reason: irrelevant URL links removed
 2016-03-30, 20:48 #8 CRGreathouse     Aug 2006 2×2,969 Posts At the time of the original bet 47 Mersenne primes were known; the bet asks if 55 will be known by the start of 2030. The retroactive discovery (two months before this thread) of $2^{74,207,281}-1$ does tip the scales, so I thought I'd recalculate. Only six more Mersenne primes are needed, though four months have passed and the wavefront has pushed forward. With it currently at 63,350,927, the expected search depth needed for the next 6 Mersenne primes is 654,409,841, which means that $k=\frac{\sum_{p\le654409841}p^2\log p\log\log p}{\sum_{p\le63350927}p^2\log p\log\log p} \approx 1300$ times the current effort should be needed to finish. This corresponds to a Moore doubling every 16 months, if I've computed correctly: implicitly define x by $\int_{2010}^{2030}x^tdt=k\int_{2010}^{2016.25}x^tdt$; then computer speed needs to double every $\frac{12\log2}{\log x}\approx16$ months.
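The implicit equation for x above can be solved numerically. A sketch by bisection, taking k = 1300 as given in the post (with r = ln x, the condition ∫ x^t dt over [2010, 2030] = k · ∫ x^t dt over [2010, 2016.25] reduces to e^{20r} − 1 = k(e^{6.25r} − 1)):

```python
import math

def doubling_months(k, t0=2010.0, t_mid=2016.25, t_end=2030.0):
    """Solve integral_{t0}^{t_end} x^t dt = k * integral_{t0}^{t_mid} x^t dt
    for x, then return the implied Moore doubling time 12*ln(2)/ln(x)
    in months.  Substituting r = ln(x) and dividing out x^{t0}/r, the
    condition becomes expm1(r*(t_end-t0)) = k * expm1(r*(t_mid-t0))."""
    a, b = t_end - t0, t_mid - t0
    f = lambda r: math.expm1(r * a) - k * math.expm1(r * b)
    lo, hi = 1e-9, 10.0   # f < 0 at lo (k*b > a), f > 0 at hi
    for _ in range(200):  # bisect to machine precision
        mid = (lo + hi) / 2
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 12 * math.log(2) / lo

print(round(doubling_months(1300)))  # -> 16
```

Re-running with a larger k (e.g. from the 4,580,686,291 bound mentioned below) shortens the required doubling time accordingly.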
2016-03-30, 21:23   #9
science_man_88

"Forget I exist"
Jul 2009
Dumbassville

8,369 Posts

Quote:
 Originally Posted by CRGreathouse Five years ago I bet axn 25¢ that GIMPS wouldn't discover 8 Mersenne primes by Jan 1 2030. In that time we've discovered one new prime and moved the wavefront from 31M to 58M. Following a simple exponential model, this suggests we'd hit the original expectation of 675M in mid-2034 (with the implicit Moore doubling every ~27 months). But having only found one prime so far, the expectation is now 884M. Of course 2030 is still far off, and many technologies will be discovered between now and then. Quantum computation would be particularly disruptive: with Beauregard's circuit for Shor's algorithm, 'only' 1.8 billion qubits would be needed to factor all the Mersenne candidates up to the above bound, leaving (whp) just the primes for MM to verify. But I don't expect quantum computing to be practical by 2030 -- I'd be thrilled with ten thousand reliable, nondecohering qubits.
that doesn't consider http://mersenneforum.org/showthread.php?t=21158 being true, does it?

2016-03-31, 13:37   #10
CRGreathouse

Aug 2006

2×2,969 Posts

Quote:
 Originally Posted by science_man_88 that doesn't consider http://mersenneforum.org/showthread.php?t=21158 being true does it ?
It doesn't, correct. If you like you can redo the calculations above with 4580686291 in place of 654409841.

2016-03-31, 13:41   #11
science_man_88

"Forget I exist"
Jul 2009
Dumbassville

20261₈ Posts

Quote:
 Originally Posted by CRGreathouse It doesn't, correct. If you like you can redo the calculations above with 4580686291 in place of 654409841.
you know I'm not smart enough to do half the calculations you posted.

