mersenneforum.org  

Old 2015-11-17, 16:23   #1
CRGreathouse
Aug 2006
13464_8 Posts

Betting on Mersennes -- update #1

Five years ago I bet axn 25¢ that GIMPS wouldn't discover 8 Mersenne primes by Jan 1 2030. In that time we've discovered one new prime and moved the wavefront from 31M to 58M.

Following a simple exponential model, this suggests we'd hit the original expectation of 675M in mid-2034 (with the implicit Moore doubling every ~27 months). But having only found one prime so far, the expectation is now 884M.
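
Unpacking those two figures (a rough sketch only, using the round numbers above, so the exact dates shift with the endpoints chosen): if the wavefront grows exponentially, going from 31M to 58M in roughly five years and continuing at that rate, then reaching the original 675M target takes about
\[5\cdot\frac{\log(675/58)}{\log(58/31)}\approx 19.6\]
more years, i.e. somewhere around 2034-2035 depending on the exact figures used. And with 7 primes still to find, the 2^{e^{-\gamma}}-per-prime heuristic from the quoted exchange below puts the expected search depth at roughly
\[58\mathrm{M}\cdot 2^{7e^{-\gamma}}\approx 58\mathrm{M}\cdot 15.2\approx 884\mathrm{M}.\]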

Of course 2030 is still far off, and many technologies will be discovered between now and then. Quantum computation would be particularly disruptive: with Beauregard's circuit for Shor's algorithm, 'only' 1.8 billion qubits would be needed to factor all the Mersenne candidates up to the above bound, leaving (whp) just the primes for MM to verify. But I don't expect quantum computing to be practical by 2030 -- I'd be thrilled with ten thousand reliable, nondecohering qubits.

Quote:
Originally Posted by axn View Post
It will drop off from that list once GIMPS has found 8 more primes.
Quote:
Originally Posted by CRGreathouse View Post
Hmm. 8e^{-\gamma}\approx4.49, and two to that power is about 22.5, so that's expected to occur when we're working with Mersenne exponents about 22.5 times larger. If we're around 30 million right now, that would be 675 million.

Assuming (for simplicity) that the work needed to test an exponent is proportional to the square of the exponent and that all prime exponents are tested, that would require about 10,000 times the total effort of GIMPS to date. If half of that effort happened in the past two years, and GIMPS' computing power doubles every two years (by some combination of Moore's law and recruitment), this would require 2 lg(ln(2) * 20,001) or 27.5 years. Assuming instead that it doubles every two years for a decade and then holds constant, it would take about 2 * 20000/2^5 + 10 = 1260 years. So the timeline depends strongly on the assumptions made.
Quote:
Originally Posted by axn View Post
GPUs are the game changers. They will, in fact, keep pace with Moore's law, and bump the constants. I expect 2 decades (i.e before the end of 2030) for the necessary 8 primes to pop out.
Quote:
Originally Posted by CRGreathouse View Post
I'll bet you a (post-inflation) quarter* that we won't have the 8 by 2030.

I happily concede that they're game-changing, but I'd be surprised if their performance doubled even every three years after, say, 2025.

* With 2-3% inflation it would only be worth 13 to 17 cents in 2010 dollars by the time I'd be able to collect.
Quote:
Originally Posted by axn View Post
You're on
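
For reference, the arithmetic in the quoted 2010 estimates can be reconstructed along the following lines (a sketch under assumptions that may differ from the original: current yearly throughput c, the last two years' work taken as roughly 2c, so total effort to date about 4c, and future throughput growing as c*2^(t/2)):
Code:
from math import log, log2

# Assumed model (one consistent reading of the quoted estimate):
# current yearly throughput = c (arbitrary units); the last two years did
# about 2c of work, which was half of everything so far, so the effort to
# date is about 4c.  Ten thousand times that is needed: 40,000c.
work_needed = 10_000 * 4                      # in units of c

# Scenario 1: throughput keeps doubling every two years.
# Work done by year T is the integral of 2^(t/2), i.e. (2/ln 2)(2^(T/2) - 1).
T1 = 2 * log2(work_needed * log(2) / 2 + 1)
print(f"doubling indefinitely: about {T1:.1f} years")             # ~27.5

# Scenario 2: doubling for a decade, then flat at 2^5 = 32c per year.
decade_work = (2 / log(2)) * (2 ** 5 - 1)     # ~89c done in the first decade
T2 = 10 + (work_needed - decade_work) / 2 ** 5
print(f"doubling for a decade, then flat: about {T2:.0f} years")  # ~1257
Both numbers land close to the quoted 27.5 and roughly 1260 years.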
CRGreathouse is offline   Reply With Quote
Old 2015-11-18, 03:17   #2
LaurV
Romulan Interpreter
Jun 2011
Thailand
8,963 Posts

Ha! This does not consider new theoretical developments, like "LaurV found a new algorithm for squaring and found the next 10 Mersenne primes overnight". Or, why bother with squaring at all, say "LaurV found a heuristic that picks out the exponents leading to Mersenne primes directly".

(or "LaurV found that the number of Mersenne primes is bounded and we have already found all of them, and we are wasting our time here in vain" )
LaurV is offline   Reply With Quote
Old 2015-11-18, 03:56   #3
axn
Jun 2003
1001010110110_2 Posts

So far, CRG appears to be "winning". The DP performance of GPUs has stagnated or even regressed, especially in the NVidia camp. This appears to be not so much a technological issue as a market-segmentation issue -- NVidia seems to be protecting its scientific computing market by gutting the consumer GPUs.

Something like a Titan coupled with HBM should be a killer LL GPU, but it doesn't look like anything of that sort will happen soon. HBM will eventually come to the NVidia camp as well, but the DP story might not change.

Perhaps we should look at some NTT-based software for LL. Recently there has been significant progress in Generalized Fermat testing with this approach.

All in all, I'm still waiting for that game-changing technology to emerge.
axn is offline   Reply With Quote
Old 2015-11-18, 05:06   #4
Batalov
"Serge"
Mar 2008
Phi(4,2^7658614+1)/2
2^5·7·41 Posts

If I were you, I would double down!
Batalov is offline   Reply With Quote
Old 2015-11-18, 17:08   #5
CRGreathouse
Aug 2006
1734_16 Posts

Quote:
Originally Posted by axn View Post
Perhaps we should look at some NTT-based software for LL. Recently, there has been some significant development in the Generalized Fermat testing with this approach.

All in all, I'm still waiting for that game-changing technology to emerge
This is a bet I would be very happy to lose! Practical quantum computers, game-changing technology, much-improved algorithms, etc. -- I'd love to see it.
CRGreathouse is offline   Reply With Quote
Old 2015-11-20, 23:42   #6
jasong
"Jason Goatcher"
Mar 2005
110110110001_2 Posts

Quote:
Originally Posted by CRGreathouse View Post
This is a bet I would be very happy to lose! Practical quantum computers, game-changing technology, much-improved algorithms, etc. -- I'd love to see it.
I know it can be difficult to comprehend the idea of an entire society being spoiled rotten, but that's what the computing industry has done. From what I've heard, most industries that can't be improved through computation only get about 2 to 3% more "awesome" per year, and that can fluctuate drastically depending on the industry. Take the food industry over the last couple hundred years: we've got food pre-cooked in cans, and cereal, and all sorts of awesomeness at the supermarket, and then we added things like health codes and the previously mentioned pre-cooking, since food has gotten cheap enough that we can afford the extra awesomeness. So food has gotten quite a bit more awesome over the centuries, but its improvement has been that of a "normal" industry.

My point is that an industry doubling in "awesomeness" every 2-3 years, or even every 5-10 years, is unheard of outside of electronics. Assuming I'm correct about the 2-3% improvement for most industries, and also assuming the electronics industry has gone through ~25 doublings since the transistor was invented, I wonder what we get if we solve the equation 1.03^x = 2^25 (about 33.5 million)?
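
For what it's worth, taking that equation at face value (just the arithmetic, nothing more):
\[1.03^x = 2^{25} \iff x = \frac{25\log 2}{\log 1.03} \approx 586,\]
so at 3% a year it would take roughly six centuries to accumulate the same 25 doublings.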

Assuming I didn't flub the math, if the computing industry had behaved like other industries, then the improvement over the past 50 years would've been about 12-16%.

We truly live in fantastic times.

Edit: I'd forgotten about the genetics industry, although a lot of that simply involves automated processing, so it's at least loosely connected to the computing industry.

Last fiddled with by jasong on 2015-11-20 at 23:43
jasong is offline   Reply With Quote
Old 2016-03-22, 19:01   #7
Marius Titulesc
 
Mar 2016

2 Posts

Indeed we do. It's hard to imagine what the future will be like. I will go out on a limb and say we're not ready for it.
______________________________
Marius

Last fiddled with by Batalov on 2016-03-23 at 01:39 Reason: irrelevant URL links removed
Marius Titulesc is offline   Reply With Quote
Old 2016-03-30, 20:48   #8
CRGreathouse
Aug 2006
2^2×3^3×5×11 Posts

At the time of the original bet 47 Mersenne primes were known; the bet asks if 55 will be known by the start of 2030.

The retroactive discovery (two months before this thread) of 2^74,207,281 - 1 does tip the scales, so I thought I'd recalculate. Only six more Mersenne primes are needed, though four months have passed and the wavefront has pushed forward. With it currently at 63,350,927, the expected search depth needed for the next 6 Mersenne primes is 654,409,841, which means that
\[k=\frac{\sum_{p\le654409841}p^2\log p\log\log p}{\sum_{p\le63350927}p^2\log p\log\log p} \approx 1300\]
times the current effort should be needed to finish. This corresponds to a Moore doubling every 16 months, if I've computed correctly: implicitly define x by
\[\int_{2010}^{2030}x^tdt=k\int_{2010}^{2016.25}x^tdt\]
then computer speed needs to double every
\[\frac{12\log2}{\log x}\approx16\] months.
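
A quick numerical sanity check of those last two steps (a sketch only: it takes the k of roughly 1300 above as given rather than recomputing the prime sums, and the bisection bounds are arbitrary):
Code:
from math import exp, log

gamma = 0.5772156649015329            # Euler-Mascheroni constant

# Expected search depth for the next 6 Mersenne primes, using the
# 2^(e^-gamma)-per-prime heuristic: wavefront * 2^(6 e^-gamma).
wavefront = 63_350_927
depth = wavefront * 2 ** (6 * exp(-gamma))
print(f"expected search depth: about {depth:,.0f}")          # ~654 million

# Solve int_2010^2030 x^t dt = k * int_2010^2016.25 x^t dt for x.
# Dividing out x^2010 reduces this to (x^20 - 1) / (x^6.25 - 1) = k.
k = 1300
f = lambda x: (x ** 20 - 1) / (x ** 6.25 - 1) - k
lo, hi = 1.01, 3.0                    # f is increasing; the root lies between
for _ in range(100):                  # plain bisection
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if f(mid) < 0 else (lo, mid)
x = (lo + hi) / 2
print(f"required doubling time: about {12 * log(2) / log(x):.1f} months")  # ~16
The printed doubling time comes out just over 16 months, matching the figure above.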
CRGreathouse is offline   Reply With Quote
Old 2016-03-30, 21:23   #9
science_man_88
"Forget I exist"
Jul 2009
Dumbassville
2^6·131 Posts

Quote:
Originally Posted by CRGreathouse View Post
Five years ago I bet axn 25¢ that GIMPS wouldn't discover 8 Mersenne primes by Jan 1 2030. In that time we've discovered one new prime and moved the wavefront from 31M to 58M.

Following a simple exponential model, this suggests we'd hit the original expectation of 675M in mid-2034 (with the implicit Moore doubling every ~27 months). But having only found one prime so far, the expectation is now 884M.

Of course 2030 is still far off, and many technologies will be discovered between now and then. Quantum computation would be particularly disruptive: with Beauregard's circuit for Shor's algorithm, 'only' 1.8 billion qubits would be needed to factor all the Mersenne candidates up to the above bound, leaving (whp) just the primes for MM to verify. But I don't expect quantum computing to be practical by 2030 -- I'd be thrilled with ten thousand reliable, nondecohering qubits.
That doesn't consider http://mersenneforum.org/showthread.php?t=21158 being true, does it?
science_man_88 is offline   Reply With Quote
Old 2016-03-31, 13:37   #10
CRGreathouse
Aug 2006
2^2·3^3·5·11 Posts

Quote:
Originally Posted by science_man_88 View Post
That doesn't consider http://mersenneforum.org/showthread.php?t=21158 being true, does it?
It doesn't, correct. If you like, you can redo the calculations above with 4580686291 in place of 654409841.
CRGreathouse is offline   Reply With Quote
Old 2016-03-31, 13:41   #11
science_man_88
"Forget I exist"
Jul 2009
Dumbassville
8384_10 Posts

Quote:
Originally Posted by CRGreathouse View Post
It doesn't, correct. If you like you can redo the calculations above with 4580686291 in place of 654409841.
You know I'm not smart enough to do half the calculations you posted.
science_man_88 is offline   Reply With Quote