mersenneforum.org  

2017-07-27, 19:52   #1
pinhodecarlos

"Carlos Pinho"
Oct 2011
Milton Keynes, UK

2²·11·113 Posts

Prime Gap Searches Crowdfunding

Hi team,

Was wondering if we could create a crowdfunding campaign to buy either some big machines or individual 4/8-core machines (I don't know which option is best) so members could run them, as I think this is a long and interesting project. There's no way this can be "Boincified", so why not use the fund to supply members with computers, with the electricity bill covered by the members themselves?

What else can we do to grab people's attention to this project?
Any thoughts?

Carlos.

Last fiddled with by pinhodecarlos on 2017-07-27 at 19:56
2017-07-27, 20:32   #2
danaj

"Dana Jacobsen"
Feb 2011
Bangkok, TH

3²·101 Posts

We're moving along with, wild guess, 1 month per 1e18 range? That means there are only ~12 more ranges to go before we need a substantial change in the software. On the other hand, we'll have covered all 64-bit n values, which is a big plus.
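As a back-of-envelope check of the "~12 more ranges" estimate, here is the arithmetic spelled out. The current frontier and the 1-month pace are assumptions taken from the guess above, not project data:

```python
# Back-of-envelope check of the "~12 more ranges" estimate above.
# Assumptions (not project data): the search is complete through roughly
# 6e18, proceeds in 1e18-wide ranges, at ~1 month per range.
completed = 6 * 10**18          # assumed current frontier
limit = 2**64                   # the software's natural 64-bit ceiling
range_width = 10**18
months_per_range = 1            # the wild guess above

ranges_left = (limit - completed) / range_width
months_left = ranges_left * months_per_range

print(f"2^64 = {limit:.4e}")              # 1.8447e+19
print(f"ranges left: {ranges_left:.1f}")  # 12.4
```

So at that pace the whole 64-bit space is done in about a year, which is the point being made.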

Perhaps off-topic, but a common question at Q/A sites seems to be "what is the biggest prime such that we know all the primes less than it." The question meanders off onto questions about "knowing", storage, primality test speed, etc. But realistically I believe TOeS's Goldbach project did calculate all the primes up to 4e18, and the prime gaps were a secondary result. He also stored prime counts at various intervals, which is relevant to that question. In contrast, this project does not do this -- it purposefully skips swaths of unknown numbers which probably include some primes (but we've determined that we wouldn't get a large enough gap regardless so we leave them unknown).
2017-07-27, 21:48   #3
rudy235

Jun 2015
Vallejo, CA/.

3·337 Posts

Quote:
Originally Posted by danaj
We're moving along with, wild guess, 1 month per 1e18 range? That means there are only ~12 more ranges to go before we need a substantial change in the software. On the other hand, we'll have covered all 64-bit n values, which is a big plus.

Perhaps off-topic, but a common question at Q/A sites seems to be "what is the biggest prime such that we know all the primes less than it." The question meanders off onto questions about "knowing", storage, primality test speed, etc. But realistically I believe TOeS's Goldbach project did calculate all the primes up to 4e18, and the prime gaps were a secondary result. He also stored prime counts at various intervals, which is relevant to that question. In contrast, this project does not do this -- it purposefully skips swaths of unknown numbers which probably include some primes (but we've determined that we wouldn't get a large enough gap regardless so we leave them unknown).
Dana, one of the things I like about TOeS's work is the counting of prime twins, the π₂(x) function. As we know, TOeS had these stored. He also used it to verify the Goldbach conjecture up to 4e18.
This in turn was used to improve estimates of Brun's constant, B ≈ 1.902160583104.
So, in answer to the common question "what is the biggest prime such that we know all the primes less than it", I would assume that it is very close to 4e18 (perhaps as low as 4e18 + 1e9).
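As a toy illustration of the two quantities mentioned here: π₂(x) counts twin-prime pairs up to x, and Brun's constant is the sum of reciprocals of twin primes. The code below is a simplified sketch for tiny x only; the quoted value of B comes from extrapolating data like TOeS's, since the raw partial sum converges extremely slowly:

```python
# Toy illustration of pi_2(x) and the partial Brun sum (tiny x only --
# the real work needed sieving to 4e18, and B ~ 1.902160583104 comes from
# extrapolation, since the raw sum converges extremely slowly).
def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def twin_stats(x):
    # Twin pairs (p, p+2) with p <= x, and the sum of their reciprocals.
    twins = [p for p in range(2, x + 1) if is_prime(p) and is_prime(p + 2)]
    return len(twins), sum(1 / p + 1 / (p + 2) for p in twins)

pi2, partial = twin_stats(100)
print(pi2)        # 8 pairs, from (3, 5) up to (71, 73)
print(partial)    # ~1.33, still far below the limit 1.9021...
```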
2017-07-27, 22:53   #4
danaj

"Dana Jacobsen"
Feb 2011
Bangkok, TH

38D₁₆ Posts

Quote:
Originally Posted by rudy235
So, in answer to the common question "what is the biggest prime such that we know all the primes less than it", I would assume that it is very close to 4e18 (perhaps as low as 4e18 + 1e9).
If one believes Oliveira e Silva handled up to 4000e15, then it would be at least 4001e15, since I used primesieve to compute all the primes in that range. I did not store them, however, just the gaps over 800. I'm not sure I still have those files.
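The approach described (enumerate every prime, keep only the gaps above a threshold, discard the primes themselves) can be sketched at toy scale. A naive Eratosthenes sieve stands in for primesieve here, and the range and threshold are illustrative stand-ins for the real 4000e15..4001e15 scan with its gap-over-800 filter:

```python
# Sketch of "keep only gaps over a threshold" from a prime enumeration.
# A naive sieve stands in for primesieve; the limit and threshold are toy
# values (the actual run scanned 4000e15..4001e15 keeping gaps over 800).
def primes_up_to(n):
    is_p = bytearray([1]) * (n + 1)
    is_p[0:2] = b"\x00\x00"
    for i in range(2, int(n ** 0.5) + 1):
        if is_p[i]:
            is_p[i * i :: i] = bytearray(len(is_p[i * i :: i]))
    return [i for i in range(n + 1) if is_p[i]]

def large_gaps(limit, threshold):
    """Return (p, gap) for consecutive primes p, q <= limit with q - p > threshold."""
    ps = primes_up_to(limit)
    return [(p, q - p) for p, q in zip(ps, ps[1:]) if q - p > threshold]

print(large_gaps(2000, 30))   # [(1327, 34)] -- the only gap above 30 here
```

The storage saving is the whole point: below 2000 only one pair survives the filter, and at 4e18 the surviving list of gaps over 800 is tiny compared to the full set of primes.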

But any number one comes up with can be trivially extended by a billion in less time than it takes to update the post, making it very fluid.

Anyway, it was a thought about one result from the earlier gap computations. The gapcoin project makes a number of claims, such as "[...] lead to new breakthroughs in the bounded gap, it may also help proving the Twin Prime Conjecture and maybe even the millennium problem, the Riemann hypothesis." which is stretching things pretty severely. Finding new jumping champions helps bound the absolute gap size for some practical computations. It's possible the results will lead to some insight or data about gap size conjectures. It drives development of open source number theory software, which can be used for other tasks.
2017-07-27, 23:17   #5
rudy235

Jun 2015
Vallejo, CA/.

3×337 Posts

Quote:
Originally Posted by danaj

But any number one comes up with can be trivially extended by a billion in less time than it takes to update the post, making it very fluid.
That is a truth bigger than any cathedral ever built.

I would agree that extending the range a la TOeS would have some practical effects, though not the exaggerated ones claimed by the Gapcoin project, to wit: "Researches about prime gaps could not only lead to new breakthroughs in the bounded gap, it may also help proving the Twin Prime Conjecture and maybe even the millennium problem, the Riemann hypothesis".

But it would at the very least give useful insight into the distribution of twin primes, cousin primes, sexy primes, primes with gaps of 30, 210, 2310 and other primorials, etc.
2017-07-28, 07:29   #6
fivemack
(loop (#_fork))

Feb 2006
Cambridge, England

1100100101001₂ Posts

It would give numbers. I am far from convinced that it would give insight; having enumerations up to 2^64 instead of 2^62 burns an awful lot of natural gas in return for a tiny extension to a graph that analytic number theorists are absolutely confident of the shape of.

I would not approve 'push the limit up a bit' even were I a grant-giving body spending somebody else's money.

Last fiddled with by fivemack on 2017-07-28 at 07:30
2017-07-28, 07:52   #7
danaj

"Dana Jacobsen"
Feb 2011
Bangkok, TH

909₁₀ Posts

Quote:
Originally Posted by fivemack
It would give numbers. I am far from convinced that it would give insight; having enumerations up to 2^64 instead of 2^62 burns an awful lot of natural gas in return for a tiny extension to a graph that analytic number theorists are absolutely confident of the shape of.
I think I could agree with this. We're debating just how low the possibility of some insight is (my barrier to what "insight" means is also pretty low -- a bit more data to ponder and a very slight chance of showing one of the conjectures incorrect).

One thing that would be useful is to calculate remaining core-months or something. The point I was kind of trying to make earlier is that the whole thing may be over in a year. The amount of processing remaining is fairly low in some sense. The argument for continuing past 64-bit is more tenuous to me. It also has some significant programming challenges (e.g. we don't have a nice list of base-2 pseudoprimes already tabulated for us, just to name one). Those might be readily solved, perhaps not.

Quote:
I would not approve 'push the limit up a bit' even were I a grant-giving body spending somebody else's money.
Since we're on this forum: IMO this is more useful than GIMPS at this moment. I wouldn't allocate grant money to GIMPS either, unless I were spending it directly on software. I can't really see analytic number theorists (as opposed to computational ones) spending money on this for any reason other than personal money, for basically entertainment.

The Gapcoin project forum has regular discussion of how they want to get "math departments" interested in running Gapcoin mining software. It's a rather bizarre place, with a third of the people interested solely in ways to pump the price, a third who cannot form coherent thoughts but apparently the government is spying on them, and a third really wanting it to Help Science in some way while also mining coins.
2017-07-28, 08:38   #8
robert44444uk

Jun 2003
Oxford, UK

19·103 Posts

My two pence worth.

Back to the original question: I would not be a contributor to a crowdfund. Firstly, I'm a pensioner with limited funds, and secondly, one of the reasons I wanted to take up prime hunting was that it was basically a "free" hobby. That's not to say that "buying in" computer time is a bad thing, it's just not for me.

I thought about the TOeS project when thinking about this coordinated project, and there was a real temptation to consider an extension of the same. I decided against it in the end because I was primarily interested in first-occurrence large gaps.

In terms of return per effort, this project is basically rather thin gruel compared to others, so going beyond 2^64 will be a tough decision. I feel for Steve Cole, who has completed 850e15 with just one record, and that one may yet get taken out by a smaller candidate pair of primes. We have been pretty lucky in the 5e18 range; I am sure we will find 6e18 harder. I'm hopeful that there is a large gap in there somewhere which will emulate the 1132 find. And that is what drives me on.
2017-07-28, 10:16   #9
Antonio

"Antonio Key"
Sep 2011
UK

3²·59 Posts

Quote:
Originally Posted by robert44444uk
My two pence worth.

Back to the original question: I would not be a contributor to a crowdfund. Firstly, I'm a pensioner with limited funds, and secondly, one of the reasons I wanted to take up prime hunting was that it was basically a "free" hobby. That's not to say that "buying in" computer time is a bad thing, it's just not for me.
+1
I started gap searching as a way of learning to program in Perl, and I have enjoyed the search for large prime gaps. As a non-programmer and non-mathematician, it has all been very instructive.
As I only have the one computer (I don't count my laptop; laptops are not intended for heavy continuous use), I am already debating whether I should go back to my original large-gap search.
2017-07-28, 10:20   #10
henryzz
Just call me Henry

"David"
Sep 2007
Cambridge (GMT/BST)

5,923 Posts

Quote:
Originally Posted by pinhodecarlos
Hi team,

Was wondering if we could create a crowdfunding campaign to buy either some big machines or individual 4/8-core machines (I don't know which option is best) so members could run them, as I think this is a long and interesting project. There's no way this can be "Boincified", so why not use the fund to supply members with computers, with the electricity bill covered by the members themselves?

What else can we do to grab people's attention to this project?
Any thoughts?

Carlos.
What's stopping boincification?
2017-07-28, 16:47   #11
R. Gerbicz

"Robert Gerbicz"
Oct 2005
Hungary

2727₈ Posts

Quote:
Originally Posted by fivemack
It would give numbers. I am far from convinced that it would give insight; having enumerations up to 2^64 instead of 2^62 burns an awful lot of natural gas in return for a tiny extension to a graph that analytic number theorists are absolutely confident of the shape of.
And then what is your opinion about the GIMPS project? (From the GIMPS effort you can exclude the ECM factoring on smallish Mersenne/Fermat numbers.)

I'd say these are the biggest goals here: find maximal gaps. That is already achieved, though not yet proved; and if it turns out not to be a maximal gap, then we will have found another (smaller) maximal gap. Not a small success: in the previous 8 years there was no such find. An even larger goal: find a new record for g_n/log(p_n)^2, where g_n = p_{n+1} - p_n.
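To make that second goal concrete, here is the ratio evaluated for the 1132 gap mentioned earlier in the thread, alongside the more usual merit g_n/log(p_n). The starting prime below is the commonly cited one for that gap (Nyman, 1999); treat it as an assumption to double-check against the gap tables:

```python
import math

# The quantity Gerbicz names: g_n / log(p_n)^2, alongside the usual
# "merit" g_n / log(p_n), for the 1132 gap referred to above.
# Assumption: the commonly cited starting prime of that gap (Nyman, 1999).
p, gap = 1693182318746371, 1132

merit = gap / math.log(p)        # ~32.28
ratio = gap / math.log(p) ** 2   # ~0.92, the largest known value of this ratio

print(f"merit = {merit:.2f}")
print(f"ratio = {ratio:.2f}")
```

The ratio near 0.92 is why that find is hard to beat: a new record would need a gap close to log(p)^2 in size, far above the typical gap of log(p).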

Quote:
Originally Posted by fivemack
I would not approve 'push the limit up a bit' even were I a grant-giving body spending somebody else's money.
Agree, but we don't need it: with the latest code, all previous calculations (before our project) could be done in 3 years on a Core i7. Just for comparison: https://cs.uwaterloo.ca/journals/JIS...ly/nicely2.pdf. Nicely pushed the limit from 1e15 to 50e15 using eighty systems over 5 calendar years; we have somewhat faster code (you can also factor out the computers' speed).

The 64-bit limitation is a real bottleneck, though we are not there yet. You can't name many number theory problems that have been solved up to 2^64 where the difficulty scales like ~N.

Last fiddled with by R. Gerbicz on 2017-07-28 at 17:00 Reason: small math correction