#1
Sep 2013
Perth, Au.
2·7² Posts
I calculated the chance of finding a prime for the "Top 20 Conjectures with 1k Remaining by Highest and Lowest Conjectured k" for their next 100k test ranges.

R620 has the highest chance, 11.42%, and I've started testing it. Of course, the higher the weight and the higher the range, the longer it takes to test.
#2
May 2007
Kansas; USA
29DE₁₆ Posts
#3
Just call me Henry
"David"
Sep 2007
Liverpool (GMT/BST)
2⁵·11·17 Posts
#4
May 2007
Kansas; USA
2×23×233 Posts
#5
Sep 2013
Perth, Au.
98₁₀ Posts
Quote:
http://www.noprimeleftbehind.net/cru...20.htm#Table25

As stated in my post, I only calculated the probabilities based on these tables:

- "Top 20 Conjectures with 1k Remaining by Highest Conjectured k": http://www.noprimeleftbehind.net/cru...0.htm#Table61; and,
- "Top 20 Conjectures with 1k Remaining by Lowest Conjectured k": http://www.noprimeleftbehind.net/cru...20.htm#Table62.

I've only spent a few hours looking at the CRUS website, so I might not be searching optimally yet. This table looks more comprehensive: http://www.noprimeleftbehind.net/cru...s-unproven.htm. I plan to start a 2k's search next, so maybe I'll base it on that table.

Anyway, since you brought it up: 32*702^n-1 has weight 2338, and 78*916^n-1 has weight 2313. They are both tested to 100k. If you test R702 to 200k, the chance of finding a prime, and so proving the conjecture, is 14.12%. If you test R916 to 200k, the chance is 13.43%.

My calculations are based on the Prime Number Theorem. I posted this method on PrimeGrid 6 months ago; no one has told me I am wrong (or right) yet:
http://www.primegrid.com/forum_thread.php?id=5093
http://www.primegrid.com/forum_thread.php?id=4935

Why don't you add probabilities to the CRUS tables? If you're looking for a result, rely on probability. If you're looking to see how quickly you can test a range, look at difficulty. Probability does not take into account the length of time to test an n, or the number of tests in a range, just the chance you'll find a great result.
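A Prime Number Theorem estimate of this kind can be sketched in Python. This is a reconstruction, not necessarily the poster's exact method: it assumes the weight is the Nash weight divided by 1751 (the normalization mentioned later in the thread) and reports the expected number of primes from the integral, which for small values is close to the probability of at least one prime. Under those assumptions it lands on the 14.12% and 13.43% figures quoted for R702 and R916; the function name is mine.

```python
import math

def expected_primes(k, b, n1, n2, weight):
    """Expected number of primes of the form k*b^n - 1 with n1 < n <= n2.

    PNT heuristic: a random number near N is prime with probability
    about 1/ln(N); scale by the candidate weight w and integrate:

        E = w * Int[n1..n2] dn / (ln k + n*ln b)
          = (w / ln b) * ln((ln k + n2*ln b) / (ln k + n1*ln b))
    """
    lk, lb = math.log(k), math.log(b)
    return weight * (math.log(lk + n2 * lb) - math.log(lk + n1 * lb)) / lb

# Assumed weight convention: Nash weight / 1751 (so w = 1 for an average k).
print(100 * expected_primes(32, 702, 100000, 200000, 2338 / 1751))  # ~14.12 (R702)
print(100 * expected_primes(78, 916, 100000, 200000, 2313 / 1751))  # ~13.43 (R916)
```

Because the range is integrated in closed form, one evaluation covers any nmin/nmax without splitting the range into parts.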
#6
May 2007
Kansas; USA
2×23×233 Posts
Oh, OK. I'll take a look at those links later today. Your percentages are in line with what I would expect. We have an "odds of prime" spreadsheet with formulas created by one of the math experts here. I'll check them against that.

I think I misunderstood you previously.

So we are in agreement that R702 and R916 have a better chance of prime by n=200K. Regardless, it doesn't matter to us which base you test. I just wanted you to realize that there are bases with a better chance of prime by n=200K than the one you chose.

Gary

Last fiddled with by gd_barnes on 2013-09-29 at 18:37
#7
May 2007
Kansas; USA
2·23·233 Posts
I looked at your links, and I don't know enough math to verify them one way or the other.

According to the attached "odds of prime" spreadsheet, which I created from formulas given by one of our math experts here (ID axn), here are the chance-of-prime percentages I came up with for a sieve depth of P=5T, which is how far our 3 files have been sieved:

Code:
Base   # tests   % chance
R620   3875      19.5%
R702   4896      23.7%
R916   5216      24.2%

I'm not sure why you are showing a lower chance of prime for R916 vs. R702. With bases this high, the difference in base size has little impact on the % chance of finding a prime. For instance, if base R702 had 5216 tests like R916 does, R702 would have a 24.9% chance of prime (vs. 24.2% for R916), so you can see there is not much difference in prime chance when one base is only 30% bigger than another, all other things being equal.

Edit: If you are using only a Nash weight to compute your chances of prime, that may explain the problem. Nash weight only works off of a sieve to P=511. Obviously a sieve to P=5T is going to be much more accurate. For our "difficulty" stats, our programming guru, Mark (rogue), uses a sieve to P=1M, which is very clearly accurate enough for determining such a stat.

Last fiddled with by gd_barnes on 2013-09-29 at 23:24 Reason: edit
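For comparison, numbers of this size can be reproduced without the spreadsheet. The sketch below is not axn's formula (which is not shown in the thread) but the standard Mertens-theorem estimate: a candidate that survives sieving to depth P is prime with probability about e^γ·ln(P)/ln(N). The even spacing of surviving n across 100K–200K, the neglect of ln(k), and the function name are all my assumptions; with them, the results land within about 0.1 percentage points of the table above.

```python
import math

EULER_GAMMA = 0.5772156649015329

def chance_after_sieving(b, n1, n2, num_tests, sieve_depth):
    """Chance that at least one sieved candidate k*b^n - 1, n in [n1, n2],
    is prime.  Mertens' theorem: a candidate surviving trial factoring to
    depth P is prime with probability ~ exp(gamma) * ln(P) / ln(k*b^n).
    Simplifications: ln(k) is ignored and the surviving n are assumed
    evenly spread across the range."""
    scale = math.exp(EULER_GAMMA) * math.log(sieve_depth)
    miss = 1.0  # probability that every candidate is composite
    for i in range(num_tests):
        n = n1 + (n2 - n1) * (i + 0.5) / num_tests
        miss *= 1.0 - scale / (n * math.log(b))
    return 1.0 - miss

# Compare with the spreadsheet's 19.5% / 23.7% / 24.2% at P=5T:
for base, tests in [(620, 3875), (702, 4896), (916, 5216)]:
    print(base, 100 * chance_after_sieving(base, 100000, 200000, tests, 5e12))
```

This also shows why a Nash weight alone underestimates accuracy: the sieve depth enters only through ln(P), so moving from P=511 to P=5T changes the per-candidate chance substantially.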
#8
Just call me Henry
"David"
Sep 2007
Liverpool (GMT/BST)
5984₁₀ Posts
At a quick glance, everything looks correct with your calculations. I will look to extend the odds of prime spreadsheet at some point with your ideas for ranges of n.
#9
Sep 2013
Perth, Au.
2×7² Posts
The probabilities you show for R620, R702 and R916 are about 1.73× higher than my values. That is significant, and with all the crunching CRUS has done over the years it would surely have shown up by now.

That makes me go back to my initial assumptions, and one I may have made in error is that base 2 has the same density of primes as a randomly chosen set of odd numbers of the same magnitude. I divide the Nash weight by 1751 to give w=1, but maybe I should be dividing by some other value, 1751/1.73 = 1012 for instance. I am happy with the rest of the maths. So if I increase the sieve depth (to increase accuracy) and properly scale the weight, my equation should be all good.

From what you've said, the spreadsheet only allows for an average n, so it might only be accurate over small ranges, requiring many calculations by parts to remain accurate. My equation is the result of an integration, so it remains accurate over any range and doesn't require more than one calculation. Plus you can rearrange the equation and solve for other variables, which is really cool.

I'll have to go over your odds of prime spreadsheet and see what's going on there.
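One example of the rearranging described above: if the estimate has the integrated form E = (w/ln b)·ln((ln k + n2·ln b)/(ln k + n1·ln b)), it can be inverted in closed form to ask "how far must I test to expect E primes?". The form of the equation is my reading of the thread, and the function name is mine.

```python
import math

def n_needed(k, b, n1, weight, target):
    """Invert E = (w/ln b) * ln((ln k + n2*ln b)/(ln k + n1*ln b)) for n2:
    the test limit at which the expected prime count for k*b^n - 1,
    starting from n1, reaches `target`."""
    lk, lb = math.log(k), math.log(b)
    return ((lk + n1 * lb) * math.exp(target * lb / weight) - lk) / lb

# For R702 (32*702^n-1 tested to 100K, weight 2338/1751), an expected
# count of ~0.1412 should correspond to a limit near n = 200000:
print(n_needed(32, 702, 100000, 2338 / 1751, 0.14121))
```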
#10
May 2007
Kansas; USA
2·23·233 Posts
I agree that the spreadsheet needs some sort of integration (calculus) so that it is accurate over a wide n-range. I wouldn't begin to claim to know how that might be done, especially in Excel. The spreadsheet was originally designed for a large range of k with millions of candidates over a small range of n. For that it is highly accurate. Even with nmax/nmin = 2, it's not too far off.

The "calculation 1" and "calculation 2" formulas are the key. You may need to contact axn here to ask how he came up with them. There also might be a couple of people here at CRUS who have insight into how they were derived.
#11
Account Deleted
"Tim Sorbera"
Aug 2006
San Antonio, TX USA
7·13·47 Posts