[QUOTE=gophne;476557]Hi jnml
Don't be so harsh on yourself. I know the feeling too.[/QUOTE]
> Quote:
> Originally Posted by gophne
> 5) Running the suspect algorithm for false negatives up to 1,000,000 none was found.
> This has been confirmed unintentionally by one of the senior contributors on the site that
> was running the algorithm in Pari, I think check some of the earlier posts.
> Fermat's Primality check has a problem with false negatives (Carmichael Numbers).

Please link the post you're talking about. Making claims that no one can verify is a typical crackpot tactic. Don't be a crackpot.
Maybe this will help as well
[url]http://www.mersenneforum.org/showthread.php?t=14901[/url]

[QUOTE=gophne;476516]
Danaj, did you read the link/paper that you posted? If you have, which I doubt, I think you simply "googled" a paper which you thought might support your argument.[/QUOTE] As my text indicated, I did read it. I spent many nights after work looking at the paper and a white board, trying to figure out how to speed up my SoE. I put a reference to the paper in my project documentation in 2012. GitHub backs me up on this. I was mainly pointing you to their matrices looking at arrangements of primes/composites/squares. I did a crapton of that on my whiteboard trying to figure out how to write code to exploit it. Which makes me really impressed with some of the people who internalize this more. 
[QUOTE=jnml;476555]Please link that post you're talking about, I'm not sure which you mean, thanks.[/QUOTE]
Hi jnml I was referring to your post #60 it seems.  0% negatives 
[QUOTE=jnml;476567]> Quote:
> Originally Posted by gophne
> 5) Running the suspect algorithm for false negatives up to 1,000,000 none was found.
> This has been confirmed unintentionally by one of the senior contributors on the site that
> was running the algorithm in Pari, I think check some of the earlier posts.
> Fermat's Primality check has a problem with false negatives (Carmichael Numbers).

Please link the post you're talking about. Making claims that no one can verify is a typical crackpot tactic. Don't be a crackpot.[/QUOTE] Hi jnml
Post #60 
[QUOTE=science_man_88;476568][url]http://www.mersenneforum.org/showthread.php?t=14901[/url][/QUOTE]
Hi science_man_88 Thanks for the link....the way things are going, soon I am going to become very math literate, and as a consequence, much more of a gentleman and a scholar! 
[QUOTE=gophne;476581]Hi jnml
I was referring to your post #60 it seems.  0% negatives[/QUOTE] [QUOTE=gophne;476582]Hi jnml Post #60[/QUOTE] [QUOTE=Batalov;475223] What is even more impressive is that all composite Mersenne numbers are 2PRPs (even if you don't know it)[/QUOTE] Read my lips. [B]Every[/B] 2^p-1 will be called prime by this method. You could just as well use this test in its equivalent form: [B]Input: 2^p-1. Return: prime![/B] That is why nobody uses the 2PRP method on Mersenne numbers. You get no new information. You run some calculation for some time and get a result: prime! My great test is so much better: [B]Input: 2^p-1. Return: prime![/B] Do you understand? Or do you need this repeated to you by five more people? They will! 
[QUOTE=gophne;476582]
Post #60[/QUOTE] That's not Pari code, that's Go code. No wonder I was not able to find it. However, your interpretation of the results is mistaken. The "algorithm" simply classified [I]every single candidate exponent in the range tested[/I] (below 10,000 BTW, not 1,000,000) as a probable prime, i.e. the code demonstrated it's completely useless. However, the side effect is that such a "test" really has no false negatives, admitted. Crossposted with #201 above. 
[QUOTE=gophne;476581]Hi jnml
I was referring to your post #60 it seems.  0% negatives[/QUOTE] [YOUTUBE]M8xlOm2wPAA[/YOUTUBE] 
[QUOTE=Batalov;476588]Read my lips.
[B]Every[/B] 2^p-1 will be called prime by this method. You could just as well use this test in its equivalent form: [B]Input: 2^p-1. Return: prime![/B] That is why nobody uses the 2PRP method on Mersenne numbers. You get no new information. You run some calculation for some time and get a result: prime! My great test is so much better: [B]Input: 2^p-1. Return: prime![/B] Do you understand? Or do you need this repeated to you by five more people? They will![/QUOTE] Noted. Give me some time to digest what you say. Regards 
[QUOTE=jnml;476591]That's not Pari code, that's Go code. No wonder I was not able to find it.
However, your interpretation of the results is mistaken. The "algorithm" simply classified [I]every single candidate exponent in the range tested[/I] (below 10,000 BTW, not 1,000,000) as a probable prime, i.e. the code demonstrated it's completely useless. However, the side effect is that such a "test" really has no false negatives, admitted. Crossposted with #201 above.[/QUOTE] I was about to comment that you must mean "no false negatives" and not "no false positives", which you have now corrected. 
[QUOTE=Batalov;476588]Read my lips.
[B]Every[/B] 2^p-1 will be called prime by this method. You could just as well use this test in its equivalent form: [B]Input: 2^p-1. Return: prime![/B] That is why nobody uses the 2PRP method on Mersenne numbers. You get no new information. You run some calculation for some time and get a result: prime! My great test is so much better: [B]Input: 2^p-1. Return: prime![/B] Do you understand? Or do you need this repeated to you by five more people? They will![/QUOTE] Hi Batalov
The algorithm does not return "positive" for all mersenne numbers 2^n-1.
Algorithm: 2^n-1 mod (n+2) = (n+1)/2...if true, [B](n+2) is prime[/B], else composite, [I]barring false positives :( [/I]
Check n = 5, 7, 9, 11 & 13...checking primality for [B](n+2)[/B]
n=05........2^n-1 mod (n+2) = 0031 mod [B]07[/B] = 03........= (n+1)/2..........True, hence (n+2) is Prime
n=07........2^n-1 mod (n+2) = 0127 mod [B]09[/B] = 01........<> (n+1)/2........False, hence (n+2) is Composite
n=09........2^n-1 mod (n+2) = 0511 mod [B]11[/B] = 05........= (n+1)/2..........True, hence (n+2) is Prime
n=11........2^n-1 mod (n+2) = 2047 mod [B]13[/B] = 06........= (n+1)/2..........True, hence (n+2) is Prime
n=13........2^n-1 mod (n+2) = 8191 mod [B]15[/B] = 01........<> (n+1)/2........False, hence (n+2) is Composite 
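For anyone who wants to try the table at home, the test in the post above can be sketched in a few lines of Python (the helper name `gophne_test` is mine; `pow(2, n, n + 2)` does the modular exponentiation without ever building the full Mersenne number):

```python
def gophne_test(n):
    """Declare n + 2 prime iff (2^n - 1) mod (n + 2) == (n + 1) / 2.

    n is an odd integer >= 3; this reproduces the table in the post.
    """
    return (pow(2, n, n + 2) - 1) % (n + 2) == (n + 1) // 2

# The five rows of the table: n = 5, 7, 9, 11, 13
for n in (5, 7, 9, 11, 13):
    print(n + 2, "prime" if gophne_test(n) else "composite")
```

This agrees with the table: 7, 11 and 13 come out "prime", while 9 and 15 come out "composite".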
[QUOTE=gophne;476610]Hi Batalov
The algorithm does not return "positive" for all mersenne numbers 2^n-1.
Algorithm: 2^n-1 mod (n+2) = (n+1)/2...if true [B](n+2) is prime[/B], else composite, [I]barring false positives :( [/I]
Check n = 5, 7, 9 & 11...checking primality for (n+2)
n=05........2^n-1 mod (n+2) = 0031 mod [B]07[/B] = 03........= (n+1)/2..........True, hence (n+2) is Prime
n=07........2^n-1 mod (n+2) = 0127 mod [B]07[/B] = 01........<> (n+1)/2........False, hence (n+2) is Composite
n=09........2^n-1 mod (n+2) = 0511 mod [B]11[/B] = 05........= (n+1)/2..........True, hence (n+2) is Prime!!!!!.....[B][I]False Prime[/I][/B]
n=11........2^n-1 mod (n+2) = 2047 mod [B]13[/B] = 06........= (n+1)/2..........True, hence (n+2) is Prime!!!!![/QUOTE] 11 is a prime. Check the case n = 339. 
[QUOTE=gophne;476610]Hi Batalov
The algorithm does not return "positive" for all mersenne numbers 2^n1[/QUOTE] Batalov's claim is that if n+2 = 2^p  1, your algorithm returns "prime" regardless of whether 2^p  1 is prime or composite. If n+2 is composite. For example, set n = 2045 and check that the test is passed 2^n1 mod (n+2) =? (n+1)/2 2^2045  1 mod 2047 =? 1023 1023 = 1023 but note that 2047 = 23*89. 
[QUOTE=science_man_88;476611]11 is a prime. Check the case n= 339[/QUOTE]
Hi science_man_88 Sorry, "11" was a logic error, I corrected my post for that subsequently. Correct. For n=339, (n+2)=341 returns as a false prime (1st false prime of the algorithm). There are 750 false primes up to 10,000,000 as per SAGEMATH code that I ran...[result subject to confirmation by others] Sorry for the mistake, due to cut & paste. 
[QUOTE=CRGreathouse;476617]Batalov's claim is that if n+2 = 2^p - 1, your algorithm returns "prime" regardless of whether 2^p - 1 is prime or composite. For example, take composite n+2: set n = 2045 and check that the test is passed:
2^n-1 mod (n+2) =? (n+1)/2
2^2045 - 1 mod 2047 =? 1023
1023 = 1023
but note that 2047 = 23*89.[/QUOTE] Hi CRGreathouse
That is correct. The algorithm returns 750 false primes up to 10,000,000 according to SAGEMATH code I ran (subject to confirmation by others). The 1st false prime for the algorithm (and Fermat's Primality Test, base 2) occurs at 341. 2047 also returns a false positive, as you point out. 
[QUOTE=gophne;476619]Hi CRGreathouse
That is correct. The algorithm returns 750 false primes up to 10,000,000 according to SAGEMATH code I ran (subject to confirmation by others). The 1st false prime for the algorithm (and Fermat's Primality Test, base 2) occurs at 341. 2047 also returns a false positive, as you point out.[/QUOTE] And how many does Fermat's return ... 
Gophne, as actual proofs don't seem to have persuaded you, let's have a look at an example to see that the two tests are the same. I'm going to avoid modular arithmetic as much as possible because it seems to have caused a bit of confusion.
Suppose we are trying to test 561 for primality, so n = 559 in your test.
Fermat's test declares 561 prime. This is because 2^560 ≡ 1 (mod 561), i.e. 2^560 = 561m + 1 for some m. Note that m is odd (it must be odd because the LHS is even, but you can also see this by calculating it directly if you wish), so m = 2k+1 for some k. Therefore 2^560 = 561(2k+1) + 1 = 1122k + 562.
Divide both sides by 2: 2^559 = 561k + 281.
Subtract 1: 2^559 - 1 = 561k + 280.
So "(2^559 - 1) mod 561 = 280" (quotation marks because this is bad notation), and 280 = (559+1)/2, so your test declares 561 prime. Similarly, Fermat primality => gophne primality in general.
Going the other way: your test declares 561 prime. This is because (559+1)/2 = 280 and "(2^559 - 1) mod 561 = 280", i.e. 2^559 - 1 = 561k + 280 for some k.
Add 1: 2^559 = 561k + 281.
Multiply by 2: 2^560 = 1122k + 562 = 561(2k+1) + 1.
So 2^560 = 561m + 1 for some m, i.e. 2^560 ≡ 1 (mod 561), and Fermat's test declares 561 prime. Similarly, gophne primality => Fermat primality in general, so they are in fact the same test.
To turn these examples into a proper proof, just replace 559 with n, and remember that we are carrying out the primality tests on n+2, not n. 
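The equivalence argued above is also easy to confirm by machine. A small sketch (function names are mine) comparing the two tests on every odd number up to 10^5:

```python
def fermat_base2(N):
    """Base-2 Fermat test on odd N > 2: does 2^(N-1) == 1 (mod N)?"""
    return pow(2, N - 1, N) == 1

def gophne_form(N):
    """The same N tested in the 'Mersenne' form with n = N - 2:
    does (2^n - 1) mod N equal (n + 1) / 2?"""
    n = N - 2
    return (pow(2, n, N) - 1) % N == (n + 1) // 2

# The two tests agree on every odd input, as the derivation predicts.
assert all(fermat_base2(N) == gophne_form(N) for N in range(5, 100001, 2))
```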
[QUOTE=science_man_88;476620]And how many does Fermat's return ...[/QUOTE]
I have not run that myself, but I suspect the same based on the analysis [B]10metreh[/B] has done previously. 
[QUOTE=gophne;476650]I have not run that myself, but I suspect the same based on the analysis [B]10metreh[/B] has done previously.[/QUOTE]
Okay so what is your sticking point? Is it that there's many ways to select that many from the set of natural numbers in that interval ? 
[QUOTE=10metreh;476631]Gophne, as actual proofs don't seem to have persuaded you, let's have a look at an example to see that the two tests are the same. I'm going to avoid modular arithmetic as much as possible because it seems to have caused a bit of confusion.
Suppose we are trying to test 561 for primality, so n = 559 in your test.
Fermat's test declares 561 prime. This is because 2^560 ≡ 1 (mod 561), i.e. 2^560 = 561m + 1 for some m. Note that m is odd (it must be odd because the LHS is even, but you can also see this by calculating it directly if you wish), so m = 2k+1 for some k. Therefore 2^560 = 561(2k+1) + 1 = 1122k + 562.
Divide both sides by 2: 2^559 = 561k + 281.
Subtract 1: 2^559 - 1 = 561k + 280.
So "(2^559 - 1) mod 561 = 280" (quotation marks because this is bad notation), and 280 = (559+1)/2, so your test declares 561 prime. Similarly, Fermat primality => gophne primality in general.
Going the other way: your test declares 561 prime. This is because (559+1)/2 = 280 and "(2^559 - 1) mod 561 = 280", i.e. 2^559 - 1 = 561k + 280 for some k.
Add 1: 2^559 = 561k + 281.
Multiply by 2: 2^560 = 1122k + 562 = 561(2k+1) + 1.
So 2^560 = 561m + 1 for some m, i.e. 2^560 ≡ 1 (mod 561), and Fermat's test declares 561 prime. Similarly, gophne primality => Fermat primality in general, so they are in fact the same test.
To turn these examples into a proper proof, just replace 559 with n, and remember that we are carrying out the primality tests on n+2, not n.[/QUOTE] Hi 10metreh
Now how was I expected to know/understand that! But it all makes sense again, and I yet again must (for the last time, I promise) agree that the two algorithms are the same, as you have explained again...(gnashing of teeth). Just a few questions to you that I would like you to honestly answer please.
1) How many people out there would have been able to do the reduction that you have just done? I suspect very few, and very few ppl would be able to even understand what you have done right now, except of course math people that have studied these things assiduously.
2) How was Fermat's Little Theorem/Fermat's Primality Test "derived"? I am asking out of interest, if you know. 
3) Having now finally accepted that the two theorems are the same because of your explanation/reductions (and awmayer I think), would you agree that using the algorithm in the form of the [I]mersenne number modulo[/I] is a very interesting way to do it, [B]since it establishes a direct relationship between mersenne numbers and ALL prime numbers, the hidden pattern of Prime numbers.....[I]surely the holy grail everybody in prime exploration was looking for[/I].[/B] Was the Fermat Little Theorem/Primality Test ever presented or investigated in this way? Can you give some references (or other contributors could as well if they have the info)?
4) Do you agree that the "mersenne number" way of expressing Fermat's Little Theorem, base 2, could also help with factorization of mersenne numbers, something the standard form of the Fermat Primality Test does not do? This is because when you are testing the mersenne numbers modulo the odd numbers, the [I]remainders[/I] of the modulo operation are cyclical, starting with 1 and terminating with (n+1)/2 when the odd number is prime (or a false prime, I suspect, as well). When a particular odd number is a factor of a particular mersenne number, the 0 remainder falls at the centre of the modulo cycle that ends with (n+1)/2. This could be algorithmized into some sort of [I]checker for mersenne number compositeness[/I].
5) Lastly, as you have shown again so eloquently that the "mersenne form" algorithm is one and the same as the Fermat Primality Test, why did some [I]expert ppl[/I] at first crack up the "algorithm" as "nonsense"? Are they saying that Fermat's Primality Test is nonsense (or did they not know what you know?) Thank you (and [B]awmayer[/B] in the first place) for your definitive explanation. 
[QUOTE=science_man_88;476651]Okay so what is your sticking point? Is it that there's many ways to select that many from the set of natural numbers in that interval ?[/QUOTE]
Nope it is clear, check [B]10metreh's[/B] post #212 
[QUOTE=gophne;476654]Nope it is clear, check [B]10metreh's[/B] post #212[/QUOTE]
Wow! This is almost as painful as watching Trump implode. 
[QUOTE=chalsall;476655]Wow! This is almost as painful as watching Trump implode.[/QUOTE]
Hi chalsall How is it similar?...are you comparing me to President Trump, or are you feeling pain at President Trump's implosion, or both? Very interesting, please elucidate a little bit more. 
I'm not 10metreh but I'll take a whack at it.
[QUOTE=gophne;476653]1) How many people out there would have been able to do the reduction that you have just done? I suspect very few, and very few ppl would be able to even understand what you have done right now, except of course math people that have studied these things assiduously.[/QUOTE] I would hope that (almost) everyone with a degree in mathematics could do it. US universities grant [url=https://nces.ed.gov/programs/digest/d16/tables/dt16_322.10.asp?current=yes]~20,000[/url] bachelor's degrees per year in mathematics and statistics, so maybe on the order of a million people in the US. Of course some people with such degrees couldn't do it, and some people without degrees could, but probably not as few as 100,000 or as many as 10 million in the US. Worldwide, maybe 10 times as many? [QUOTE=gophne;476653]3) Having now finally accepted that the two theorems are the same because of your explanation/reductions (and awmayer I think), would you agree that using the algorithm in the form of the [I]mersenne number modulo[/I] is a very interesting way to do it, [B]since it establishes a direct relationship between mersenne numbers and ALL prime numbers, the hidden pattern of Prime numbers.....[I]surely the holy grail everybody in prime exploration was looking for[/I].[/B][/QUOTE] I would not, because you could use any base, not just base 2. Instead of getting [url=https://oeis.org/A001567]A001567[/url] = 341, 561, 645, 1105, ... as exceptions, you'd get [url=https://oeis.org/A005935]A005935[/url] = 91, 121, 286, 671, 703, 949, ... (base 3), [url=https://oeis.org/A005936]A005936[/url] = 4, 124, 217, 561, 781, 1541, 1729, ... etc. as exceptions. (We refer to them as pseudoprimes, actually.) [QUOTE=gophne;476653]4) Do you agree that the "mersenne number" way of expressing Fermat's Little Theorem, base 2, could also help with factorization of mersenne numbers[/QUOTE] I don't see an advantage to either form. 
Regardless, it's an interesting question as to whether Fermat tests (or more likely, Miller tests) could be used to factor numbers. In general I don't see an efficient way to do this, but if you can get some composite to pass many (say, 2 to 20) Fermat/Miller tests, you could probably find a factorization of the number. (It works especially well on Carmichael numbers, where I think you can get polynomial expected time factorizations.) This is worth studying! [QUOTE=gophne;476653]5) Lastly, as you have shown again so eloquently that the "mersenne form" algorithm is one and the same as the Fermat Primality Test, why did some [I]expert ppl[/I] at first crack up the "algorithm" as "nonsense"? Are they saying that Fermat's Primality Test is nonsense (or did they not know what you know?)[/QUOTE] Reviewing the first few pages of this thread, it seems like no one was calling your algorithm nonsense. Rather, they thought that your claims were unlikely, to wit: [QUOTE=gophne;474993]The algorithm is very clear. ALL primes are covered tested to M34 2^1,257,787-1, Seriously! ALL composites are identified! Time O(log n) Very very fast. Serious. ALL primes are defined NO false primes expected as per algorithm logic. The algorithm is a logical formulation, so can be verified quickly. If authenticated, will compare to simplicity of Euclid's Proof for Infinite number of Primes. If true, will be a new frontier in PNT[/QUOTE] Let's score these. I think your algorithm is pretty clear. You first defined it, I think, here: [QUOTE=gophne;475101]2^n1 mod (n+2)= 1/2*(2^n1)+1[/QUOTE] where the only wrinkle is that you hadn't yet mentioned that it was a test for n+2 rather than n. All primes are covered. You hadn't tested it up to 2^1257787 - 1, though, which you did have the courage to admit in post #36. Not all composites are identified. It doesn't run in O(log n), it's Omega(log^2 n). The algorithm can't be verified at all (let alone quickly) because it doesn't perform as advertised. 
I'll give you your claim on simplicity: it's an elegant method. Your final claim is vacuously true... it didn't authenticate, so there's no requirement that it be a new frontier. All in all, I think the skepticism was warranted. No doubt the skepticism could have been expressed more kindly, though; sometimes we're not as welcoming as we'd like to be. :down: 
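The remark about Carmichael numbers can be made concrete. The sketch below (my own code and naming, not a production routine) runs Miller tests and, whenever it stumbles on a nontrivial square root of 1 mod N, extracts a factor with a gcd; on a Carmichael number such as 561 this succeeds almost immediately:

```python
from math import gcd

def miller_split(N, bases=range(2, 60)):
    """Try to split an odd composite N via a nontrivial square root
    of 1 found during Miller tests; very effective on Carmichael numbers."""
    d, s = N - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for a in bases:
        g = gcd(a, N)
        if 1 < g < N:
            return g                      # lucky hit: base shares a factor
        x = pow(a, d, N)
        for _ in range(s):
            y = pow(x, 2, N)
            if y == 1 and x != 1 and x != N - 1:
                return gcd(x - 1, N)      # x^2 == 1 with x != +-1 splits N
            x = y
    return None                           # no useful witness among the bases

f = miller_split(561)                     # 561 = 3 * 11 * 17 is a Carmichael number
print(f, 561 % f)                         # a proper factor of 561, remainder 0
```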
[QUOTE=gophne;476653]2) How was Fermat's Little Theorem/Fermat's Primality Test "derived". I am asking out of interest, if you know.
[/QUOTE] See [URL]http://www.mersenneforum.org/showthread.php?t=21737[/URL] and [URL]http://www.mersenneforum.org/showthread.php?t=21756[/URL] 
CRGreathouse has answered most of the questions so there isn't much more for me to say.
[QUOTE=gophne;476653]2) How was Fermat's Little Theorem/Fermat's Primality Test "derived"? I am asking out of interest, if you know.[/QUOTE] According to the [URL="http://primes.utm.edu/glossary/xpage/Fermat.html"]Prime Pages[/URL], Fermat noticed that 2p divides 2^p - 2 for p prime while studying perfect numbers, which are of course closely related to Mersenne primes. I don't think we have any record of how he proved his theorem. Nick has given links to proofs if you want to see those. [quote]4) Do you agree that the "mersenne number" way of expressing Fermat's Little Theorem, base 2, could also help with factorization of mersenne numbers, something the standard form of the Fermat Primality Test does not do?[/quote] Neither form really has an advantage over the other in base 2, because the two forms are equivalent: you can get very quickly from one to the other as I showed in my last post. However, it is harder to generalise your form to bases other than 2. The reason your form works is that (n+1)/2 is an integer for any n that is not a multiple of 2. But if we change 2 to 3, this is no longer true. [quote]This is because when you are testing the mersenne numbers modulo the odd numbers, the remainders of the modulo operation are cyclical, starting with 1 and terminating with (n+1)/2 when the odd number is prime (or a false prime, I suspect, as well). When a particular odd number is a factor of a particular mersenne number, the 0 remainder falls at the centre of the modulo cycle that ends with (n+1)/2. This could be algorithmized into some sort of checker for mersenne number compositeness.[/quote] As far as I can tell, you're just saying that if you calculate 2^n-1 mod every odd number in turn, and you find that 2^n-1 ≡ 0 mod m for some odd number m, then m is a factor of 2^n-1. This is a very slow way of factoring Mersenne numbers: it's just trial division, and it's not even efficient trial division. 
My thought was to work backwards, as though the 2 in n+2 were n+1, and solve for the congruence to be 0.

Equivalents generalize to:
2^(n-1) == 1 mod n
Subtract 2^y for some natural y≤n:
2^(n-1) - 2^y == n - (2^y - 1) mod n
Divide by 2^y:
2^(n-y-1) - 1 == (n - (2^y - 1))/(2^y) mod n
Replace n by n + 2^y:
2^(n+2^y-y-1) - 1 == (n+1)/(2^y) mod (n+2^y)
Unless I messed up a step. Edit: or the statement for Fermat's theorem like I did originally. Edit 2: I think you may find it can be changed still to a modular arithmetic statement about double mersennes with 2 more steps, though. 
Nth Value Primality Algorithm
Hi Everybody
Time for playing with another possible new/unusual? Primality Algorithm on the [I]mersenneforum.org > Extra Stuff > Blogorrhea > gophne > "New" primality test/check[/I] forum. Not sure if anybody would be interested, but at risk of missing out on a possible "new" primality test/algorithm, which is the equivalent to missing out on M50, the Algorithm is as follows:
For the series of odd numbers, where the 1st term of the series is 1, to the [I]nth term VALUE[/I], increment 2, step 1, where n is an element of the set of integers >= 1, and the nth term results in the series having an odd number of terms. In sigma notation the series would look something like this:
Z (2n-1), from n=1 to the nth (VALUE)......[I]Z representing Sigma[/I]
[B]Then for any of the terms of the series so constituted, starting at the 2nd term to the term whose value would be equal to the value of the [I]nth value[/I] - 2: if any of the terms in the test range is a multiple of, or shares a common factor with, the [I]nth value[/I], then the [I]nth value[/I] is Composite, else Prime.[/B]
To do an example just to clarify the wording (I am not familiar with LaTeX on this site yet):
Let the [I]nth value[/I] be equal to 7; then the algorithm series, applying Sigma, would be:
01, 03, 05, 07
The test "range" according to the algorithm would be:
03, 05......................................................Constituting the "seed" terms in the test range.
03 + 14 = 17
05 + 14 = 19
Since both these derived values are not multiples of, and do not share a common factor with, any of the "seed" terms, the [I]nth value[/I], namely "7", is Prime, as per the algorithm.
Let us try [I]nth value[/I] equal to 2047. The test range would be:
[B]03, 05, 07, 09, 11, 13, 15.......2045[/B].........Test range/Seed terms
For this test range the derived algorithm values would be [B][I]seed terms[/I] + 4094[/B]. The Seed/Algorithm pairs formed would be:
(3,4097), (5,4099), (7,5001), (9,5003), (11,5005), (13,5007)...... 
But the 5th term of these pairings shares a common factor, "11", so according to the algorithm the [I]nth value[/I] is Composite. Could anybody confirm or refute that this primality algorithm is true for all Composites/Primes? Would the above constitute a legitimate primality algorithm? [I]Caveat: Not sure if anybody else worked on or published any similar primality algorithms, but I could not find anything similar looking it up in Wikipedia.[/I] 
[QUOTE=gophne;476754]Hi Everybody
Time for playing with another possible new/unusual? Primality Algorithm on the [I]mersenneforum.org > Extra Stuff > Blogorrhea > gophne > "New" primality test/check[/I] forum. Not sure if anybody would be interested, but at risk of missing out on a possible "new" primality test/algorithm, which is the equivalent to missing out on M50, the Algorithm is as follows:
For the series of odd numbers, where the 1st term of the series is 1, to the [I]nth term VALUE[/I], increment 2, step 1, where n is an element of the set of integers >= 1, and the nth term results in the series having an odd number of terms. In sigma notation the series would look something like this:
Z (2n-1), from n=1 to the nth (VALUE)......[I]Z representing Sigma[/I]
[B]Then for any of the terms of the series so constituted, starting at the 2nd term to the term whose value would be equal to the value of the [I]nth value[/I] - 2: if any of the terms in the test range is a multiple of, or shares a common factor with, the [I]nth value[/I], then the [I]nth value[/I] is Composite, else Prime.[/B]
To do an example just to clarify the wording (I am not familiar with LaTeX on this site yet):
Let the [I]nth value[/I] be equal to 7; then the algorithm series, applying Sigma, would be:
01, 03, 05, 07
The test "range" according to the algorithm would be:
03, 05......................................................Constituting the "seed" terms in the test range.
03 + 14 = 17
05 + 14 = 19
Since both these derived values are not multiples of, and do not share a common factor with, any of the "seed" terms, the [I]nth value[/I], namely "7", is Prime, as per the algorithm.
Let us try [I]nth value[/I] equal to 2047. The test range would be:
[B]03, 05, 07, 09, 11, 13, 15.......2045[/B].........Test range/Seed terms
For this test range the derived algorithm values would be [B][I]seed terms[/I] + 4094[/B]. The Seed/Algorithm pairs formed would be:
(3,4097), (5,4099), (7,5001), (9,5003), (11,5005), (13,5007)......
But the 5th term of these pairings shares a common factor, "11", so according to the algorithm the [I]nth value[/I] is Composite. Could anybody confirm or refute that this primality algorithm is true for all Composites/Primes? Would the above constitute a legitimate primality algorithm? [I]Caveat: Not sure if anybody else worked on or published any similar primality algorithms, but I could not find anything similar looking it up in Wikipedia.[/I][/QUOTE] Offuscated trial division. 
[QUOTE=science_man_88;476758]Offuscated trial division.[/QUOTE]
Hi science_man_88 I love you, but your monosyllabic answers do not help. Please explain some more or, even better, do the "offuscated trial division" for the two examples given. Thanx. 
So your algorithm:
Take odd N (which we're checking if it is prime or composite). For all odd n = 1, 3, ... < N, check if gcd(n, n+2*N) > 1. If so, you found a factor of N, and it is composite; otherwise it is prime.
Trial division algorithm:
Take odd N (which we're checking if it is prime or composite). For all odd [B]prime[/B] n = 3, 5, 7, ... < [B]sqrt[/B](N), check if n divides N. If so, you found a factor of N, and it is composite; otherwise it is prime.
As you can see, you're just doing a much slower version of trial division. What? The gcd(n, n+2*N) is nothing like checking if "n divides N"? Please... Stop kidding yourselves. 
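That the two procedures always agree is easy to confirm empirically. A sketch (my own naming), checking every odd N up to two thousand:

```python
from math import gcd

def gophne_second_test(N):
    """The proposed test: N is composite iff some odd n < N has
    gcd(n, n + 2N) > 1; otherwise declare N prime."""
    return all(gcd(n, n + 2 * N) == 1 for n in range(3, N, 2))

def trial_division(N):
    """Ordinary odd trial division up to sqrt(N)."""
    n = 3
    while n * n <= N:
        if N % n == 0:
            return False
        n += 2
    return True

# Identical verdicts on every odd N in the range.
assert all(gophne_second_test(N) == trial_division(N) for N in range(3, 2001, 2))
```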
[QUOTE=axn;476802]So your algorithm:
Take odd N (which we're checking if it is prime or composite). For all odd n = 1, 3, ... < N, check if gcd(n, n+2*N) > 1. If so, you found a factor of N, and it is composite; otherwise it is prime.
Trial division algorithm:
Take odd N (which we're checking if it is prime or composite). For all odd [B]prime[/B] n = 3, 5, 7, ... < [B]sqrt[/B](N), check if n divides N. If so, you found a factor of N, and it is composite; otherwise it is prime.
As you can see, you're just doing a much slower version of trial division. What? The gcd(n, n+2*N) is nothing like checking if "n divides N"? Please... Stop kidding yourselves.[/QUOTE] Hmmmmm. Not exactly, I think. [B][I]Trial division[/I][/B] divides the TEST number by the set of odd numbers up to the square root of that number, until a possible divisor (one that leaves a residue of 0) is found. If no such divisors are found, the number is Prime. The algorithm states that if any of the [I]odd numbers + 2 times the nth Value[/I] in the [I]test range[/I] has a common factor, [I]independent of the number being tested[/I]. I know it would be cumbersome for very large numbers (so is [I]trial division[/I]), but I am trying to establish a "Primality" Test relationship here. There are many Primality Tests out there that are very cumbersome or even incalculable after a few steps, but are still recognised as Primality Tests nevertheless. Thanks for distilling "my" algorithm into a more clear form for readers to evaluate the "primality test" aspect of it. Much better than my attempt, so if readers could please use axn's form of it. 
[QUOTE=gophne;476807]Hmmmmm
Not exactly, I think. [B][I]Trial division[/I][/B] divides the TEST number by the set of odd numbers up to the square root of that number, until a possible divisor (one that leaves a residue of 0) is found. If no such divisors are found, the number is Prime. The algorithm states that if any of the [I]odd numbers + 2 times the nth Value[/I] in the [I]test range[/I] has a common factor, [I]independent of the number being tested[/I]. I know it would be cumbersome for very large numbers (so is [I]trial division[/I]), but I am trying to establish a "Primality" Test relationship here. There are many Primality Tests out there that are very cumbersome or even incalculable after a few steps, but are still recognised as Primality Tests nevertheless. Thanks for distilling "my" algorithm into a more clear form for readers to evaluate the "primality test" aspect of it. Much better than my attempt, so if readers could please use axn's form of it.[/QUOTE] I don't think you've taken on board what axn meant by this: [QUOTE=axn;476802]What? The gcd(n, n+2*N) is nothing like checking if "n divides N"? Please... Stop kidding yourselves.[/QUOTE] The point is that, if n is odd, then gcd(n, n+2N) = gcd(n, N) and so you are just checking if n has any common factors with N. But if N is not prime, then the smallest n that has a common factor with N is the smallest prime factor of N, so in fact when you first find n such that gcd(n, n+2N) > 1, that n is a factor of N. So you're just doing inefficient trial division. I guess that technically counts as a "primality test" in that it does correctly determine primality or otherwise of N. But it's easy to come up with useless primality tests. Here's one: if N>4, then N is prime if and only if N does not divide (N-1)!. Unlike trial division, it's got only one step, so what could possibly go wrong...? As for why gcd(n, n+2N) = gcd(n, N), it's time for another basic number theory lesson: Suppose d divides n and n+2N. 
Then d must divide their difference, which is 2N. But d divides n, which is odd, so d is odd, and thus d must divide N. Conversely, suppose d divides n and N. Then clearly d divides n+N+N = n+2N. So the sets of common factors of (n, n+2N) and (n, N) are exactly the same, and in particular gcd(n, n+2N) = gcd(n, N). 
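The identity at the heart of this lesson, gcd(n, n+2N) = gcd(n, N) for odd n, can also be spot-checked mechanically:

```python
from math import gcd

# For every odd n, a common divisor of n and n + 2N must be odd,
# hence divides N; so the two gcds coincide.
assert all(gcd(n, n + 2 * N) == gcd(n, N)
           for N in range(1, 300)
           for n in range(1, 300, 2))
```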
Uncompromisingly monosyllabic
[QUOTE=gophne;476800]Hi science_man_88
I love you, but your monosyllabic answers does not help. Please explain some more or even better, do the "offuscated trial division" for the two examples given. Thanx.[/QUOTE]That phrase in quotes is not as you say, nor is it spelled right. I think "You hide a test which splits by small primes, one by one" may be good. Paul. 
Petition to call the simplified form of this test the Gophne-Axn Little Lemma.

[QUOTE=axn;476802]So your algorithm:
Take odd N (which we're checking if it is prime or composite) For all odd n = 1,3,... < N, check if gcd(n, n+2*N) > 1. if so, you found a factor of N, and it is composite, otherwise it is prime. Trial division algorithm: Take odd N (which we're checking if it is prime or composite) For all odd [B]prime[/B] n = 3,5,7,.. < [B]sqrt[/B](N), check if n divides N. if so, you found a factor of N, and it is composite, otherwise it is prime.[/QUOTE] Let's compare performance. If the number is composite, the time both take depends on their smallest prime factor p. Your algorithm takes (p-1)/2 steps, while trial division takes pi(p) - 1 steps (we're assuming an odd number), which is asymptotically p/log p. In the worst case p = sqrt(n), so yours takes O(sqrt(n)) operations and trial division takes O(sqrt(n)/log n) operations (including the cost of sieving). But gcd is more expensive than division, so really the algorithms differ by a factor more like log^2 n. If the number is prime, your algorithm takes (n-3)/2 divisions, which is O(n). Trial division needs only O(sqrt(n)/log n). So for composites your algorithm is worse, but it's not too bad. But it's catastrophically bad on primes.
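To make those step counts concrete, here is a rough Python sketch (mine, for illustration; the function names and instrumentation are not from axn's post) that counts the work each method does:

```python
from math import gcd, isqrt

def gophne_steps(N):
    """Count gcd evaluations made by the gcd-based test on odd N.
    (Starts at n = 3, since gcd(1, anything) = 1 can never be a hit.)"""
    steps = 0
    for n in range(3, N, 2):
        steps += 1
        if gcd(n, n + 2 * N) > 1:
            return steps, False  # found a factor: composite
    return steps, True  # no factor found: prime

def trial_division_steps(N):
    """Count divisions made by plain odd trial division up to sqrt(N)."""
    steps = 0
    for d in range(3, isqrt(N) + 1, 2):
        steps += 1
        if N % d == 0:
            return steps, False
    return steps, True

for N in (10403, 10007):  # 10403 = 101 * 103 (composite), 10007 (prime)
    print(N, gophne_steps(N), trial_division_steps(N))
```

For the composite N = 10403 both methods stop after 50 steps, but for the prime N = 10007 the gcd test grinds through 5002 gcds while trial division stops after 49 divisions: exactly the "catastrophically bad on primes" behaviour described above.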
In that case, we need an amendment: we first do an AKS test to see if the number is prime. If it is not prime, then we do the gophne algorithm...

[QUOTE=10metreh;476808]I don't think you've taken on board what axn meant by this:
The point is that, if n is odd, then gcd(n, n+2N) = gcd(n, N), and so you are just checking if n has any common factors with N. But if N is not prime, then the smallest n that has a common factor with N is the smallest prime factor of N, so in fact when you first find n such that gcd(n, n+2N) > 1, that n is a factor of N. So you're just doing inefficient trial division. I guess that technically counts as a "primality test" in that it does correctly determine primality or otherwise of N. But it's easy to come up with useless primality tests. Here's one: if N>4, then N is prime if and only if N does not divide (N-1)!. Unlike trial division, it's got only one step, so what could possibly go wrong...? As for why gcd(n, n+2N) = gcd(n, N), it's time for another basic number theory lesson: Suppose d divides n and n+2N. Then d must divide their difference, which is 2N. But d divides n, which is odd, so d is odd, and thus d must divide N. Conversely, suppose d divides n and N. Then clearly d divides n+N+N = n+2N. So the sets of common factors of (n, n+2N) and (n, N) are exactly the same, and in particular gcd(n, n+2N) = gcd(n, N).[/QUOTE] Love this! At least I am getting free math lessons, at the highest levels :)
Lucky Polynomials
Hi Everybody
We know that the polynomials of Euler/Legendre [B]x^2+x+41[/B] and [B]x^2-x+41[/B] generate the series of primes called Euler numbers; 41 43 47 53 61 71 83 97 113 131 151 173 197 223 251 281 313 347 383 421 461 503 547 593 641 691 743 797 853 911 971 1033 1097 1163 1231 1301 1373 1447 1523 1601 for the first 40 terms. Question: is it also commonly known that these values are also apparently generated by the polynomials x^2+3x+43 and x^2-3x+43, x^2+5x+47 and x^2-5x+47, x^2+7x+53 and x^2-7x+53........ That is, each new polynomial being f(x)=x^2+x+41 plus 2x+2, f(x)=x^2+3x+43 plus 2x+4, f(x)=x^2+5x+47 plus 2x+6, f(x)=x^2+7x+53 plus 2x+8....... At least for the next +36 polynomials as well. I could not find these "derived" polynomials mentioned anywhere, except the Euler/Legendre forms of the polynomial. Caveat: Not sure if similar derived polynomials have been published/discussed elsewhere. Not encountered in my searches for same in Wikipedia.
Yes, this is known - they are shifts of the polynomial. With f(x)=x^2+x+41, your polynomials are f(x+1), f(x-2), f(x+2), f(x-3), f(x+3), and f(x-4).
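This is easy to verify by brute force; a small Python sketch (my addition, just to confirm the shifts, not from either post):

```python
# Check that the "derived" polynomials are shifts of f(x) = x^2 + x + 41.
def f(x):
    return x * x + x + 41

derived = [
    (lambda x: x * x + 3 * x + 43, 1),   # claimed to be f(x+1)
    (lambda x: x * x - 3 * x + 43, -2),  # claimed to be f(x-2)
    (lambda x: x * x + 5 * x + 47, 2),   # claimed to be f(x+2)
    (lambda x: x * x - 5 * x + 47, -3),  # claimed to be f(x-3)
    (lambda x: x * x + 7 * x + 53, 3),   # claimed to be f(x+3)
    (lambda x: x * x - 7 * x + 53, -4),  # claimed to be f(x-4)
]

for g, shift in derived:
    # identical as polynomials, so checking a range of x values suffices
    assert all(g(x) == f(x + shift) for x in range(-50, 50))
print("all six polynomials are shifts of x^2 + x + 41")
```

Since a shifted polynomial takes exactly the same set of values (just at shifted arguments), it necessarily generates the same list of primes.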

See also:
Franz Lemmermeyer, "Binary Quadratic Forms", November 8, 2010, p. 42, Theorem 1.47
Trivial Poll
How many contributors consider the distribution of prime numbers to be "random" or "pseudorandom" vs contributors who consider prime numbers to be very orderly distributed in the set of natural numbers?
Poll to be closed on 16/03/2018 ([I]Would appreciate somebody that could summarize the poll results :)[/I]) "There are two facts about the distribution of prime numbers. The first is that, [they are] the most arbitrary and ornery objects studied by mathematicians: they grow like weeds among the natural numbers, seeming to obey no other law than that of chance, and nobody can predict where the next one will sprout. The second fact is even more astonishing, for it states just the opposite: that the prime numbers exhibit stunning regularity, that there are laws governing their behavior, and that they obey these laws with almost military precision." - Don Zagier, as quoted in Elementary Number Theory: Primes, Congruences, and Secrets, William Stein, January 23, 2017 Poll options; (1) Random or Pseudorandom (2) Regular, or (3) Neither
I don't think you have any idea what "regular" and "random" mean. This is a math forum. Be precise.
You've worded your question so vaguely that in its present form I'm pretty sure the answer is "yes". 
LOL..."yes" is not a poll option!

Prime numbers are very regular and orderly distributed. The order in which they are distributed is called "random order".
:razz: 
[QUOTE=gophne;480237]... there are laws governing their behavior, and that they obey these laws with almost military precision."[/QUOTE]There are no laws governing their (primes) behaviour. Instead the "laws" are formulated [i]from[/i] their behaviour. Primes don't obey anyone or anything, they are what they are, without any concern for how humans like to categorise them. You can't make a new set of laws and expect the primes to follow just because you say they should.

[QUOTE=LaurV;480243]Prime numbers are very regular and orderly distributed. The order in which they are distributed is called "random order".
:razz:[/QUOTE] Random order? Oxymoron?
[QUOTE=retina;480246]There are no laws governing their (primes) behaviour. Instead the "laws" are formulated [i]from[/i] their behaviour. Primes don't obey anyone or anything, they are what they are, without any concern for how humans like to categorise them. You can't make a new set of laws and expect the primes to follow just because you say they should.[/QUOTE]
Quote is by Don Zagier...... ([url]https://en.wikipedia.org/wiki/Don_Zagier[/url]) 
[QUOTE=gophne;480275]Random order? Oxymoron?[/QUOTE]
Not really, even order of operations was potentially random before the convention we now follow. 
[QUOTE=gophne;480275]Oxymoron?[/QUOTE]
Well some people just called me moron, nobody called me oxymoron, so I assume this is a step up... :razz: (sorry man, I could not resist, haha) 
[QUOTE=LaurV;480463]Well some people just called me moron, nobody called me oxymoron, so I assume this is a step up... :razz:
(sorry man, I could not resist, haha)[/QUOTE] [URL="https://www.youtube.com/watch?v=hgMn4u61z8c&t=20s"]https://www.youtube.com/watch?v=hgMn4u61z8c&t=20s[/URL] [URL="https://www.youtube.com/watch?v=hgMn4u61z8c&t=136s"]https://www.youtube.com/watch?v=hgMn4u61z8c&t=136s[/URL] 
[QUOTE=gophne;480237]How many contributors consider the distribution of prime numbers to be "random" or "pseudorandom" vs contributors who consider prime numbers to be very orderly distributed in the set of natural numbers?
Poll to be closed on 16/03/2018 ([I]Would appreciate somebody that could summarize the poll results :)[/I]) "There are two facts about the distribution of prime numbers. The first is that, [they are] the most arbitrary and ornery objects studied by mathematicians: they grow like weeds among the natural numbers, seeming to obey no other law than that of chance, and nobody can predict where the next one will sprout. The second fact is even more astonishing, for it states just the opposite: that the prime numbers exhibit stunning regularity, that there are laws governing their behavior, and that they obey these laws with almost military precision." - Don Zagier, as quoted in Elementary Number Theory: Primes, Congruences, and Secrets, William Stein, January 23, 2017 Poll options; (1) Random or Pseudorandom (2) Regular, or (3) Neither[/QUOTE] My poll is not proving to be very popular...only had a prime number of responses to the Poll...1+1 responses so far...including my own response! For what it is worth, I am more inclined to agree with the latter part of the premises stated by Don Zagier, that prime numbers are very "ordered", and that the problem of identifying large primes is more a result of the largeness of the numbers concerned w.r.t. computational time required to confirm them as prime numbers, even on the most advanced computers available. The secret would be to discover (or develop?) a formula for primes, which would have the effect of determining primes as high as we could go. My opinion is that since prime numbers are natural numbers missed by the process of sieving by lower natural numbers...the great mathematical minds around should be able to quickly! formulate such formula/algorithm for prime numbers.
[QUOTE=gophne;481751]
The secret would be to discover (or develop?) a formula for primes, which would have the effect of determining primes as high as we could go. My opinion is that since prime numbers are natural numbers missed by the process of sieving by lower natural numbers...the great mathematical minds around should be able to quickly! formulate such formula/algorithm for prime numbers.[/QUOTE] It exists already: take the positive values of a certain (fairly complex) polynomial. It's probably not ordered well, or it's just slow to implement.
Twin Prime Conjecture
I am by now probably known for my exorbitant claims, but nevertheless I would still like to have a crack at offering a "proof" for the twin prime conjecture shortly, in a month or two at the most. I am not sure if posting in the
[I]"mersenneforum.org>Extra Stuff>Bloggorrhea>gophne>"New" primality test/check"[/I] would be the right forum to do so (although I believe it would be). Not sure if I would get enough respect for my attempt or that it would be rejected out of hand due to the audacity of trying. I have stated from the first time that I joined the mersenne forum to air my "discovery" that the sum of consecutive prime "sums" generates very "smooth" curves, which lends itself to being "predictive", probably to more or less the same degree as the trivial formula for the gap between primes of [I]log N[/I]. I had to disengage with my tail between my legs as the super mods on the Site felt that I made unsubstantiated claims w.r.t. the accuracy of this graphic algorithm. At the time I stated that I wanted to work on new "primality" algorithms while yelping off. I did not make many friends as well with my "reverse algorithm" of the mersenne numbers being divisible by the mersenne index 2 relationship, which was shown to be a variation/copy of Fermat's Little theorem, including the false positives! I have also offered a "primality" algorithm which revolves around doubling of an odd number to be tested for primality. If no odd number smaller than the number being tested shares a common factor with the "sum of that odd number with double the number being tested", then the number being tested is a prime number. This "algorithm" has been highlighted as a "variation" of "trial division", but more cumbersome in computer calculation time! So needless to say that besides the attempted proof of the "Twin prime conjecture", I am also working on an alternative primality check to the Lucas-Lehmer, which is showing great promise. I am working on the complexity of the algorithm at very large values in the ranges of the higher mersenne primes. The algorithm is a sieve/formulaic hybrid which I am hoping to air on this forum as well at some point, if I am not debarred from the Site before that as a Svengali!
This algorithm has proven to be true for the lower mersenne primes (my assertion only), so is most likely, or shall I be bold and say "definitely", true for the higher mersenne numbers as well. I am just not sure of the complexity of the algorithm in terms of computational time required at the top levels/magnitude of the known mersenne primes. I however beg for indulgence, as this is a "blogorrhea" thread after all, so therein only lies my dilemma about posting such on this Site/Thread. This I will attempt without fear of ridicule, but with a danger that I will be declared a lunatic of the highest order. So shall my journey begin in due course.
[QUOTE=gophne;482159]
So needless to say that besides the attempted proof of the "Twin prime conjecture", I am also working on an alternative primality check to the Lucas-Lehmer, which is showing great promise. I am working on the complexity of the algorithm at very large values in the ranges of the higher mersenne primes. The algorithm is a sieve/formulaic hybrid which I am hoping to air on this forum as well at some point, if I am not debarred from the Site before that as a Svengali! This algorithm has proven to be true for the lower mersenne primes (my assertion only), so is most likely, or shall I be bold and say "definitely", true for the higher mersenne numbers as well, I am just not sure of the complexity of the algorithm in terms of computational time required at the top levels/magnitude of the known mersenne primes. [/QUOTE] For the first paragraph: I hope you know that many variants and attempts have been found to be useless. And for the second paragraph, maybe read: [url]https://en.m.wikipedia.org/wiki/Strong_Law_of_Small_Numbers[/url] Or watch: [url]https://www.youtube.com/watch?v=4UgZ5FqdYIQ[/url]
[QUOTE=gophne;482159]I would still like to have a crack at offering a "proof" for the twin prime conjecture shortly, in a month or two at the most. I am not sure if posting in the
[I]"mersenneforum.org>Extra Stuff>Bloggorrhea>gophne>"New" primality test/check"[/I] would be the right forum to do so (although I believe it would be).[/QUOTE] Yes, that would be fine. You could also use misc math, but I think this would be better. A new thread would be better than reusing this one, I think. [QUOTE=gophne;482159]I have stated from the first time that I joined the mersenne forum to air my "discovery" that the sum of consecutive prime "sums" generates very "smooth" curves, which lends itself to being "predictive", probably to more or less the same degree as the trivial formula for the gap between primes of [I]log N[/I][/QUOTE] My hope is that we can help hone that general idea into a concrete statement and some theorems you can claim. :smile: [QUOTE=gophne;482159]I did not make many friends as well with my "reverse algorithm" of the mersenne numbers being divisible by the mersenne index 2 relationship, which was shown to be a variation/copy of Fermat's Little theorem, including the false positives![/QUOTE] Everyone here has had the experience of rediscovering results; we don't mind that part. But you made grandiose claims about the algorithm which did not hold up to casual scrutiny, and that makes us worry about your other claims. If nothing else it should be a wake-up call reminding you of the importance of writing your proof carefully. [QUOTE=gophne;482159]I have also offered a "primality" algorithm which revolves around doubling of an odd number to be tested for primality. If no odd number smaller than the number being tested shares a common factor with the "sum of that odd number with double the number being tested", then the number being tested is a prime number. This "algorithm" has been highlighted as a "variation" of "trial division", but more cumbersome in computer calculation time![/QUOTE] Well... that is a very inefficient algorithm, hundreds of thousands of times slower than trial division at 9 digits (and quickly growing worse).
But perhaps you had some other reason for presenting it other than efficiency. [QUOTE=gophne;482159]So needless to say that besides the attempted proof of the "Twin prime conjecture", I am also working on an alternative primality check to the Lucas-Lehmer, which is showing great promise. I am working on the complexity of the algorithm at very large values in the ranges of the higher mersenne primes. The algorithm is a sieve/formulaic hybrid which I am hoping to air on this forum as well at some point, if I am not debarred from the Site before that as a Svengali![/QUOTE] You'll need very careful proofs for both. I wish you the best. [QUOTE=gophne;482159]This algorithm has proven to be true for the lower mersenne primes (my assertion only), so is most likely, or shall I be bold and say "definitely", true for the higher mersenne numbers as well, I am just not sure of the complexity of the algorithm in terms of computational time required at the top levels/magnitude of the known mersenne primes.[/QUOTE] How many non-Mersenne exponents have you tested it on? "Definitely" is far too strong in any case; even if you had tested it on billions it would only be enough evidence to think it a probable prime test. But such tests definitely exist, and who knows, it might even be a primality test - just be scrupulous in your proof!
Thanx I will look at the links.

Hi CRGreathouse
Thanx for the encouragement. I hope I can do justice to your positivity. I shall try to formulate my attempted "proof" carefully and as mathematically sound as possible. I will be delighted if, should the "proof" have any merit, somebody with more math knowledge and skills could take it to the next level, but I am getting ahead of myself. I will be posting on this thread as I would not want to compromise the integrity and reputation of the forum by possibly posting a worthless, clueless attempted proof. If the proof should have any merit after analysis by the normal commentators, it would garner a life of its own, I am sure. As for the primality test (for mersenne primes), it is formulaic in nature, so it is like proving the lemma for the lower mersennes and then extending it to the rest of the set of mersenne numbers. My problem being that the "primality" check could well be too unwieldy at very high numbers. I do not have a good understanding of algorithm complexity and computation time concepts. I shall try my very best to validate whatever I post with due respect to the readers.
[QUOTE=gophne;482175]Thanx I will look at the links.[/QUOTE]
[url]https://en.m.wikipedia.org/wiki/List_of_mathematical_jargon[/url] [url]https://en.m.wikipedia.org/wiki/Mathematical_proof[/url] [url]https://oeis.org/wiki/List_of_LaTeX_mathematical_symbols[/url] May all help you put it into an understandable form. [url]https://en.m.wikipedia.org/wiki/List_of_mathematical_symbols[/url] may as well. 
[QUOTE=gophne;482176]As for the primality test (for mersenne primes) it is formulaic in nature, so it is like proving the lemma for the lower mersennes, and then extending it to the rest of the set of mersenne numbers. My problem being that the "primality" check could well be too unwieldy at very high numbers. I do not have a good understanding of algorithm complexity and computation time concepts.[/QUOTE]
Think about it in four stages.

1. Define it precisely and unambiguously. Until you do this, you [url=https://www.theguardian.com/science/2005/sep/19/ideas.g2]can't even be wrong[/url].
2. Prove that your test holds for all primes (of the appropriate form). Once you do this you have a pseudoprimality test (but its value is still unclear).
3. Prove that your test fails to hold for all composites (of the appropriate form). Once you do this you have a primality test (but possibly a slow one).
4. Prove that your test runs in a (small) certain amount of time based on the size of the input and perhaps other characteristics.

There are in-between stages. Before you get to 3 you can perform random testing to show that your test fails for either all composites tested or a very high percentage. Before you get to 4 you can use heuristics or implementations to argue that it will be fast. Really you need to get to at least stage 2 before any of this matters, though. You don't need to get to 4 to be of value. BPSW is at 2, and Miller-Rabin is stuck there permanently. ECPP looks like it will be at 3 for the foreseeable future.
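As a concrete example of something sitting at stage 2, here is a sketch of a Miller-Rabin-style strong probable-prime test in Python (my own illustration, not from any post here; the base set is an arbitrary choice). It provably never rejects a prime, but some composites pass any fixed set of bases, so without stage 3 it remains a pseudoprimality test:

```python
def miller_rabin(n, bases=(2, 3, 5, 7)):
    """Strong probable-prime test: never rejects a prime, but fixed bases
    can be fooled by some composites (so this is only a stage-2 test)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7):
        if n % p == 0:
            return n == p
    # write n - 1 = d * 2^s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness: n is definitely composite
    return True  # n passed every base: prime or strong pseudoprime
```

For instance, 3215031751 = 151 * 751 * 28351 passes bases 2, 3, 5 and 7 even though it is composite, which is exactly why stage 3 (a proof that no composite passes) is what separates a pseudoprimality test from a primality test.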
[QUOTE=CRGreathouse;482186]Define it precisely and unambiguously. Until you do this, you [url=https://www.theguardian.com/science/2005/sep/19/ideas.g2]can't even be wrong[/url].[/QUOTE]
Along these lines, a very important thing to point out... It's perfectly OK to be wrong around here. It's a tough crowd, but being comfortable making mistakes is useful in making progress. Some have run away from this forum because they thought they were being personally attacked. New ideas are welcomed, but are also open to deep analysis, and will be called stupid if they are. Nothing personal; deal with it. It's a bit like the game of rugby, where you try to physically maim your opponent on the field, and then afterwards go and buy the guy a beer and laugh. Or the game of Go, where it's all fun and games until someone loses an eye.
Noted, thank you.

Agreed.
I have always accepted the criticisms I encountered on the forum in good faith. If I can also make use of a sporting analogy, it is like, as a novice tennis player, taking on Jimmy Connors, with Jimmy Connors sometimes feeling like hitting you on the head with the ball, for your audacity, and rightly so, if you cannot even get the ball to go consistently into the service court. You just have to hope that you do not come up against a John McEnroe, who will likely bash the ball boy over the head with his racquet, due to your doltishness!
[QUOTE=chalsall;482192]Along these lines, a very important thing to point out... It's perfectly OK to be wrong around here. It's a tough crowd, but being comfortable making mistakes is useful in making progress.[/QUOTE]
Yes! :smile: We all make mistakes, and it's good to point out mistakes you see so they can be corrected. 
Dedicated to my daughter, Taryn
I would like to make my attempt at a proof of the [I]Twin Prime Conjecture[/I] on the 17/04/2018, which would be my daughter's birthday, hoping to make up for the bad days.

"Mathematicians have tried in vain to this day to discover some order in the sequence of prime numbers, and we have reason to believe that it is a mystery into which the mind will never penetrate."
Leonhard Euler, in G. Simmons, Calculus Gems, McGraw-Hill, New York, 1992 **************************************************************************************************** "Prime numbers have always fascinated mathematicians, professional and amateur alike. They appear among the integers, seemingly at random, and yet not quite: there seems to be some order or pattern, just a little below the surface, just a little out of reach." Underwood Dudley, Elementary Number Theory (Freeman, 1978) p. 163 **************************************************************************************************** "Who would have imagined that something as straightforward as the natural numbers (1, 2, 3, 4, ...) could give birth to anything so baffling as the prime numbers (2, 3, 5, 7, 11, ...)?" Ian Stewart, "Jumping Champions", Scientific American, December 2000
[QUOTE=gophne;482576]I would like to make my attempt at a proof of the [I]Twin Prime Conjecture[/I] on the 17/04/2018, which would be my daughter's birthday, hoping to make up for the bad days.[/QUOTE]
Considering how many great minds have worked on this for hundreds of years, most recently Professor Terence Tao and Dr. James Maynard: [url]https://www.youtube.com/watch?v=MXJzpJeY3E[/url] [url]https://www.youtube.com/watch?v=QKHKD8bRAro[/url] you might want to aim for 17/04/2030 or later to have any non-infinitesimal chance at it.
Another thing to keep in mind is the interconnectedness of conjectures. The twin prime conjecture and the Goldbach conjecture are linked by Goldbach partitions. So you'll have to potentially look out for provably false implications of conjectures against each other. Not that one will necessarily be false because of the other, just a thing to keep in mind.

[QUOTE=ATH;482705]Considering how many great minds have worked on this for hundreds of years, most recently Professor Terence Tao, and Dr. James Maynard:
[url]https://www.youtube.com/watch?v=MXJzpJeY3E[/url] [url]https://www.youtube.com/watch?v=QKHKD8bRAro[/url] you might want to aim for 17/04/2030 or later to have any non-infinitesimal chance at it.[/QUOTE] The task is daunting indeed... under normal circumstances 2030 would be optimistic as well!!! More like 3020 would make more sense! Prof Tao and Dr. Maynard are exceptional math geniuses/minds who are able to understand/theorize in all things math. Myself, I am only going to attempt a "proof" for the twin prime conjecture which I think works, notwithstanding that it deals in the realms of infinity. Are you perhaps suggesting that I should choose the 1st of April instead of the 17th of April! :) I only chose the 17th of April for sentimental reasons, because it is my daughter's birthday on that day.
[QUOTE=science_man_88;482707]Another thing to keep in mind is the interconnectedness of conjectures. The twin prime conjecture and the goldbach conjecture are linked by goldbach partitions. So you'll have to potentially look out for provably false implications of conjectures against each other. Not that one will necessarily be false because of the other, just a thing to keep in mind.[/QUOTE]
Hi science_man_88 About the Goldbach conjecture and other things math I am really very clueless...my big problem is that I am a "prime" hobbyist, and that I have always considered prime numbers to be "ordered" rather than "random". A pattern for Primes of course does exist...a [I]forward pattern![/I] as delivered by operation of the Sieve of Eratosthenes. It is only a quest to find the reverse pattern!
1 Attachment(s)
[QUOTE=gophne;482763]The task is daunting indeed....under normal circumstances 2030 would be optimistic as well!!! more like 3020! would make more sense.[/QUOTE]
Daunting to be sure, but I think there is reason for hope. Much progress has been made recently, and a mathematician who has all of the tools in her/his toolbox will be better able to attack the problem. The Prime Number Theorem showed the ("trivial") estimate that the gap following a prime p is infinitely often at most length log p or so. Sieve theory was developed to work on problems like the twin prime conjecture, and the Brun sieve (1915) was developed to prove that the sum of the reciprocals of the twin primes converges. Westzynthius made the first improvement over the trivial bound, with further (slow) progress through the 1930s by Erdős and Rankin. Linnik developed sieve theory further with the [url=https://terrytao.wordpress.com/2015/01/10/254anotes3thelargesieveandthebombierivinogradovtheorem/]large sieve[/url] in 1941. It uses Fourier analysis and analytic number theory to allow larger sieving sets. The Selberg sieve was also developed around this time, this one being a combinatorial sieve rather than an analytic one. Careful choice of sieve weights is essential in this method. Using the large sieve, Barban, Bombieri, and Vinogradov proved a theorem which gives a Riemann Hypothesis-type error bound for primes in arithmetic progressions when the modulus is up to roughly square-root size. The Elliott-Halberstam conjecture (EH) says that the modulus can be taken almost up to the size of the bounding variable itself. This kind of information would be useful in a proof of the twin prime conjecture. Chen's theorem (1966) was the first near-proof of the twin prime conjecture, limited by the [url=https://en.wikipedia.org/wiki/Parity_problem_(sieve_theory)]parity problem[/url]. It proves that there are infinitely many primes p such that p+2 is either prime or semiprime. Ross published a simplification using the Selberg sieve.
[url=https://people.math.ethz.ch/~kowalski/fouvryiwanieconatheorem.pdf]Fouvry & Iwaniec (1980)[/url] published an improved version of Bombieri and Vinogradov giving some information above the square-root barrier, which in principle makes the twin prime problem attackable. The Friedlander-Iwaniec theorem (1997) was essentially the first work to begin to break down the parity barrier, using a great deal of analytic number theory along with combinatorics and Fourier analysis. Various subsets of Goldston, Graham, Motohashi, Pintz, and Yıldırım (~2005; [url=https://arxiv.org/abs/math/0505300]two[/url] [url=https://arxiv.org/abs/math/0506067]examples[/url]) prove many results about small gaps between primes. One example: on EH, there are infinitely many primes p followed by a gap of at most 16. Sieve weights and Selberg diagonalization are tools. [url=https://arxiv.org/abs/math/0606088]Green & Tao (2006)[/url] published probably the last major twin prime-related paper before the Zhang bomb dropped, and it gives an idea of what the state of knowledge was at that time. Zhang's theorem, at its heart, is an improvement similar to Fouvry & Iwaniec along with the ingredients needed to apply it to the bounded gap problem. It proves that there are infinitely many primes p followed by a gap of at most 70,000,000. It may be the largest single step we've taken toward the twin prime conjecture. The polymath project simplified and improved Zhang's proof and reduced the gap to ~2000. Maynard added some new ingredients to the proof which allowed a further reduction, then followed up with a new paper with Tao and the Polymath project to give a bound of 246 (and a bound of 6 on a stronger form of EH). [url=https://terrytao.wordpress.com/2018/02/21/longgapsinsievedset/]Ford, Konyagin, Maynard, Pomerance, & Tao (2018)[/url] published a paper, improving on other recent ones I'm too tired to survey at the moment, on long gaps between primes.
There seems to be a duality between short gaps and long gaps and this team seems to be unraveling the mystery. So there really is a lot out there, if you have the patience to learn it. I have quite a lot of learning to do myself. Edit: Here's a very rough chart of the connections between the steps leading us to current progress toward the twin prime conjecture. All mistakes are mine. 
Twin Prime Conjecture
Hi CRGreathouse
Thank you for all the information in your post about work done in the area of a possible proof of the Twin Prime Conjecture and related gaps between prime numbers. I will try to look up some of these works, to see whether I am going down a path which has already been researched (professionally) and has not managed to nail down a proof for the TPC, so as not to be a bore. However, I still intend to post an attempted "proof" on the 17th as I have promised. My attempt would be based on the Sieve of Eratosthenes, Euclid's proof of the infinity of prime numbers, and a logical deduction based on the relationships/statements made/derived from the system of the "proof". What I would like to know is if it is possible to [I]cut & paste[/I] text and tables from Excel & Word on this platform, as I have worked mostly in Excel. I hope that if my "proof" is not valid, at least it would have been a novel approach to the problem. Even at this late stage I am still struggling with my "logical deduction" on which the [I]proof[/I] is based, since to me it appears sound, but there is a fogginess to it as one considers the abstraction of the logic at numbers that cannot be checked in normal computer time (using Excel or SAGE!). The "proof" is not very long/extensive, so it would not waste too much of people's brain power!
[QUOTE=gophne;484940]I hope that if my "proof" is not valid, it will at least have been a novel approach to the problem. Even at this late stage I am still struggling with the "logical deduction" on which the [I]proof[/I] is based: to me it appears sound, but there is a fogginess to it once one considers the abstraction of the logic at numbers that cannot be checked in normal computer time (using Excel or SAGE!).[/QUOTE]
Please make all efforts to be clear. Don't invent terms unless you must, and if you do, give clear, concise definitions. It should be evident at each step precisely what you are claiming and why you think it is true.

Here's my rough scale for grading purported proofs of the twin prime conjecture:

1,000,000 points for a correct proof of the twin prime conjecture
100 points for a clear attempted proof with interesting, nontrivial, correct, and possibly publishable theorems/lemmas
0 points for a clear attempted proof
−10 points for a vague, hard-to-follow, or similar attempted proof
−11 points for an attempted proof that is unsalvageable or 'not even wrong'
−11 points for the variants of [url=http://mersenneforum.org/showthread.php?t=23092]the standard proof[/url]

The average score here, unfortunately, is less than −10. :down:
[QUOTE=CRGreathouse;484962]Please make all efforts to be clear. Don't invent terms unless you must, and if you do, give clear, concise definitions. It should be evident at each step precisely what you are claiming and why you think it is true.

Here's my rough scale for grading purported proofs of the twin prime conjecture:

1,000,000 points for a correct proof of the twin prime conjecture
100 points for a clear attempted proof with interesting, nontrivial, correct, and possibly publishable theorems/lemmas
0 points for a clear attempted proof
−10 points for a vague, hard-to-follow, or similar attempted proof
−11 points for an attempted proof that is unsalvageable or 'not even wrong'
−11 points for the variants of [url=http://mersenneforum.org/showthread.php?t=23092]the standard proof[/url]

The average score here, unfortunately, is less than −10. :down:[/QUOTE]
Noted!
[QUOTE=gophne]What I would like to know is if it is possible to [I]cut & paste[/I] text and tables from Excel & Word on this platform, as I have worked mostly in Excel.[/QUOTE]
You could cut and paste the document into Google Sheets, and copy the link for that here.
Hurdle
I am struggling with my "logical deduction" to show that the methodology I use to locate and predict the location of twin primes would be applicable at values approaching infinity, or, more importantly in terms of the requirements of a proof, that twin primes would still occur approaching infinity. If I cannot resolve this shortly, then I will post the "imperfect" proof so that others can perhaps debunk the approach from their vantage point, if anybody is interested in doing so.
I shall post on Google Docs as suggested. Apologies for not posting on the date intended, but I shall post what I have done in a few days' time, even if I cannot resolve the problem (proof) myself, just for comment as to the worth or non-worth of the attempt/approach.
[QUOTE=gophne;486062]Apologies for not posting on the date intended[/QUOTE]
Thanks for prioritizing intellectual rigor. 