2018-01-27, 09:50  #1 
Jan 2018
5 Posts 
GIMPS double-checking certainty rate
Hi,
I've read this: https://www.mersenne.org/various/math.php

Overall, I'm wondering if GIMPS achieves 100% certainty on the primality test. Does the double-checking step ensure 100% certainty, and if so, can you please elaborate a little on how it works? Why does rerunning Lucas-Lehmer ensure 100% certainty? What changes between runs?

As side questions, I was wondering:
1) Is the convolution error check so slow that, hypothetically, it could not be used at every step and still finish in less than, say, 10 years?
2) Have you tested the all-integer weighted transform, which as far as I can tell should be lossless? Hypothetically, could it be used to perform a primality test and finish in less than, say, 10 years?

Thank you, 
2018-01-27, 10:52  #2 
Romulan Interpreter
"name field"
Jun 2011
Thailand
10,273 Posts 
It is called a "shift". See here. (The link is to the last section, called "double checking", but the page is built in such a way that you can't scroll below the bottom of the page, which is why the link seems to start in the middle of the "LL" section.)
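To make the idea concrete, here is a minimal all-integer Python sketch (illustrative only, not Prime95's actual code; the helper name ll_residue is made up) of a Lucas-Lehmer test with a starting shift. The stored value is s_k · 2^c (mod M_p) instead of s_k itself; the shift doubles (mod p) at every squaring, and since 2^p ≡ 1 (mod M_p) the residue can be un-shifted at the end. In the real floating-point FFT code, a different starting shift produces a completely different round-off pattern, yet both runs must agree on the final residue.

Code:
def ll_residue(p, shift=0):
    """Lucas-Lehmer residue of M_p = 2^p - 1, for p an odd prime."""
    M = (1 << p) - 1
    c = shift % p
    t = (4 << c) % M                  # stored value: s_0 * 2^c mod M_p
    for _ in range(p - 2):
        c = (2 * c) % p               # the shift doubles at each squaring
        t = (t * t - (2 << c)) % M    # (s^2 - 2) picks up the new shift
    return (t << ((p - c) % p)) % M   # un-shift: multiply by 2^(p - c)

# Different shifts must give the same residue; zero means M_p is prime.
assert ll_residue(127, 0) == ll_residue(127, 42) == 0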

2018-01-27, 14:44  #3  
Undefined
"The unspeakable one"
Jun 2006
My evil lair
2×3^{2}×7×53 Posts 
Quote:
So for there to be a missed prime, there would have to be two bad tests that manage to match on the final 64 bits. By my probably incorrect maths that would be approximately a (1 in 2^64) × (4%)^2 chance, where 4% is the average error rate. Primes are a wholly different matter and are checked much more thoroughly, many times on many systems. The chances of a false prime are considerably smaller than those of a missed prime. Still not 100%, but approaching very, very close to it.
Last fiddled with by retina on 2018-01-27 at 15:01. Reason: "a having" > "having a" 
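For concreteness, a quick back-of-the-envelope evaluation of that estimate (under the same assumptions: a 4% per-test error rate, independent errors, and a bad test producing an effectively random 64-bit residue):

Code:
p_bad = 0.04                           # assumed per-test error rate
p_missed_prime = p_bad**2 * 2.0**-64   # both tests bad AND residues match
print(f"{p_missed_prime:.1e}")         # ~8.7e-23 per exponent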

2018-01-27, 14:54  #4 
6809 > 6502
"""""""""""""""""""
Aug 2003
5·7·311 Posts 
Interim residues for primes are also cross-checked, so the chance of error is even lower. If we compare the residues every 10,000,000 iterations, that drives down the probability of an error causing a false positive. Also, using different FFT sizes reduces the probability that other errors sneak in.
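As a sketch of what interim cross-checking buys (hypothetical, not PrimeNet's actual protocol; ll_checkpoints is an invented name, building on the shifted-LL sketch above): two runs of the same exponent with different shifts should agree on the un-shifted residue at every checkpoint, not just at the end, so a mismatch localizes the error to a window of iterations.

Code:
def ll_checkpoints(p, shift, every):
    """Un-shifted low-64-bit residues every `every` iterations (sketch)."""
    M = (1 << p) - 1
    c = shift % p
    t = (4 << c) % M
    residues = []
    for i in range(1, p - 1):          # iterations 1 .. p-2
        c = (2 * c) % p
        t = (t * t - (2 << c)) % M
        if i % every == 0:
            residues.append(((t << ((p - c) % p)) % M) & 0xFFFFFFFFFFFFFFFF)
    return residues

# Two differently-shifted runs must match at every checkpoint.
assert ll_checkpoints(4423, 0, 500) == ll_checkpoints(4423, 123, 500)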

2018-01-27, 15:11  #5 
Jan 2018
5 Posts 
edited
Is 100% achievable in a practical amount of processing time (say 10 years)? Maybe we've done it before for some big Mi?
Last fiddled with by 6pac on 2018-01-27 at 15:50 
2018-01-27, 16:15  #6 
Sep 2003
3×863 Posts 
As LaurV pointed out, the shift changes.
Also the hardware changes. Different people use different computers. The principal source of error is hardware problems: sometimes people overclock their machines past the point of reliability, sometimes a bit gets flipped by cosmic rays, or maybe the hardware is just crap to begin with. It takes only one tiny error in one calculation among tens of millions to invalidate the whole result.

Also the chips change, and the version of the program changes. Maybe the first LL test was done eight years ago on a Nehalem microarchitecture using SSE2, and the second LL test was done a few months ago on a Skylake using AVX-512. Or maybe the second test used a different program altogether, like Mlucas, or CUDALucas or gpuOwL, which use the graphics card instead of the CPU.

The potential sources of erroneous results include hardware unreliability, algorithmic implementation flaws, and human malice. Current procedures provide a high degree of confidence, but I don't know what would constitute "100%" certainty that a Mersenne number is composite, short of discovering a factor. 
2018-01-27, 19:45  #7 
Jan 2018
5 Posts 
You can get 100% certainty by computing Lucas-Lehmer using fixed-point math. Distinct hardware gives different floating-point results (FPU, SIMD, GPU). The ALU, though, is deterministic across the same family of hardware (I have first-hand experience of this on x86-64). Now, whether you can compute Lucas-Lehmer using fixed-point math in a practical amount of time, I don't know.
Last fiddled with by 6pac on 2018-01-27 at 20:03 
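For what it's worth, exact-integer arithmetic of the kind being described is easy to write down; the sketches earlier in this thread are already "fixed-point" in this sense, since bignum arithmetic is exact and any two runs on any hardware produce bit-identical results. A minimal deterministic version (illustrative only; the catch is speed, since without floating-point FFT multiplication this is hopelessly slow at GIMPS exponent sizes):

Code:
def ll_is_prime(p):
    """Exact-integer Lucas-Lehmer test of M_p, for p an odd prime."""
    M = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % M    # exact bignum arithmetic, no rounding
    return s == 0

assert ll_is_prime(61)         # M61 is prime
assert not ll_is_prime(67)     # M67 is composite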
2018-01-27, 20:12  #8  
"Forget I exist"
Jul 2009
Dartmouth NS
10000011100010_{2} Posts 
Quote:


2018-01-27, 21:05  #9 
Jan 2018
5 Posts 
I see what you mean. I guess there are two possible views on that. You could be a purist (for lack of a better term) and say "because of, say, programming errors, 100% is not doable; you will always have to cross-check the result" (which is correct). Or you could say "we could complete a fixed-point math calculation, for example, cross-check it, and assume we've achieved 100%". Both views are interesting in my opinion. Personally, I would like to know we've achieved the latter, just for the challenge. Right now, assuming no hardware, programming, or cosmic-ray errors, we know we don't achieve 100% (or maybe we do).
Last fiddled with by 6pac on 2018-01-27 at 21:28 
2018-01-27, 21:15  #10  
"Forget I exist"
Jul 2009
Dartmouth NS
2·3·23·61 Posts 
Quote:


2018-01-27, 21:41  #11  
Jan 2018
5_{16} Posts 
Quote:
Last fiddled with by 6pac on 2018-01-27 at 21:43 

Similar Threads  
Thread  Thread Starter  Forum  Replies  Last Post 
Double checking  gd_barnes  Riesel Prime Search  71  2022-10-02 17:21 
Double Checking on GPU72  bayanne  GPU to 72  17  2013-12-25 18:16 
What about double-checking TF/P-1?  137ben  PrimeNet  6  2012-03-13 04:01 
Double checking  Unregistered  Information & Answers  19  2011-07-29 09:57 
Double Checking Factors  eepiccolo  Software  6  2003-03-10 05:01 