#1
6pac
Jan 2018
5 Posts
Hi,
I've read this: https://www.mersenne.org/various/math.php

Overall, I'm wondering: does GIMPS achieve 100% certainty in its primality testing? Does the double-checking step ensure 100% certainty, and if so, can you please elaborate a little on how it works? Why does re-running the Lucas-Lehmer test ensure 100% certainty? What changes between runs?

As side questions, I was wondering:
1) Is the convolution error check so slow that, hypothetically, it could not be run at every iteration and still finish in less than, say, 10 years?
2) Have you tested the all-integer weighted transform, which as far as I can tell should be lossless? Hypothetically, could it not be used to perform a primality test and finish in less than, say, 10 years?

Thank you,
#2
LaurV
Romulan Interpreter
"name field"
Jun 2011
Thailand
10,273 Posts
It is called "shift". See here. (The link is to the last section, called "double checking", but the page is built in such a way that you can't scroll below the bottom of the page; that is why the link seems to start in the middle of the "LL" section.)
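To make the mechanism concrete, here is a rough Python sketch of a shifted run (an editorial illustration, not GIMPS code): the working value is kept rotated by k bits, i.e. multiplied by 2^k mod Mp; squaring doubles the shift; and the final residue is rotated back before comparison. A hardware bit-flip would therefore land on different bit positions in two differently-shifted runs, so matching residues are strong evidence both runs were error-free.

```python
# Sketch of a shifted Lucas-Lehmer run (illustrative, not Prime95 code).
# The working value is stored rotated left by k bits, i.e. multiplied by
# 2^k mod M_p. Squaring doubles the shift, and the "-2" term must be
# rotated to match. Un-rotating at the end must reproduce the unshifted
# residue exactly.

def ll_residue(p, shift=0):
    m = (1 << p) - 1                # M_p = 2^p - 1
    s = (4 << shift) % m            # starting value 4, rotated by `shift` bits
    k = shift
    for _ in range(p - 2):
        s = (s * s) % m             # squaring doubles the shift count...
        k = (2 * k) % p             # ...mod p, because 2^p = 1 (mod M_p)
        s = (s - (2 << k)) % m      # subtract 2, rotated by the new shift
    return (s * pow(2, (p - k) % p, m)) % m   # rotate back k bits

# Runs with different shifts must yield the same final residue:
assert ll_residue(61, 0) == ll_residue(61, 17) == 0   # M61 is prime
assert ll_residue(67, 0) == ll_residue(67, 5) != 0    # M67 is composite
```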
#3
retina
Undefined
"The unspeakable one"
Jun 2006
My evil lair
2×3²×7×53 Posts
Quote:

So for there to be a missed prime there would have to be two bad tests that manage to match on the final 64 bits. By my probably incorrect maths, that would be approximately a (1 in 2^64) × (4%)^2 chance, where 4% is the average error rate.

Primes are a wholly different matter and are checked much more thoroughly, many times on many systems. The chances of a false prime are considerably smaller than those of a missed prime. Still not 100%, but approaching very, very close to it.

Last fiddled with by retina on 2018-01-27 at 15:01 Reason: "a having" --> "having a"
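Spelled out as a quick back-of-envelope computation (same assumptions as above: independent ~4% error rates and a uniformly random 64-bit residue on error, neither of which is an official GIMPS figure):

```python
# Rough arithmetic for the missed-prime estimate above: both runs bad
# (~4% each, assumed independent) AND the two wrong 64-bit residues
# colliding by chance.
error_rate = 0.04
p_missed = error_rate ** 2 * (1 / 2 ** 64)
print(f"~{p_missed:.1e}")   # ~8.7e-23
```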
#4
6809 > 6502
"""""""""""""""""""
Aug 2003
101×103 Posts
5·7·311 Posts
Interim residues for primes are also cross-checked, so the chance of error is even lower. If we compare residues every 10,000,000 iterations, that drives down the probability of errors causing a false positive. Also, using different FFT sizes reduces the probability that other errors sneak in.
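A hypothetical sketch of that checkpointing idea (the interval and the "res64" format here are illustrative, not Prime95's actual reporting protocol): two independent runs record the low 64 bits of the working value at fixed intervals, so a mismatch localizes an error to one window instead of only being detected at the very end.

```python
# Hypothetical sketch of interim-residue cross-checking. Each run records
# the low 64 bits ("res64") of the working value at fixed checkpoints;
# two independent runs must agree at every checkpoint.

CHECK_EVERY = 10   # a GIMPS-scale run would use something like 10,000,000

def ll_with_checkpoints(p):
    m = (1 << p) - 1
    s = 4
    res64s = []
    for i in range(1, p - 1):           # the p - 2 LL iterations
        s = (s * s - 2) % m
        if i % CHECK_EVERY == 0:
            res64s.append(s & 0xFFFFFFFFFFFFFFFF)   # low 64 bits
    return s == 0, res64s

is_prime, chk_a = ll_with_checkpoints(89)   # M89 is prime
_, chk_b = ll_with_checkpoints(89)          # an independent second run
assert chk_a == chk_b                       # every checkpoint must match
```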
#5
6pac
Jan 2018
5 Posts
edited

Is 100% achievable in a practical amount of processing time (say 10 years)? Maybe we've done it before for some big Mi?

Last fiddled with by 6pac on 2018-01-27 at 15:50
#6
Sep 2003
3×863 Posts
As LaurV pointed out, the shift changes.

Also the hardware changes. Different people use different computers. The principal source of error is hardware problems. Sometimes people overclock their machines past the point of reliability, sometimes a bit gets flipped by cosmic rays, or maybe the hardware is just crap to begin with. It takes only one tiny error in one calculation among tens of millions to invalidate the whole result.

Also the chips change, and the version of the program changes. Maybe the first LL test was done eight years ago on a Nehalem microarchitecture that used SSE2, and the second LL test was done a few months ago on a Skylake using AVX-512. Or maybe the second test used a different program altogether, like Mlucas, or like CUDALucas and gpuOwL, which use the graphics card instead of the CPU.

The potential sources of erroneous results include hardware unreliability, algorithmic implementation flaws, and human malice. Current procedures provide a high degree of confidence, but I don't know what would constitute "100%" certainty that a Mersenne number is composite, short of discovering a factor.
#7
6pac
Jan 2018
5 Posts
You can get 100% certainty by computing Lucas-Lehmer using fixed-point math. Distinct hardware gives different floating-point results (FPU, SIMD, GPU). The ALU, though, is deterministic across the same family of hardware (I have first-hand experience of this on x86-64). Whether you can compute Lucas-Lehmer using fixed-point math in a practical amount of time, I don't know.
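A minimal sketch of that all-integer idea (illustrative only, not a practical implementation): Python's arbitrary-precision integers make every step exact and bit-for-bit reproducible on any machine. The catch is speed; the real GIMPS clients use floating-point FFT multiplication precisely because plain big-integer squaring is hopelessly slow at current exponent sizes.

```python
# Minimal all-integer Lucas-Lehmer test: every operation is exact, so
# the result is deterministic on any hardware. Far too slow for
# GIMPS-sized exponents, but it illustrates the "no rounding error"
# property of integer arithmetic.

def lucas_lehmer(p):
    """True iff M_p = 2**p - 1 is prime, for an odd prime p."""
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m    # exact; no floating point anywhere
    return s == 0

# The p values giving Mersenne primes are picked out correctly:
print([p for p in (3, 5, 7, 11, 13, 17, 19, 23, 29, 31) if lucas_lehmer(p)])
# -> [3, 5, 7, 13, 17, 19, 31]
```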
Last fiddled with by 6pac on 2018-01-27 at 20:03 |
#8
"Forget I exist"
Jul 2009
Dartmouth NS
10000011100010₂ Posts
Quote:
#9
6pac
Jan 2018
5 Posts
I see what you mean. I guess there are two possible views on that. You could be a purist (for lack of a better term) and say "because of, say, programming errors, 100% is not doable; you will always have to cross-check the result" (which is correct). Or you could say "we could complete a fixed-point math calculation, for example, cross-check it, and assume we've achieved 100%". Both views are interesting in my opinion. Personally, I would like to know we've achieved the latter, just for the challenge. Right now, assuming no hardware, programming, or cosmic errors, we know we don't do 100% (or maybe we do).

Last fiddled with by 6pac on 2018-01-27 at 21:28
#10
"Forget I exist"
Jul 2009
Dartmouth NS
2·3·23·61 Posts
Quote:
#11
6pac
Jan 2018
5₁₆ Posts
Quote:
Last fiddled with by 6pac on 2018-01-27 at 21:43 |
Thread | Thread Starter | Forum | Replies | Last Post |
Double checking | gd_barnes | Riesel Prime Search | 71 | 2022-10-02 17:21 |
Double Checking on GPU72 | bayanne | GPU to 72 | 17 | 2013-12-25 18:16 |
What about double-checking TF/P-1? | 137ben | PrimeNet | 6 | 2012-03-13 04:01 |
Double checking | Unregistered | Information & Answers | 19 | 2011-07-29 09:57 |
Double Checking Factors | eepiccolo | Software | 6 | 2003-03-10 05:01 |