20030722, 18:43  #1 
Jul 2003
31 Posts 
about Lucas-Lehmer test and Prime95
My question is about the primality test of Mersenne numbers.
I am running a primality test of M16328749, and about every 20 minutes or so I get about 0.06% of the total job done. However, this causes some confusion in my mind. Does anybody know whether the test stops if it suddenly finds some factor of the number, or does it continue until the number is completely factored? To my mind it would be logical for the test to stop after the first factor is found, but how could the test then estimate what percentage of the total work is done, since it is not known when a factor may or may not appear? Does the % mean the job done in the case where the number really is prime? And does that mean that if I see more than 90% of the job finished, I should already start to get nervous? :)
20030722, 19:04  #2 
Dec 2002
Frederick County, MD
370_{10} Posts 
The Lucas-Lehmer test doesn't factor a number; it just tests for primality. Trial factoring is performed on a number before the LL test, and if no factor is found, then the LL test will start.
An LL test must always run to 100% to determine primality. Only then can the program determine whether the number is prime or not, which it does by looking at the LL residue.
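To make concrete what the LL test computes (the textbook recurrence, not Prime95's heavily optimized FFT code), here is a minimal sketch:

```python
def lucas_lehmer(p):
    """Lucas-Lehmer primality test for the Mersenne number M_p = 2**p - 1.

    Textbook version: s_0 = 4, s_{i+1} = s_i**2 - 2 (mod M_p);
    M_p is prime iff the final residue s_{p-2} is 0.
    Assumes p is an odd prime.
    """
    m = (1 << p) - 1  # M_p
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0  # residue zero <=> M_p prime

# M13 = 8191 is prime; M11 = 2047 = 23 * 89 is not
print(lucas_lehmer(13), lucas_lehmer(11))  # -> True False
```

Note that the test runs all p-2 iterations regardless; only the final residue decides primality, which is why the percentage shown is simply iterations completed.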
20030722, 19:52  #3 
Jul 2003
31 Posts 
Strange... if trial factoring found no factors, then this number should already be prime, or does trial factoring only do half the job?
I thought the primality test was still searching for factors: if no factors are found, then the number is prime. Could anyone explain how this test works?
20030722, 20:08  #4 
Dec 2002
Frederick County, MD
2·5·37 Posts 
Trial factoring only looks for small factors. If no factors are found, then the program goes on to the LL test, which only proves or disproves primality; it tells nothing about the factors.
Here is a link to the math explanation on the mersenne.org site: http://www.mersenne.org/math.htm
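As a rough sketch of why trial factoring of Mersenne numbers is feasible at all (this is the idea, not Prime95's sieved implementation): any factor q of M_p must have the form 2kp+1 with q ≡ ±1 (mod 8), and divisibility is checked with modular exponentiation rather than actual division.

```python
def trial_factor(p, max_bits=40):
    """Search for a small factor of M_p = 2**p - 1.

    Any factor q of M_p must satisfy q = 2*k*p + 1 and q = +/-1 (mod 8),
    and q divides M_p iff 2**p == 1 (mod q).
    Returns a factor below 2**max_bits, or None if none is found.
    """
    k = 1
    while True:
        q = 2 * k * p + 1
        if q >= 1 << max_bits:
            return None
        if q % 8 in (1, 7) and pow(2, p, q) == 1:
            return q
        k += 1

# M11 = 2047 = 23 * 89: the very first candidate, 2*1*11 + 1 = 23, divides it
print(trial_factor(11))  # -> 23
```

Only candidates of that special form are tried, which is why trial factoring up to a fixed bit depth finishes quickly; everything beyond that depth is left to the LL test.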
20030722, 20:10  #5  
Aug 2002
312_{8} Posts 
Quote:


20030723, 09:25  #6 
Jul 2003
31 Posts 
Yes, that is what seemed so strange: if trial factoring is just dividing by all primes in a row, we should have to wait some hundreds of years just to hope to see a factor. :)

20030723, 12:37  #7  
"Sander"
Oct 2002
52.345322,5.52471
29×41 Posts 
Quote:


20030723, 13:09  #8 
Jul 2003
37_{8} Posts 
Yes, I have heard this already. I just thought that some hundreds of years could be enough to see some factor. If that number is prime, I fear even to guess how much time it would take. Of course, it all depends on how and on what you run the program. It is almost impossible on our computers alone, but I think it is possible for humanity.

20030728, 02:49  #9 
"Richard B. Woods"
Aug 2002
Wisconsin USA
1E0C_{16} Posts 
A while ago, while writing a posting to illustrate the magnitudes of the numbers we work with, I looked up current estimates of the number of particles in the known universe, the size of the known universe, and related figures.
Without giving links or specific citations, here is a rough calculation to demonstrate the impossibility of trial-factoring to the square root of a Mersenne number of the size GIMPS is now working with:

The known universe could hold (far) fewer than 10^200 neutrons (the most compact elementary particle) if it were packed full, with no empty space. (In reality, the universe is more than 99.99% empty.)

The "Planck time", which is smaller than any time in which any conceivable useful computation could take place, is greater than 10^-44 second. Let's make that 10^-100 second, just so there's no quibbling. So, no computation could be performed in less than 10^-100 second.

Suppose the known universe were packed full of neutrons, and suppose each neutron were a computer capable of performing one trial-factoring division in 10^-100 second. Then altogether the universe could perform 10^300 trial-factoring divisions per second. Now, 10^300 is less than 2^1200, so all the computers in the entire known universe can perform no more than 2^1200 trial-factoring divisions per second.

The estimated age of the known universe is 13 billion years. There are about 31 million seconds in a year. So the universe is about 400 million billion seconds old. That's 4 * 10^17 seconds, which is (much) less than 2^100 seconds. So all the computers in the universe, operating at the fastest possible speed for longer than the age of the universe, could perform no more than 2^1300 trial-factoring divisions.

What is the size of a number that has 2^1300 primes below its square root? Let N be the number, and Q its square root.

Pi(Q) = 2^1300
Pi(Q) = approx Q / ln Q

{Edit: Here I'm making a WAG ("educated guess" :) that Q is about 2^1400.}

The natural log of 2^1400 is less than 1400 (because e^1400 > 2^1400). So if Q were 2^1400, then Pi(Q) would be greater than (2^1400)/1400 > 2^1380. So we know Q < 2^1400, and therefore N < 2^2800.
In other words, if the entire known universe were packed full of neutrons, each neutron were a computer operating at maximum possible speed, and all the little computers ran for the entire age (so far) of the universe, they could completely trial-factor a number no larger than 2^2800.

WAIT! We left out the optimizations, like the fact that each factor has to be of the form 2kp+1 and ±1 mod 8, and so on. Suppose our optimizations allow us to skip 999,999,999,999 out of every 1,000,000,000,000 primes below the square root of the number we're trying to factor. That means we can TF a trillion (~2^40) times as many potential factors, so we want Pi(Q) = 2^1300 * 2^40 = 2^1340 instead of 2^1300.

Hmmm ... looking back, we find that "if Q were 2^1400, then Pi(Q) would be greater than (2^1400)/1400 > 2^1380" is still valid. We don't have to change our previous answer: the entire universe could TF a number no larger than 2^2800. (Using our current trial-factoring methods and optimizations, that is.)

{EDIT: Let me restate that conclusion so that it is clearer when quoted out of context. Even if the entire known universe were one solid computer operating at maximum speed for the entire time since the Big Bang, it could not yet have trial-factored a number larger than 2^2800 all the way to its square root.}

Let's see ... how big are the numbers GIMPS is working on now? Current PrimeNet trial-factoring assignments are greater than M21000000 = 2^21000000 - 1, which is far, far larger than 2^2800.
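The inequalities in this back-of-the-envelope argument can be checked mechanically; here is a sketch using the same rounded constants (10^200 neutrons, 10^-100 s per division, 4*10^17 s of age):

```python
from math import log

# Rounded constants from the argument above
neutrons = 10**200           # upper bound on neutrons packed into the universe
divs_per_sec_each = 10**100  # one TF division per 10**-100 second
age_seconds = 4 * 10**17     # ~13 billion years

# Total trial-factoring divisions ever possible, bounded by 2**1300
total_divs = neutrons * divs_per_sec_each * age_seconds
assert total_divs < 2**1300

# With Q = 2**1400, pi(Q) ~ Q / ln Q exceeds 2**1380 -- more than the
# 2**1340 needed even after a trillion-fold (2**40) optimization speedup
q_bits = 1400
pi_q_lower_bits = q_bits - log(q_bits * log(2), 2)  # log2 of Q / ln Q
assert pi_q_lower_bits > 1380 > 1340

# Hence Q < 2**1400 and N = Q**2 < 2**2800, as the post concludes
print("bound holds: N <", 2**2800 > total_divs)
```

Python's arbitrary-precision integers make the 10^300-scale comparisons exact rather than approximate.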
20030728, 03:34  #10  
Aug 2002
101000000_{2} Posts 
Quote:


20030728, 07:09  #11 
Aug 2002
2×43×101 Posts 
I've always wondered how the bit depth of an RSA key affects its "crackability"...
For example, I use a 2048-bit key for my sshd sessions. If I'm thinking about this right, that key should be impossible to crack, right? I mean, RC5 took forever to crack 64 bits, right? Or am I looking at this wrong?