2004-10-02, 16:32  #1  
Jul 2004
Potsdam, Germany
3·277 Posts 
P-1 not "equal" to x curves ECM with same bounds?
Hi there!
So long, I thought (but was aware I'm not sure) that P1 factoring finds any factor that is smooth enough to fit into the set bounds, whereas ECM has a certain percentage per run. As a result, it would be useless to do ECM curves after a P1 run with the same bounds. But recently, I reread the gmpecm readme file, that contains: Quote:
P.S.: What are the "benefits" of ECM vs. P-1? For one, computation time, of course. I guess the memory requirements are lower as well. How does the effort of running the "expected number of curves" for a factor of a certain digit count compare to one P-1 run? (Of course, this question also depends on the main question of this topic...) Thanks a lot for your help! Last fiddled with by Mystwalker on 2004-10-02 at 16:34 

2004-10-02, 17:44  #2 
"Nancy"
Aug 2002
Alexandria
2,467 Posts 
The confusion partly stems from a sloppy use of the term "smooth".
A number is called B-smooth if all its prime factors are <= B, or (B1,B2)-smooth if all its prime factors are <= B1, except for possibly one that is > B1 but <= B2. By that definition, the factors we find are almost never "smooth", because they are themselves prime and typically > B2! What has to be smooth is the order of the group we are working in.

For the P-1 method, the order of the group modulo the hidden prime factor p is p-1. Therefore, if p-1 is smooth, we find the factor. For ECM, the group order is some number in the interval [p+1-2*sqrt(p), p+1+2*sqrt(p)]. Which particular order we get depends on the sigma parameter of the curve. If we choose sigma at random, the order apparently behaves like a randomly chosen integer from that interval (some details omitted).

So P-1 failing does not imply ECM failing with the same bounds at all - the group order which has to be smooth is different in the two cases. Furthermore, ECM lets us try many different orders by choosing a different sigma each time, until we hit one that is smooth enough. This is the major advantage of ECM over P-1: if the factorization of p-1 contains even one large prime, the P-1 method cannot discover the factor in a reasonable amount of time. With ECM, you can just keep trying different orders. I.e. almost all 50-digit primes are practically impossible to find with P-1, but can be discovered with ECM in an expected few months of CPU time.

P-1, otoh, is much faster in stage 1 and has a genuine advantage if a prime q is known to divide p-1, as is the case for factors of Mersenne numbers Mq. The memory requirements in stage 2 are, in theory, virtually the same; however, some optimizations (e.g. avoiding a lot of extgcds in ECM, which does not apply to P-1) benefit from more memory. 
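To make the smoothness condition concrete, here is a minimal sketch in Python (my own illustration, not Prime95 or GMP-ECM code): a B-smoothness test, and P-1 stage 1, which reveals a prime factor p of N whenever p-1 is B1-smooth.

```python
from math import gcd

def primes_up_to(n):
    """Sieve of Eratosthenes: all primes <= n."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n**0.5) + 1):
        if sieve[i]:
            sieve[i*i::i] = [False] * len(sieve[i*i::i])
    return [i for i, is_p in enumerate(sieve) if is_p]

def is_smooth(n, B):
    """True if every prime factor of n is <= B."""
    for p in primes_up_to(B):
        while n % p == 0:
            n //= p
    return n == 1

def pminus1_stage1(N, B1, a=3):
    """P-1 stage 1: compute a^E mod N, where E is the product of all
    prime powers <= B1, then take gcd(a^E - 1, N).  If p | N and p-1
    is B1-smooth, then (p-1) | E, so a^E == 1 (mod p) by Fermat's
    little theorem, and the gcd reveals p."""
    x = a
    for p in primes_up_to(B1):
        pe = p
        while pe * p <= B1:   # raise p to the largest power <= B1
            pe *= p
        x = pow(x, pe, N)
    g = gcd(x - 1, N)
    return g if 1 < g < N else None  # None: no proper factor found
```

For example, take N = 2089 * 2027: 2089-1 = 2^3 * 3^2 * 29 is 29-smooth, while 2027-1 = 2 * 1013 contains the large prime 1013, so stage 1 with B1 = 29 finds 2089 but no bound below 1013 would find 2027 this way. That one large prime blocking P-1 is exactly the situation the text describes.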
I think the reason Prime95's ECM uses much more memory than its P-1 is that ECM is optimized for speed at the cost of memory, while P-1 is written with memory constraints in mind, on the rationale that ECM will mostly be done on small numbers while P-1 must work well for the huge numbers that later get tested for primality. Alex 