mersenneforum.org memory usage in stage 2 of P-1 factoring

 2003-09-06, 22:03 #1 gckw   Sep 2003 38 Posts memory usage in stage 2 of P-1 factoring I was reading the readme.txt, and it said that stage 2 of P-1 factoring would be more efficient if more memory is allocated for it. How does it actually use the memory more efficiently, e.g. when using 512MB instead of 8MB? Does the factoring run faster? Also, it stated: "If at all in doubt, leave the settings at 8MB. The worst that will happen is you end up running a Lucas-Lehmer primality test when stage 2 of P-1 factoring would have found a factor." Does this mean that using less memory yields a higher probability of NOT finding a factor?
2003-09-06, 22:30   #2
ET_
Banned

"Luigi"
Aug 2002
Team Italia

2·2,417 Posts

Quote:
 Does this mean that using less memory yields a higher probability of NOT finding a factor?
Sort of.

Stage 2 of P-1 factoring allocates memory to "make comparisons" and try to find a factor. The more memory you devote to stage 2, the greater the chance of finding a factor. But AFAIK we're talking about a 1-2% difference in probability.

Luigi

2003-09-07, 00:58   #3
ewmayer
2ω=0

Sep 2002
República de California

11676₁₀ Posts

Quote:
Originally Posted by ET_
Quote:
 Does this mean that using less memory yields a higher probability of NOT finding a factor?
Sort of.

Stage 2 of P-1 factoring allocates memory to "make comparisons" and try to find a factor. The more memory you devote to stage 2, the greater the chance of finding a factor. But AFAIK we're talking about a 1-2% difference in probability.

Luigi
No, what is happening is that stage 2 precomputes a set of even powers of the stage 1 residue R, e.g. R^2, R^4, etc. If your stage 2 primes limit is B2 and the largest gap between primes < B2 is no more than your largest precomputed power, then you need only do one modular multiply by one of your precomputed powers to cover the gap between consecutive primes. If you have less memory, you'll occasionally need to do an extra mul by two of the precomputed powers of R to cover a large gap between primes, which costs extra CPU time. But the percentage of such extra muls decreases rapidly with extra allocated memory, so at some point (perhaps a few dozen precomputed powers, say enough to go up to R^50) you get rapidly diminishing returns. R^50 would imply 25 precomputed even powers of the stage 1 residue R, so for a 1024K-length FFT that would be 8MB per power, i.e. around 200MB. In other words, the difference between 256MB and 512MB of allocated memory is likely to be slight.
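ewmayer's prime-gap scheme can be sketched in miniature. This is a toy illustration only, using Python big-ints in place of FFT-based multiplication; the numbers, function names, and the simple "repeat the largest power" fallback for oversized gaps are all assumptions for the example, not Prime95's actual code:

```python
from math import gcd

def is_prime(n):
    """Trial-division primality test (toy scale only)."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def pminus1_stage2(N, R, B1, B2, max_power=50):
    """P-1 stage 2 sketch: precompute even powers R^2..R^max_power (mod N),
    then walk the primes q in (B1, B2]. Gaps between consecutive odd primes
    are even, so advancing from R^q_prev to R^q usually costs ONE modular
    multiply by a precomputed power. Gaps larger than max_power need extra
    multiplies -- the CPU-time cost of a small memory allowance."""
    even_pow = {e: pow(R, e, N) for e in range(2, max_power + 1, 2)}
    primes = [q for q in range(B1 + 1, B2 + 1) if is_prime(q)]
    q_prev = primes[0]
    x = pow(R, q_prev, N)          # R^q for the current prime q
    acc = (x - 1) % N              # accumulate product of (R^q - 1) mod N
    extra_muls = 0
    for q in primes[1:]:
        gap = q - q_prev
        while gap > max_power:     # large gap: extra multiplies needed
            x = x * even_pow[max_power] % N
            gap -= max_power
            extra_muls += 1
        x = x * even_pow[gap] % N
        acc = acc * (x - 1) % N
        q_prev = q
    return gcd(acc, N), extra_muls

# Toy run: 223 - 1 = 2*3*37, so stage 1 with B1=10 covers 2*3 and
# stage 2 must supply the prime 37 from (B1, B2].
N = 223 * 1000003
R = pow(3, 2520, N)   # toy stage 1 residue; 2520 = 2^3*3^2*5*7 (prime powers <= B1)
g, extra = pminus1_stage2(N, R, B1=10, B2=50)
# g == 223: the (R^37 - 1) term vanishes mod 223, so the gcd exposes the factor
```

With `max_power=50` every gap between consecutive primes up to 50 is covered by one precomputed power, so `extra_muls` stays 0 here; shrinking `max_power` (less memory) makes the fallback loop fire on the wider gaps.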

 2003-09-07, 06:56 #4 patrik     "Patrik Johansson" Aug 2002 Uppsala, Sweden 5²·17 Posts And the reason the probability of finding a factor increases is that the program chooses higher bounds B1 and B2 to factor to when it is allocated more memory and can run more efficiently.
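patrik's point can be made concrete: P-1 finds a prime factor p exactly when every prime-power piece of p-1 is at most B1, except for at most one additional prime (to the first power) in (B1, B2]. Raising the bounds therefore widens the set of reachable factors. A toy check, with helper names invented for this sketch:

```python
def factor_int(n):
    """Trial-division factorization: returns {prime: exponent}."""
    f = {}
    d = 2
    while d * d <= n:
        while n % d == 0:
            f[d] = f.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        f[n] = f.get(n, 0) + 1
    return f

def pminus1_reachable(p, B1, B2):
    """True if P-1 with stage 1 bound B1 and stage 2 bound B2 would find
    the prime factor p: every prime-power piece of p-1 must be <= B1,
    except at most one prime, to the first power, in (B1, B2]."""
    over = [(q, e) for q, e in factor_int(p - 1).items() if q**e > B1]
    if not over:
        return True            # stage 1 alone suffices
    if len(over) != 1:
        return False
    q, e = over[0]
    return e == 1 and q <= B2

# 223 - 1 = 2*3*37:
pminus1_reachable(223, B1=10, B2=50)   # True: 37 lies in (10, 50]
pminus1_reachable(223, B1=10, B2=30)   # False: 37 exceeds B2
```

So a larger memory allowance does not change the algorithm's success criterion; it lets the program afford larger B1/B2, which makes more factors satisfy that criterion.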



Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation.
A copy of the license is included in the FAQ.