
mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Number Theory Discussion Group (https://www.mersenneforum.org/forumdisplay.php?f=132)
-   -   How are composite Mersennes sieved (weeded) out? (https://www.mersenneforum.org/showthread.php?t=25033)

jshort 2019-12-15 17:34

How are composite Mersennes sieved (weeded) out?
 
Is there a sieving process that is done before the Lucas-Lehmer primality test is run to check whether a Mersenne number is prime?

Normally (that is, for random integers) one simply uses a primorial, e.g. the product of all primes up to say 10,000, and then computes [TEX]gcd(n,p)[/TEX], where [TEX]n[/TEX] is the primorial up to 10,000 and [TEX]p[/TEX] is the integer whose primality we wish to determine.
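
For illustration, a minimal Python sketch of that primorial-gcd screen (the 10,000 cutoff is from the post; the candidate number is just an assumed example, and sympy is used only to enumerate the primes):

[CODE]from math import gcd, prod
from sympy import primerange

# product of all primes below 10,000 (the "primorial" screen described above)
primorial = prod(primerange(2, 10_000))

# a candidate with exactly one small prime factor: 2^127 - 1 is itself prime,
# so the only prime factor below 10,000 is 23
candidate = 23 * (2**127 - 1)

print(gcd(primorial, candidate))   # -> 23, so the candidate is weeded out
[/CODE]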

However, when it comes to Mersenne numbers, I imagine this simple approach doesn't work too well, since for a given Mersenne number [TEX]M_{p} = 2^{p} - 1[/TEX] we know that any prime factor [TEX]q[/TEX] must have the form [TEX]1 + 2kp[/TEX]. This effectively guarantees that almost every Mersenne number will pass this kind of weeding/screening-out process, since [TEX]gcd(n,2^{p} - 1) = 1[/TEX] almost all the time.
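
In practice that structure is exploited the other way around: rather than taking a gcd with a primorial, one tests only candidate divisors of the form q = 2kp + 1 (which must additionally satisfy q &#8801; &#177;1 mod 8), each by a single modular exponentiation. A rough sketch, with an arbitrary small limit on k:

[CODE]def small_factor(p, k_limit=10_000):
    """Look for a factor q = 2kp + 1 of 2^p - 1 with k <= k_limit."""
    for k in range(1, k_limit + 1):
        q = 2 * k * p + 1
        # prime factors of 2^p - 1 also satisfy q = +/-1 (mod 8)
        if q % 8 in (1, 7) and pow(2, p, q) == 1:
            return q
    return None

print(small_factor(23))   # -> 47   (k = 1)
print(small_factor(11))   # -> 23   (k = 1)
print(small_factor(89))   # -> None (2^89 - 1 is prime)
[/CODE]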

Of course, I suppose that if one is using an extremely large primorial, say up to [TEX]10^{12}[/TEX], this procedure could have a bit of merit, since you've essentially got yourself a polynomial-time factoring algorithm with [TEX]10^{12}[/TEX] as your upper limit.

As an aside, I noticed that the Lucas-Lehmer primality test can sort of be used as a Pollard-rho-style factoring algorithm, with [TEX]x_{1} = 4[/TEX] and [TEX]a = -2[/TEX].

For example, if we were determining the primality of the Mersenne number [TEX]M = 2^{23} - 1[/TEX], the LL test says we need to calculate [TEX]x_{22}[/TEX] over [TEX]Z_{M}[/TEX], where [TEX]x_{1} = 4[/TEX] and [TEX]x_{n+1} = x_{n}^{2} - 2[/TEX]. However, if you stop at [TEX]x_{12}[/TEX] and compute [TEX]gcd(x_{12} - x_{1}, 2^{23} - 1)[/TEX], you obtain the prime factor 47, and the LL test no longer needs to be continued since we now know that [TEX]2^{23} - 1[/TEX] is not prime.
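
A short sketch reproducing that observation, with the indices and the stopping point taken straight from the post:

[CODE]from math import gcd

M = 2**23 - 1             # 8388607 = 47 * 178481, composite
x = 4                     # x_1
seq = [x]
while len(seq) < 12:      # build x_2 .. x_12 with x_{n+1} = x_n^2 - 2 (mod M)
    x = (x * x - 2) % M
    seq.append(x)

print(gcd(seq[11] - seq[0], M))   # gcd(x_12 - x_1, M) = 47
[/CODE]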

Batalov 2019-12-15 18:26

[url]https://www.mersenne.org/various/math.php[/url]

jshort 2019-12-15 22:48

[QUOTE=Batalov;532997][url]https://www.mersenne.org/various/math.php[/url][/QUOTE]

Thank you sir.

Interesting that the p-1 test isn't used straight away, but rather a trivial Sieve of Eratosthenes.

Anyways, I'd imagine that the bounds B and B' must be awfully small.

kriesel 2019-12-16 02:22

[QUOTE=jshort;533013]Interesting that the p-1 test isn't used straight away, but rather a trivial Sieve of Eratosthenes.

Anyways, I'd imagine that the bounds B and B' must be awfully small.[/QUOTE]The order (TF first, lowest bit levels first, P-1 later) is the outcome of doing first whatever gives the most exponents factored per unit of effort, and so maximizes the time saved.
[URL]https://www.mersenneforum.org/showpost.php?p=508523&postcount=6[/URL]
P-1 bounds are also typically optimized to maximize the probable time saved.
[URL]https://www.mersenneforum.org/showpost.php?p=522257&postcount=23[/URL]
Running P-1 to appropriate bounds is a rather lengthy computation compared to the beginning TF levels.
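
For readers who haven't seen the P-1 step being discussed, here is a rough sketch of stage 1 in its textbook form (the base 3 and the tiny example bounds are just illustrative; production code such as Prime95 works modulo M_p with large FFT multiplications and adds a stage 2 up to a second bound B2):

[CODE]from math import gcd
from sympy import primerange

def pminus1_stage1(p, B1):
    """Rough P-1 stage 1 for M_p = 2^p - 1 with smoothness bound B1."""
    M = 2**p - 1
    E = p                            # factors q satisfy q = 1 (mod 2p), so include p
    for r in primerange(2, B1 + 1):
        rk = r
        while rk * r <= B1:          # largest power of r not exceeding B1
            rk *= r
        E *= rk
    g = gcd(pow(3, E, M) - 1, M)     # stage-1 powering, then one gcd
    return g                         # 1: no smooth factor found; M: bounds too large

print(pminus1_stage1(23, 50))        # 47 divides the result, since 47 - 1 = 2 * 23
[/CODE]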

CRGreathouse 2019-12-16 07:21

[QUOTE=jshort;532992]Normally (that is, for random integers) one simply uses a primorial, e.g. the product of all primes up to say 10,000, and then computes [TEX]gcd(n,p)[/TEX], where [TEX]n[/TEX] is the primorial up to 10,000 and [TEX]p[/TEX] is the integer whose primality we wish to determine.

However, when it comes to Mersenne numbers, I imagine this simple approach doesn't work too well, since for a given Mersenne number [TEX]M_{p} = 2^{p} - 1[/TEX] we know that any prime factor [TEX]q[/TEX] must have the form [TEX]1 + 2kp[/TEX]. This effectively guarantees that almost every Mersenne number will pass this kind of weeding/screening-out process, since [TEX]gcd(n,2^{p} - 1) = 1[/TEX] almost all the time.[/QUOTE]

Right. As an improvement you could use a product of just the numbers of the appropriate form, but the numbers are just so big that taking a single gcd is very time-consuming. If you're doing a double-check, the numbers are about 50 MB, and if you gather together 50 MB worth of candidate factors (this would be about 8 million of them) you'd need tens of seconds, or ~10^4 clock cycles per factor. Trial division -- or rather the specialized operation that we call trial division in this context -- is much more efficient.
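
To make the comparison concrete at a toy scale, here is a sketch of the "product of candidates of the right form" idea versus the per-candidate test; the exponent 43 and the k range are just illustrative, and at double-check sizes the operands become the very large objects described above:

[CODE]from math import gcd, prod

p = 43
M = 2**p - 1                          # 2^43 - 1 = 431 * 9719 * 2099863

# candidate divisors of the appropriate form: q = 2kp + 1 with q = +/-1 (mod 8)
candidates = [2 * k * p + 1 for k in range(1, 1000)
              if (2 * k * p + 1) % 8 in (1, 7)]

# one gcd against the product of all candidates ...
print(gcd(M, prod(candidates)))       # -> 4188889 = 431 * 9719

# ... versus testing each candidate separately (GIMPS-style "trial division")
print([q for q in candidates if pow(2, p, q) == 1])   # -> [431, 9719]
[/CODE]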

