operation trillion digits?
Ok, I am thinking of starting an operation trillion digits. I am finding factors for the range 3321928094941 to 3321928095989 currently.
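(For context on where that range comes from: 2^p exceeds 10^(10^12) once p > 10^12·log_2 10, and candidate exponents must themselves be prime. A quick sketch of the arithmetic, my own illustration rather than anything from this thread:)

```python
# Where "trillion digits" starts: 2^p > 10^(10^12) requires
# p > 10^12 * log2(10) ~= 3321928094887.4.
# Illustration only -- not the poster's code.
from math import ceil, log2

p_min = ceil(10**12 * log2(10))
print(p_min)  # 3321928094888; the quoted range 3321928094941..3321928095989
              # consists of exponents just above this threshold
```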

:crank:
How do you propose to do PRP/LL tests? On what hardware, and how long do you expect a single test to take on it?
Why? No one is currently interested in these factors. It might seem like fun finding them, but honestly, it is worthless.

[QUOTE=mersenneNoob;579485]Ok, I am thinking of starting an operation trillion digits. I am finding factors for the range 3321928094941 to 3321928095989 currently.[/QUOTE]
Please consider helping with 100M-digit to 1G factoring, or up to 10G, instead. There's too little factoring being done to finish those in our lifetimes. I see no point in committing resources to TF that won't be needed for over 50 years, probably over 150 years, unless quantum computing delivers bigly and soon. There is, to my knowledge, no server or database to cover PRP or LL above 1G, or TF or P-1 above 10G. [URL]http://www.mersenneforum.org/showpost.php?p=488511&postcount=9[/URL]

A single PRP test at OBD takes too long for current hardware, short of some serious supercomputer time. There's no P-1 factoring software suitable for OBD yet that can complete one factoring attempt to suitable bounds in a year. There's only one OBD candidate with TF done to adequate depth.

A trillion-digit Mersenne is ~2,000,000 times slower to primality test or P-1 test than an OBD, and since that would currently take much longer than the usual lifetime of human civilizations, there is no software to attempt it and no point at this time in creating software for a futile attempt. [URL]https://www.mersenneforum.org/showpost.php?p=574636&postcount=14[/URL]

If Moore's law persisted at a 2-year doubling, it would take 42 years for a trillion-digit Mersenne to become "only" as much of a challenge as OBD is today, requiring years on the fastest software and a consumer-market GPU for one PRP test. Those 42 years are most of, or beyond, the remaining life expectancy of most GIMPS members.

There comes a point on the number line where it's not even a matter of waiting for Moore's law to make it more tractable, since feature shrink will stall out before reaching the atomic size limit and already stopped providing clock-rate increases ~15 years ago, and where there's not enough available mass on which to store the interim residues for P-1 factoring or PRP, even at 10 bits per particle. (Attempting LL without some yet-to-be-demonstrated bullet-, cannonball-, and nuke-proof error detection and correction would be insane. It's also pretty sketchy at 100M digits, and almost certain to produce errors at 1G.)
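The ~2,000,000x slowdown and the 42-year figure above check out on the back of an envelope. A sketch, assuming PRP cost scales roughly as p^2·log p (about p squarings of p-bit numbers, each costing ~p·log p with FFT multiplication) and taking 3321928171 as a representative OBD exponent:

```python
# Rough scaling check for the claims above -- an estimate, not a benchmark.
from math import log, log2

p_obd = 3_321_928_171         # representative billion-digit (OBD) exponent
p_tril = 3_321_928_094_941    # low end of the quoted trillion-digit range

# ~p squarings per PRP test, each ~p*log(p) with FFT multiplication
ratio = (p_tril / p_obd) ** 2 * (log(p_tril) / log(p_obd))
print(f"work ratio ~{ratio:.2e}")  # order 10^6, same ballpark as ~2,000,000x

doublings = log2(2_000_000)        # taking the post's 2,000,000x at face value
print(f"~{2 * doublings:.0f} years at one doubling per 2 years")  # ~42 years
```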
[QUOTE=mersenneNoob;579485]Ok, I am thinking of starting an operation trillion digits. I am finding factors for the range 3321928094941 to 3321928095989 currently.[/QUOTE]Go ahead if that's what floats your boat. It's your time, your hardware and your power bill.
As others have pointed out, the likelihood of finding a prime in your lifetime is somewhere between nil and negligible, and well towards the lower end of that range in my opinion. It is possible that a theoretical breakthrough might be able to pinpoint primes, but it is almost certain that trial factorization will not be of assistance to a hypothetical proof.
[QUOTE=xilman;579495]It is possible that a theoretical breakthrough might be able to pinpoint primes but it is almost certain that trial factorization will not be of assistance to a hypothetical proof.[/QUOTE]Only if there are several breakthroughs in quantum computing would testing these be practical and TF be useful.

Million, billion, trillion, what the hell is the difference, right? "Why not start designing a living hut for the people who will colonize Uranus already today!?"
For my part, I can promise that, to stay on trend, I will start "operation quadrillion digits" tomorrow too, and I can actually promise several hundred million factors reported in the first day of running alone. Seriously though, Steven Wright was right! [B]"You can't have everything. Where would you put it?"[/B] Print it, frame it, look at this maxim every day, in the mornings.
[QUOTE=Uncwilly;579504]Only if there are several breakthroughs in quantum computing would testing of this be practical and TF would be useful.[/QUOTE]Not necessarily.
A proof may be forthcoming that all Mersenne numbers with exponents of a particular form must be prime and all others must be composite. Likewise, there may be a proof that M_n is composite for all n greater than an explicit bound. If that bound happens to be less than 10^{12}·log_2(10) ... I don't expect either theorem to be proven any time soon, but if either is proven, the proposed computational effort will have been wasted.
There appears to be plenty of TF work available seeking small factors of 2^p - 1 for p currently under consideration by GIMPS.
There is also plenty of work available to try to crack 2^p - 1 which are known to be composite but have no known prime factors, or remaining composite cofactors of partially-factored 2^p - 1. As a personal hobby, doing TF for larger p is unobjectionable, but that's about all you'll achieve. It wouldn't be of any use to GIMPS. For exponents of order 10[sup]12[/sup], PRP or LL tests are out of the question for the foreseeable future. Before looking at trillion-digit numbers, for which all you'll ever be able to determine is whether they have really small factors, it might be nice to crack, say, 2^1277 - 1, a known composite of 385 decimal digits.
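For anyone wondering why TF at such exponents is feasible when a primality test is not: any prime factor q of 2^p - 1 (p prime) has the form q = 2kp + 1 with q ≡ ±1 (mod 8), and checking one candidate costs a single modular exponentiation. A minimal sketch, not the software GIMPS actually uses:

```python
# Minimal trial-factoring sketch (illustration only, not GIMPS software).
# Any prime factor q of 2^p - 1, with p prime, satisfies q = 2*k*p + 1 and
# q = +/-1 (mod 8); if pow(2, p, q) == 1 then q divides 2^p - 1.
def small_factor(p, max_k=100_000):
    for k in range(1, max_k + 1):
        q = 2 * k * p + 1
        if q % 8 in (1, 7) and pow(2, p, q) == 1:
            return q          # q divides 2^p - 1 (prime or not, it divides)
    return None

print(small_factor(11))    # 23 divides 2^11 - 1
print(small_factor(1277))  # None: M1277 has no small factors, as noted above
```

One candidate check costs about the same for p near 3.3×10^12 as it does here, which is why TF in that range is possible at all; it just isn't useful to GIMPS yet.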
[QUOTE=Batalov;579506]people who will colonize Uranus [/QUOTE]
my what?? :w00t: Maybe yours! 
[QUOTE=LaurV;579666][QUOTE=Batalov;579506]people who will colonize Uranus [/QUOTE]my what?? :w00t:
Maybe yours![/QUOTE]:paul: Not "people." "Klingons!" 