(Preying for) World Record P1!
I'd like to organize a thread to find World Record Factors with P1 on small exponents. I'll refer to this project as WR P1 (at the risk of it sounding too pretentious, perhaps :lol:). That would make this a sub-subproject of GIMPS! It's similar to petrw1's <2k (sub twok, <twok...) project, although that name is much catchier than WR P1, in my opinion :razz:
All of these small exponents will need to be P1'ed to very high bounds. For now, it's best to only run stage 1 on these exponents (see the quoted post below). However, this is where I'm stuck: the maths behind P1 escapes me, let alone how to convert previously done GMP-ECM work into equivalent P1 bounds.

[QUOTE=Prime95;595226]There will not be lots of easy to find factors. GMPECM has been run on these expos  likely to pretty big bounds. That's not to say it wouldn't be a good project to redo P1 on all small exponents. Suggestion: Remember that prime95 is not multithreaded for small exponents. So, set up a worktodo.txt with work for every core. Just run stage 1, go deep. You're accumulating stage 1 results for a stage 2 run at a later date. There are more advances coming for stage 2 P1 on small expos (better multithreading). When that version is ready, switch prime95 to one worker using all cores. Feed it all those stage 1 results you've accumulated. BTW, a catchy project name (like petrw1's 20M project) and coordinating thread might be a good idea![/QUOTE]

With all of that out of the way, I need help devising proper bounds for all of these ranges and exponents. As I've said before, the math escapes me, but hopefully I can learn from this project! What I can do for now, however, is draft up a way of organizing this effort. Here goes nothing:

I suggest we limit our effort to the 1M range for the foreseeable future. Our current efforts should be focused on the 100k range. (Once this range has been completed, we take the next 100k range, and so on.) Within this 100k range, I'd like it to be possible to 'reserve'/claim certain ranges, with the smallest range being 1k.
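To illustrate George's suggestion (if I have the syntax right), a stage-1-only worktodo.txt line might look like the one below, with one such line per core. The exponent here is just a hypothetical pick from the 5k range; setting B2 equal to B1 should make prime95 stop after stage 1 and keep the save file for a later stage 2 run:

[CODE]Pminus1=N/A,1,2,5003,-1,2000000000000,2000000000000[/CODE]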
[B]I'd like to reserve the 5k range (all exponents from 5000 to 6000).[/B] I'll attempt to take this range to B1=2 trillion (2,000,000,000,000), as suggested by Zhangrc.

For a bit of backstory: I realized that lots of very small exponents that have already been factored, but are not cofactor-PRP (starting from M[M]4013[/M]), never had any P1 done to date. This prompted me to find more such exponents. Upon my request, James Heinrich added a checkbox to [URL]https://www.mersenne.ca/morefactors.php[/URL] to find exponents that have had no P1 done before. It was later pointed out that these exponents have had extensive GMP-ECM done, and any P1 would have to be run to very high bounds to have a crack at finding more factors.

Credit to Zhangrc for coming up with the name for this project! [QUOTE=Zhangrc;595237]Maybe … The title should be "Fighting for World Record P1." That is really going to find world record size factors. Maybe something like this: [URL="https://www.mersenne.ca/prob.php?exponent=9007&factorbits=95&b1=2000000000000&b2=1.0E%2B17"]https://www.mersenne.ca/prob.php?exponent=9007&factorbits=95&b1=2000000000000&b2=1.0E%2B17[/URL][/QUOTE]
Post reserved for factors found within this project's active ranges.

Taking George Woltman's advice, let's hold off on starting stage 2 on these exponents. For now, I'd like to open up discussion on what B1 would be sufficient to take these ranges up to. Zhangrc suggested the following bound:
[url]https://www.mersenne.ca/prob.php?exponent=9007&factorbits=95&b1=2000000000000&b2=1.0E%2B17[/url] 
[QUOTE=lisanderke;595273]Taking George Woltman's advice, let's hold off on starting stage 2 on these exponents. For now, I'd like to open up discussion on what B1 would be sufficient to take these ranges up to. Zhangrc suggested the following bound:
[url]https://www.mersenne.ca/prob.php?exponent=9007&factorbits=95&b1=2000000000000&b2=1.0E%2B17[/url][/QUOTE] For 5000 to 6000, you can use even larger bounds, say 3e12 to 4e12 (as long as you can bear the run time). Then assume no factor below 2^95 and let Prime95 automatically decide the B2 bound for you.
I don't know if there is a "correct" B1 value. It may simply depend on how much patience you have.
FWIW, my aging quad core can compute 4 exponents around 80K to B1=300M in just under 2 hours. Extrapolating, I should be able to take four 5K exponents to B1=1T in about 17 days. A pretty significant effort, considering the number of exponents to work on.

I'll reserve the 79000 to 82000 area for P1. I've been doing B1=300M, and annoyingly this range has not given me a new factor yet. I think I'll try B1=3 to 5 billion.

@James: Can the P1 probability calculator be changed to allow more than 95 bits of TF? Or, even better, to estimate the proper TF value given the amount of ECM that's been done?

If anyone has old P1 save files to donate that can be used as a starting point, please let us know. We could set up a repository for future P1 efforts.
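For what it's worth, the "about 17 days" figure checks out with a back-of-the-envelope linear model. This is only a sketch: it assumes stage 1 time scales linearly with both B1 and exponent size, while real FFT cost grows slightly faster than linearly with the exponent.

```python
# Back-of-the-envelope check of the "about 17 days" extrapolation.
# Assumes stage 1 time scales linearly with B1 and with exponent size.
baseline_hours = 2.0          # four ~80K exponents to B1=300M, one per core
baseline_b1 = 3e8
baseline_exponent = 80_000

target_b1 = 1e12
target_exponent = 5_000

hours = baseline_hours * (target_b1 / baseline_b1) \
                       * (target_exponent / baseline_exponent)
days = hours / 24
print(f"~{days:.0f} days")    # prints "~17 days"
```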
Try to keep the following in mind while doing large B1. Although 2e12 seems so large that it would barely make any difference, I guess.
[QUOTE=undoc.txt]If doing P1 with a very large B1, the program uses about 47MB of memory to precalculate the product of all small primes below 250 million. The primes above 250 million are processed using a somewhat slower method. You can elect to use more memory to move the 250 million threshold higher. In prime.txt enter: MaxStage0Prime=n (n is in millions, default is 250)[/QUOTE] 
First off: I'm not able to keep editing my posts (mainly the first and second ones in this thread), but given the nature of this thread, I'd like to request the ability to do so. It would be much easier to keep track of reservations if it's all done in one post. Should I contact someone specifically to request this permission, if it's a possibility? If not, I could set up a Google Sheet to keep track of reservations.
I've spread out my worktodo with exponents on each core (6 cores) at B1=2T, and it seems this would take 28 days per exponent (B1 only). There are a little more than 100 exponents with factors found, and only three with no factors found, in the 5k range... B1=200B takes it down to three days and change; B1=20B takes it down to seventeen hours; B1=2B becomes two hours. I'll start out with taking all exponents in 5k to B1=2B, which for almost half of them will be the first P1 ever done (though this is not an accurate indicator of further factoring success ;) )

Until 30.8 (or perhaps a later version) comes along with multithreaded P1 stage 2 on small exponents, I'll hold off on doing stage 2 on these exponents. Instead, further increasing B1 with every round seems like the best option for now. I might be able to muster enough patience to bring B1 up with one day's worth of work per exponent.

@axn: Do you know how the memory usage increases with a higher prime threshold? I assume it doesn't scale linearly in the sense that 500 million would become 100 MB?

@Prime95: Hopefully 3B to 5B gives you some factors! As mentioned above, it might be a bit hard to keep track of reservations for now, but I've noted you down for 79k, 80k and 81k (if I understood correctly).
[QUOTE=lisanderke;595288]@axn, Do you know how the memory usage increases with a higher prime threshold? I assume it doesn't scale linearly in the sense that 500 million would become 100 MB?[/QUOTE]
It should be very nearly linear growth. However, if you're planning on doing B1>1e12, it probably doesn't matter; >99% of stage 1 will be using the slower method anyway. Also note that calculating that product itself takes some time, so even if you have the RAM, you shouldn't go too high with that parameter. Practically, you might set it if you're attempting, say, B1<1e10.

EDIT: Note that if you take a number to some B1 and later on increase it, the continuation will use the slower method. Try to go directly to the largest B1 that you can comfortably do with the faster method (say, 1e10).
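Putting numbers on that near-linear growth: a quick scaling sketch from the 47MB-per-250M figure quoted from undoc.txt above suggests a 500-million threshold would need roughly 94MB, so the 100MB guess was close.

```python
# Linear estimate of MaxStage0Prime memory use, based on the undoc.txt
# figure of ~47MB for the product of all primes below 250 million.
def stage0_memory_mb(threshold_millions: float) -> float:
    return 47.0 * threshold_millions / 250.0

print(stage0_memory_mb(500))   # ~94 MB for a 500M threshold
```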
I really like that idea; to me, finding new factors of small Mersenne numbers is more interesting than finding factors of very large Mersenne numbers.
Why is it considered better to do just B1 at first? Is there a deeper mathematical/efficiency reason for it, or is it just because new features for stage 2 are coming and we want to make use of them? How exactly are stage 1 results fed to mprime? Do I just keep the files from a B1=B2 run, and mprime will automatically notice when I do that exponent again with B2 > B1?
[QUOTE=bur;595365]Why is it considered better to do just B1 at first? Is there a deeper mathematical/efficiency reason for it, or is it just because new features for stage 2 are coming and we want to make use of them?[/quote]
The latter. George is still fine-tuning the software for extreme B2.

There is one more reason: stage 2 runs well when you give it more RAM, so currently the software will only run one stage 2 at a time, but that stage 2 can be multithreaded. Stage 1 on small exponents, however, can't be multithreaded. So to make optimal use of all cores, you run "n" stage 1 jobs in parallel, and then do the stage 2 runs one by one using all cores.

[QUOTE=bur;595365]How exactly are stage 1 results fed to mprime? Do I just keep the files from a B1=B2 run and mprime will automatically notice when I do that exponent again with B2 > B1?[/QUOTE] Yep.
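Concretely, feeding those results back in later should be as simple as re-issuing the same worktodo entry with a bigger B2 (hypothetical exponent and bounds here; prime95 should find the matching stage 1 save file and proceed straight to stage 2):

[CODE]Pminus1=N/A,1,2,5003,-1,2000000000000,100000000000000000[/CODE]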
[QUOTE]So to make optimal use of all cores, you run "n" stage 1 in parallel, and then one by one do the stage 2 using all cores.[/QUOTE]That's good to keep in mind, thanks!
[QUOTE]Instead, further increasing B1 with every round seems like the best option for now.[/QUOTE]I'm not sure I got that right, but is the plan to run consecutive P1 with increasing bounds on the same exponent? Isn't that a waste of the previous runs? 