mersenneforum.org > Data Let's Optimize P-1 for low exponents. TL;DR in post #1. More in posts 60 and 61.

2022-01-17, 18:52   #56
firejuggler

"Vincent"
Apr 2010
Over the rainbow

101100010111₂ Posts

Quote:
 Originally Posted by firejuggler
I can give you timing for my working range, 3 cores / 1 worker, 10 GB of mem.
8.5M/1.56M: 448K/512K : 1550 sec / 1000 sec

Amend that to 1200/800 seconds; my system was busy.

2022-01-17, 19:44   #57
firejuggler

"Vincent"
Apr 2010
Over the rainbow

17×167 Posts

4.2M/3M: 224K/240K : 1378/850 sec, 26.8341 GHzD
8.5M/1.56M: 448K/512K : 1200/800 sec, 15.0849 GHzD
17M/800K: 896K/1M : 1200/866 sec, 8.4054 GHzD

which feels about the same?

Last fiddled with by firejuggler on 2022-01-17 at 20:23 Reason: Adding the 17M stats
2022-01-17, 21:36   #58
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006

5181₁₀ Posts

Quote:
 Originally Posted by firejuggler
4.2M/3M: 224K/240K : 1378/850 sec, 26.8341 GHzD
8.5M/1.56M: 448K/512K : 1200/800 sec, 15.0849 GHzD
17M/800K: 896K/1M : 1200/866 sec, 8.4054 GHzD
which feels about the same?
Looks like 2x (1.95x) fits your PC pretty well.
We are probably in the ballpark with 2x ... or 2.2x ... or 1.9x.

2022-01-18, 04:32   #59
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006
Saskatchewan, Canada

1010000111101₂ Posts

Signing off until April

So, doing some rough calculations using this (which is admittedly not accurate for v30.8, but still usable):

Looking at exponents where the current B1 is even equal to the recommended B1, because the new B2 is SO much higher, I'm seeing odds of finding a factor close to 10% higher.

I realize now, though, that I cannot only look at current B1 vs. new B1; I also have to look at the current B2 to be sure it is NOT from a 30.8 run (in other words, many thousands of times larger than the current B2 rather than 20x or 30x).

So we would still start with the exponents where the current B1 is the smallest ratio of the new B1 and work down past 10x and even past 5x ... wherever the current B2 is not very, very big.

In any case, this could be my last post with any ciphering before I return in late March.

======================

For those who want to give it a try, the best I can suggest is using post #23 for suggested B1 values ... it uses this formula, referred to in post #39: 2.2^LOG(20,000,000/,2)*1,000,000 ... and post #46 for B1 adjustment where available RAM is somewhat lower or higher than 16GB. I'm leaning towards the top table, which equalizes the factor success percent.

Thanks, and I'll check back in April ... at which time I'll clean up the recommendations and hopefully start myself.

Wayne
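[The B1 formula quoted above appears truncated ("20,000,000/,2"). Assuming the elided argument is the exponent itself, i.e. B1 = 2.2^log2(20,000,000/p) × 1,000,000, a quick sketch:]

```python
import math

def suggested_b1(p):
    # Hedged reading of the post-#39 formula: the elided term is assumed
    # to be the exponent p, giving B1 = 2.2^log2(20,000,000 / p) * 1,000,000.
    return 2.2 ** math.log2(20_000_000 / p) * 1_000_000
```

[For p = 4,200,000 this gives roughly 5.9M, in the same ballpark as the ~6.2M table value quoted later in the thread, which lends some support to that reading.]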
2022-03-13, 21:30   #61
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006
Saskatchewan, Canada

3×11×157 Posts

Related work opportunities

1. Some of you want to try to get all 10K ranges under 200 factors remaining, as a follow-up to the current Under 2000 project. This is not contrary to the discussion in the previous post; it only means you may need to choose even higher B1 values for these 10K ranges of interest and/or TF more.

2. Some have asked whether we will only process unfactored exponents or whether we should also use version 30.8 to further factor currently factored exponents. I have no issue with doing so, though I'm not sure how to generate the lists of exponents already factored.

3. So how can GPUs contribute? I'm not sure how they can help with deep P-1, at least until GPUOwl or other GPU P-1 software has been retrofitted with 30.8 functionality. A couple of thoughts:

a. Tidy up the TF for all the lower ranges, bringing all exponents to the same appropriate TF level. mikr has been systematically TF'ing all lower exponents to 71 bits.
b. Help those in point 1 above get ranges under 200 factors.
c. Mainstream leading-edge TF for PRP work.
2022-03-14, 19:37   #62
nordi

Dec 2016

7·17 Posts

Quote:
 Originally Posted by petrw1 2. Some have asked if we will only process unfactored exponents or if we should also use version 30.8 to further factor currently factored exponents. I have no issue with doing so though I'm not sure how to generate the lists of exponents already factored.
Such a list can be generated with the https://www.mersenne.ca/morefactors.php page.

I'm currently working on the 12.4M range for already factored exponents, because they will soon be PRP-checked by the folks doing PRP-C work. Every factor that I find now (instead of later) means one less PRP check is needed. If someone wants to take the 12.5M range, I'd be happy to share. ;-)

2022-03-21, 14:05   #64
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006
Saskatchewan, Canada

143D₁₆ Posts

@lisanderke Looks good to me. Thanks for the explanation and the help.
2022-03-23, 01:27   #65
DrobinsonPE

Aug 2020

137₁₀ Posts

Just because I like the number 42, I would like to claim the 4.2M range. It looks like there are currently 1992 unfactored exponents. I will be using an i3-9100 with 16GB RAM, so it will take me a while to complete.

According to the table, the stage 1 B1-Neat is 6,200,000, and mprime 30.8 can choose the stage 2. For now I will skip any exponents that have a stage 1 above 6,200,000.

I will start by running a test with the assignment below to make sure it works and see how long it takes, then start generating Pminus1 assignments for the rest of the range.

Pminus1=N/A,1,2,4200109,-1,6200000,0,72

Let me know if someone else is already working here or if there is a better place to start.
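[For anyone generating such assignments in bulk, a minimal sketch, assuming the same field layout as the example line (Pminus1=AID,k,b,n,c,B1,B2,how_far_factored, with B2=0 letting mprime 30.8 choose stage 2); the helper name and second exponent are purely illustrative:]

```python
def pminus1_line(exponent, b1=6_200_000, tf_bits=72):
    # Same field layout as the example assignment above:
    #   Pminus1=AID,k,b,n,c,B1,B2,how_far_trial_factored
    # B2=0 leaves the stage-2 bound for mprime 30.8 to pick.
    return f"Pminus1=N/A,1,2,{exponent},-1,{b1},0,{tf_bits}"

print(pminus1_line(4200109))  # reproduces the test assignment above
```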
2022-03-23, 03:41   #66
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006

3·11·157 Posts

Quote:
 Originally Posted by DrobinsonPE
Just because I like the number 42, I would like to claim the 4.2M range. It looks like there are currently 1992 unfactored exponents. I will be using an i3-9100 with 16GB RAM, so it will take me a while to complete. According to the table, the stage 1 B1-Neat is 6,200,000, and mprime 30.8 can choose the stage 2. For now I will skip any exponents that have a stage 1 above 6,200,000.
Thanks, enjoy, keep us posted.
I have no problem with your proposed strategy.

Just some food for thought for you or others...
When I initially started analyzing these low exponents for candidates, I too thought I should skip any where the current B1 is more than half of the proposed B1.

Then I realized the new B2 is so much higher than the current B2 that even rerunning exponents with the same B1 may have a 3%-4% success rate.

So an alternative strategy might be to look for exponents where the current B2 is less than some percentage of the new B2. You may need to run a test to determine your new B2, since it is very RAM-sensitive.

According to the prob.php function at mersenne.ca, a 10x increase in B2 adds about 2% to the success rate.
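[That alternative strategy can be sketched as a simple filter; the 10x threshold and the prob.php figure are as stated above, while the function name is just illustrative:]

```python
def worth_rerunning(current_b2, expected_new_b2, min_ratio=10):
    # Rerun P-1 only when the new stage-2 bound would be at least
    # min_ratio times the current B2; per mersenne.ca's prob.php,
    # each ~10x increase in B2 adds roughly 2% to the success rate.
    return expected_new_b2 >= min_ratio * current_b2
```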

Thanks
