mersenneforum.org > Data COMPLETE!!!! Thinking out loud about getting under 20M unfactored exponents

2021-12-12, 02:14   #1002
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006

5181 Posts

Quote:
 Originally Posted by Luminescence
 I also thought of TF in the lower ranges; there is no way they can keep up. 30.8 is leaving GPUs completely behind. 26.5M is in its final moments and I don't like to see
 Code:
 Ranges remaining: 100 or more: 9
 Mind if I take down 12.3M? Do you have a probability I should be aiming for?
Thanks ... go for it.

Statistically, bounds of 3M/600M should give 123 factors.
You need 119; that's cutting it close.

Your two options then are higher bounds ... maybe 5M/1000M, which should give 144 factors.
Or, if you or someone takes it to 73 bits, you should get 25 factors via TF.
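The "cutting it close" call can be made concrete with a quick binomial tail calculation. A minimal Python sketch: the candidate count is not stated in the thread, so it is inferred from the numbers above (123 expected factors at an 8.62% per-exponent rate implies roughly 1427 exponents):

```python
import math

# Assumed inputs, inferred from the post above (not stated directly):
# 123 expected factors at 8.62% per exponent implies ~123/0.0862 ~= 1427
# candidate exponents in the range.
p = 0.0862          # per-exponent P-1 success probability at B1=3M, B2=600M
n = round(123 / p)  # ~1427 candidate exponents (inferred)
need = 119          # factors still needed

# Exact binomial tail: P(at least `need` successes in n trials).
p_below = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(need))
p_enough = 1.0 - p_below
sigma = math.sqrt(n * p * (1 - p))  # standard deviation of the factor count

print(f"n={n}, mean={n * p:.1f}, sigma={sigma:.1f}")
print(f"P(>= {need} factors) = {p_enough:.2f}")
```

With a mean near 123 and a standard deviation near 10.6, clearing 119 comes out at only about a two-in-three chance, which matches the "cutting it close" read.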

2021-12-12, 02:28   #1003
Luminescence

Oct 2021
Germany

2²×5² Posts

Quote:
 Originally Posted by petrw1 Thanks ... go for it. Statistically, Bounds of 3M/600M should give 123 factors. You need 119; that's cutting it close. Your two options then are higher bounds ... maybe 5M/1000M should give 144 factors. Or if you or someone takes it to 73 bits you should get 25 factors via TF.
Ok, thanks a lot! I'm gonna TF myself in parallel.

2021-12-12, 03:18   #1004
VBCurtis

"Curtis"
Feb 2005
Riverside, CA

5,279 Posts

Quote:
 Originally Posted by petrw1 Thanks ... go for it. Statistically, Bounds of 3M/600M should give 123 factors. You need 119; that's cutting it close. Your two options then are higher bounds ... maybe 5M/1000M should give 144 factors. Or if you or someone takes it to 73 bits you should get 25 factors via TF.
I don't think you're thinking big enough for B2. I'm running some 8.6M expos by agreement with masser, and with 8M/6000M the stage 2 time is still lower than the stage 1 time. B2 = 200×B1 isn't enough if you have lots of RAM.

Perhaps those without 32GB+ of RAM should leave the tough ranges to machines that benefit so much from big memory? My timings are with 45GB assigned to P95 on a 64GB/socket machine.

2021-12-12, 03:55   #1005
Luminescence

Oct 2021
Germany

64₁₆ Posts

Quote:
 Originally Posted by VBCurtis I don't think you're thinking big enough for B2. I'm running some 8.6M expos by agreement with masser, and with 8M/6000M stage 2 time is still lower than stage 1 time. B2 = 200*B1 isn't enough, if you have lots of RAM. Perhaps those without 32GB+ ram should maybe not do the tough ranges, since they benefit so much from big memory? My timings are with 45GB assigned to P95 on a 64GB/socket machine.
I've just played around a bit with the probability calculator. I'm going to look at the runtime with 1.1M/1500M, since I can get around 120GB out of three machines. Even with a B2:B1 ratio of 1200, stage 2 was still faster by a few minutes in 26.5M.

Last fiddled with by Luminescence on 2021-12-12 at 03:56

2021-12-12, 04:15   #1006
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006

3·11·157 Posts

Quote:
 Originally Posted by VBCurtis I don't think you're thinking big enough for B2. I'm running some 8.6M expos by agreement with masser, and with 8M/6000M stage 2 time is still lower than stage 1 time. B2 = 200*B1 isn't enough, if you have lots of RAM. Perhaps those without 32GB+ ram should maybe not do the tough ranges, since they benefit so much from big memory? My timings are with 45GB assigned to P95 on a 64GB/socket machine.
I freely admit my B2 is likely low and I probably should have chosen a higher multiple.
I needed to pick a number for my spreadsheet to calculate the expected factors.

On my i7-7820X (8-core) with 24GB of RAM allocated, GIMPS chooses 256x.

3M/600M (200x) is 8.62%, 123 expected factors and 19.8 GHz-days
3M/1200M (400x) is 9.38%, 139 expected factors and 38 GHz-days

@Luminescence ... a little testing will tell you the B1/B2 you need for your desired percentages.
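One rough yardstick for comparing the two options above is expected factors per GHz-day. A small sketch using only the two data points quoted above ("factors per GHz-day" is my own metric, not one used in the thread):

```python
# Per-exponent probability and per-exponent cost, from the post above.
options = {
    "3M/600M  (200x)": (0.0862, 19.8),  # (success probability, GHz-days)
    "3M/1200M (400x)": (0.0938, 38.0),
}

for name, (prob, cost) in options.items():
    print(f"{name}: {prob / cost:.5f} expected factors per GHz-day")
```

Doubling B2 here buys about 9% more probability for 92% more effort, so the cheaper bounds win on raw efficiency; the higher bounds only pay off when the goal is squeezing extra factors out of a fixed set of exponents, which is exactly the under-2000 situation.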

2021-12-12, 06:02   #1007
axn

Jun 2003

5×29×37 Posts

I'm guessing 1.5m/1.2g (800x) should comfortably get the job done.

@Luminescence, how much RAM do you have in a single machine (if different amounts, what's the largest one)?
2021-12-12, 06:48   #1008
Luminescence

Oct 2021
Germany

144₈ Posts

Quote:
 Originally Posted by axn I'm guessing 1.5m/1.2g (800x) should comfortably get the job done. @Luminescence, how much RAM do you have in a single machine (if different amounts, what's the largest one)?
Three with 128GB total (don't know how much barebone server Ubuntu 20.04 requires) and one with 64GB (Win10, daily machine)

Last fiddled with by Luminescence on 2021-12-12 at 06:56

2021-12-12, 07:01   #1009
axn

Jun 2003

5·29·37 Posts

Quote:
 Originally Posted by Luminescence Three with 128GB total (don't know how much barebone server Ubuntu 20.04 requires) and one with 64GB (Win10, daily machine)
Ok. I missed it when you said that earlier as well (3 with 120GB).

Then probably 1m/2g (2000x) should be looked at. The Windows machine (64GB) can do 1.5m/1.2g.

2021-12-12, 13:11   #1010
LaurV
Romulan Interpreter

"name field"
Jun 2011
Thailand

9962 Posts

We are, for a little while, back to tickling 11M exponents with P-1. Santa was nice this year. Starting 11.2 and 11.0, in parallel.
2021-12-12, 13:16   #1011
alpertron

Aug 2002
Buenos Aires, Argentina

1,447 Posts

After using a system based on the i5-3470 for 8 years, this week I upgraded my desktop to an i5-11400 with 32 GB of RAM. I disabled turbo boost because the CPU draws too much power when running at 100%, so the processor runs at 2.6 GHz.

I populated Prime95 with the list of worst P-1 ranges in 10-10.9M from https://www.mersenne.ca/pm1_worst.php, using bounds 2M/1288M (B2 was selected by Prime95). Each number takes 53 minutes. So far Prime95 has processed 51 numbers and found only one factor, of M10938773, in stage 1.

I'm using Prime95 30.8 build 4. It uses 20 GB.

Last fiddled with by alpertron on 2021-12-12 at 13:22
2021-12-12, 15:05   #1012
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006

3·11·157 Posts

Quote:
 Originally Posted by alpertron After using 8 years a system based on i5 3470, this week I upgraded my desktop to i5 11400 with 32 GB of RAM. I disabled the turbo boost because the CPU drains too many watts when running 100%, so the processor runs at 2.6 GHz. I populated Prime95 with the the list of worst ranges of P-1 in 10-10.9M from https://www.mersenne.ca/pm1_worst.php. I used the bounds 2M / 1288M (B2 was selected by Prime95). Each number requires 53 minutes. Up to this moment, Prime95 processed 51 numbers and it found only one factorization, M10938773, in step 1. I'm using Prime95 30.8 build 4. It uses 20 GB.
Wow, thanks ... I assure you the long-run average will be much better than one factor per 51 numbers.
Do I understand correctly that you are working on all of 10.0 to 10.9?
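For what it's worth, one factor in 51 tries need not mean anything is off. A hedged sketch: the per-exponent success rate for 2M/1288M on these already-worked 10M exponents is not stated anywhere in the thread, so the 5% below is purely an assumed illustrative value:

```python
# Assumption: ~5% per-exponent success rate (illustrative only; the real
# rate for B1=2M, B2=1288M on these exponents is not given in the thread).
p, n = 0.05, 51

# P(0 or 1 factors in n independent attempts), exact binomial.
p_at_most_one = (1 - p)**n + n * p * (1 - p)**(n - 1)
print(f"P(<= 1 factor in {n} attempts at {p:.0%}) = {p_at_most_one:.2f}")
```

At that rate a run this dry happens better than one time in four, so 51 numbers is simply too small a sample to judge the bounds by.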

