mersenneforum.org  

Old 2022-01-17, 18:52   #56
firejuggler
 
 
"Vincent"
Apr 2010
Over the rainbow

101100010111₂ Posts

Quote:
Originally Posted by firejuggler View Post
I can give you timings for my working range:
3 cores / 1 worker, 10 GB of memory

8.5M/1.56M: 448k/ 512k : 1550 sec/1000 sec

Amend that to 1200/800 seconds; my system was busy.
Old 2022-01-17, 19:44   #57
firejuggler
 
 
"Vincent"
Apr 2010
Over the rainbow

17×167 Posts

4.2M/3M: 224K/ 240k : 1378/850 sec 26.8341 GHzD
8.5M/1.56M: 448k/ 512k : 1200 sec/800 sec 15.0849 GHzD
17M/800k: 896k/1M: 1200/866 sec 8.4054 GHzD


which feels about the same?

Last fiddled with by firejuggler on 2022-01-17 at 20:23 Reason: Adding the 17M stats
Old 2022-01-17, 21:36   #58
petrw1
1976 Toyota Corona years forever!
 
 
"Wayne"
Nov 2006
Saskatchewan, Canada

5181₁₀ Posts

Quote:
Originally Posted by firejuggler View Post
4.2M/3M: 224K/ 240k : 1378/850 sec 26.8341 GHzD
8.5M/1.56M: 448k/ 512k : 1200 sec/800 sec 15.0849 GHzD
17M/800k: 896k/1M: 1200/866 sec 8.4054 GHzD


which feels about the same?
Looks like 2x (1.95x) fits your PC pretty well.
We are probably in the ballpark with 2x ... or 2.2x ... or 1.9x.
Old 2022-01-18, 04:32   #59
petrw1
1976 Toyota Corona years forever!
 
 
"Wayne"
Nov 2006
Saskatchewan, Canada

1010000111101₂ Posts
Signing off until April

So I did some rough calculations using this,
which admittedly is not accurate for v30.8 ... but still usable.

Looking at exponents where the current B1 is already equal to the recommended B1, because the new B2 is SO much higher I'm seeing odds of finding a factor close to 10% higher.

I realize now, though, that I cannot look only at the current B1 vs. the new B1;
I also have to check the current B2 to be sure it is NOT from a 30.8 run (in other words, many thousands of times B1 rather than 20x or 30x).

So we would still start with the exponents where the current B1 is the smallest fraction of the new B1, and work down past 10x and even past 5x ... wherever the current B2 is not VERY, VERY big.

In any case this could be my last post with any cyphering before I return late March.

======================

For those who want to give it a try, the best I can suggest is using post #23 for suggested B1 values ... it uses this formula, referred to in post #39: 2.2^LOG(20,000,000/<exponent>,2)*1,000,000
... and post #46 for B1 adjustments where available RAM is somewhat lower or higher than 16GB.
I'm leaning towards the top table, which equalizes the factor success percent.

Thanks and I'll check back in April ... at which time I'll clean up the recommendations and hopefully start myself.
Wayne
Old 2022-03-13, 21:10   #60
petrw1
1976 Toyota Corona years forever!
 
 
"Wayne"
Nov 2006
Saskatchewan, Canada

12075₈ Posts
As we prepare to start this project, here are my suggestions

I openly welcome any opinions or counter points.

(Note: the next post discusses related work opportunities)

If you haven't read, or prefer not to read, all the above posts, here it is in a nutshell: we want to P1 lower exponents more aggressively, using the impressive speed of version 30.8, and factor lots more exponents. You need to be on that version to participate adequately here.

I suggest we start at the lower end (< 10M?) and work our way higher as long as it is still productive and interesting.

If you are interested, you need to "reserve" a range of exponents and manually generate the appropriate Pminus1 assignments ... unless someone smarter than me can automate it. Start with the exponents in your range that currently have the lowest bounds. But since the B2 that 30.8 will use is SO much higher than in previous versions, it will be beneficial to reprocess most exponents in most ranges.

I'll generate a list of the most fruitful ranges later in March when I'm home.
In the meantime you're on your own... you can do it!

Chris is working on a supporting GPU72 chart to help as he did with Under 2000.

==============

Proposed minimum B1 values in the table below are based on about 16GB of RAM allocated.
Or calculate for your exponent range using:
Code:
2.2^LOG(20,000,000/<exponent>,2)*1,000,000
If your RAM is somewhat different, it is suggested that B1 be adjusted with this function:
Code:
sqrt( 16 / your GB RAM) × proposed B1
This adjustment is more important if you have less than 16GB.
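For reference, here is a small Python sketch of the two formulas above (the spreadsheet-style LOG(x,2) is the base-2 logarithm); the function names are mine, and the values can be checked against the table below:

```python
import math

def proposed_b1(exponent: int) -> float:
    """Proposed minimum B1: 2.2^log2(20,000,000/exponent) * 1,000,000
    (assumes about 16GB of RAM allocated)."""
    return 2.2 ** math.log2(20_000_000 / exponent) * 1_000_000

def ram_adjusted_b1(exponent: int, ram_gb: float) -> float:
    """Adjust B1 for available RAM: sqrt(16 / GB of RAM) * proposed B1."""
    return math.sqrt(16 / ram_gb) * proposed_b1(exponent)

print(round(proposed_b1(5_000_000)))          # 4840000, matching the table
print(round(proposed_b1(20_000_000)))         # 1000000
print(round(ram_adjusted_b1(10_000_000, 8)))  # less RAM -> higher B1 (~3.11M)
```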

Code:
Exponent	B1		B1-Neat
78125		 548,758,735 	 548,800,000 
156250		 249,435,789 	 249,400,000 
312500		 113,379,904 	 113,400,000 
500000		 66,427,649 	 66,400,000 
625000		 51,536,320 	 51,500,000 
750000		 41,883,644 	 41,900,000 
1000000		 30,194,386 	 30,200,000 
1250000		 23,425,600 	 23,400,000 
1500000		 19,038,020 	 19,000,000 
2000000		 13,724,721 	 13,700,000 
2500000		 10,648,000 	 10,600,000 
3000000		 8,653,645 	 8,700,000 
4000000		 6,238,510 	 6,200,000 
5000000		 4,840,000 	 4,800,000 
6000000		 3,933,475 	 3,900,000 
7000000		 3,300,838 	 3,300,000 
8000000		 2,835,686 	 2,800,000 
9000000		 2,480,116 	 2,500,000 
10000000	 2,200,000 	 2,200,000 
11000000	 1,973,960 	 2,000,000 
12000000	 1,787,943 	 1,800,000 
13000000	 1,632,344 	 1,600,000 
14000000	 1,500,381 	 1,500,000 
15000000	 1,387,133 	 1,400,000 
16000000	 1,288,948 	 1,300,000 
17000000	 1,203,057 	 1,200,000 
18000000	 1,127,325 	 1,100,000 
19000000	 1,060,082 	 1,100,000 
20000000	 1,000,000 	 1,000,000

Last fiddled with by petrw1 on 2022-03-13 at 21:32 Reason: Chris chart
Old 2022-03-13, 21:30   #61
petrw1
1976 Toyota Corona years forever!
 
 
"Wayne"
Nov 2006
Saskatchewan, Canada

3×11×157 Posts
Related work opportunities

1. Some of you want to try to get all 10K ranges under 200 factors remaining, as a follow-up to the current Under 2000 project. This is not contrary to the discussion in the previous post; it only means that you may need to choose even higher B1 values for these 10K ranges of interest and/or TF more.

2. Some have asked if we will only process unfactored exponents or if we should also use version 30.8 to further factor currently factored exponents. I have no issue with doing so though I'm not sure how to generate the lists of exponents already factored.

3. So how can GPUs contribute? I'm not sure how they can help with deep P1, at least until GPUOwl or other GPU P1 software has been retrofitted with 30.8 functionality. A couple of thoughts:
a. Tidy up the TF for all the lower ranges, bringing all exponents to the same appropriate TF level. mikr has been systematically TF'ing all lower exponents to 71 bits.
b. Help those in point 1 above get ranges under 200 factors.
c. Mainstream leading-edge TF for PRP work.
Old 2022-03-14, 19:37   #62
nordi
 
Dec 2016

7·17 Posts

Quote:
Originally Posted by petrw1 View Post
2. Some have asked if we will only process unfactored exponents or if we should also use version 30.8 to further factor currently factored exponents. I have no issue with doing so though I'm not sure how to generate the lists of exponents already factored.
Such a list can be generated with the https://www.mersenne.ca/morefactors.php page.

I'm currently working on the 12.4M range for already factored exponents, because they will soon be PRP-checked by the folks doing PRP-C work. Every factor that I find now (instead of later) means one less PRP check is needed. If someone wants to take the 12.5M range, I'd be happy to share. ;-)
Old 2022-03-21, 09:52   #63
lisanderke
 
"Lisander Viaene"
Oct 2020
Belgium

109 Posts

I'd like to reserve the 2M (factored) range and take some of the exponents in that range that have never had P-1 done to B1=15M (and let 30.8 decide B2). Perhaps afterwards I'll take some of the exponents that only had B1=B2 done to a low B1 up to the same B1=15M.

For others like me, who want to do P-1 on already factored exponents, I suggest taking the following approach:

First, navigate to this site: https://www.mersenne.ca/morefactors.php. Pick a range (xM or x.xM) where there are a lot of exponents that have never had P-1 results reported. With relatively little effort (runs anywhere from 2 minutes to 1 hour), these can yield many more factors (a 5% to 15% reported probability of finding a factor where none has previously been found; my luck has been higher in the past few days).

Select a B1 that you'd be comfortable taking all of the factored exponents in that range to. Note that some smaller exponents might have faster runtimes if you finish stage 1 on a single core and subsequently run stage 2 on multiple cores (with the maximum amount of RAM allocated). This involves a lot of fiddling around with worktodo files, though, and I've already confused myself a couple of times trying to figure out which batch of exponents had stage 1 done and which hadn't!

After you've taken all of the exponents that previously had no P-1 done to (un)reasonable B1 and B2 bounds, and if you'd like to stick with that range, you should be able to sort by exponents that had P-1 done with B1=B2 (no stage 2). Many of those might have a very low B1 done (50k, the minimum in Prime95), so running high B1 and B2 on these exponents should net a lot of factors too!

TL;DR
- Pick a range (xM or x.xM, <10M?) to do P-1 work in on factored exponents
- Select a B1 for that range (use table above for reference, or decide based on your own preference)
- Take exponents with no previous P-1 to those bounds, and do stage 2 with 30.8
- Take exponents with B1=B2 to higher B1 and higher still B2 bounds
- ...
- Profit (hopefully lots more factors for co-factor PRP'ing those exponents)


Again, this is just my suggestion! I know most won't do work on already factored exponents, but at the rate that I've been finding factors recently it seemed worth a shot to pitch my strategy to others :D. Please comment and/or give advice!
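The triage in the TL;DR above could be sketched like this in Python, assuming a hypothetical CSV export with exponent, B1, B2 columns (the actual mersenne.ca export format may differ, and the sample exponents are made up):

```python
import csv
import io

# Hypothetical export: exponent, current B1, current B2 (0 = no P-1 reported).
data = io.StringIO("""exponent,B1,B2
2000039,0,0
2000081,50000,50000
2000129,1000000,30000000
""")

no_pm1, stage1_only = [], []
for row in csv.DictReader(data):
    b1, b2 = int(row["B1"]), int(row["B2"])
    if b1 == 0:         # never had P-1: best candidates, do first
        no_pm1.append(int(row["exponent"]))
    elif b1 == b2:      # stage 1 only (B1=B2): good second pass
        stage1_only.append(int(row["exponent"]))

print(no_pm1, stage1_only)  # [2000039] [2000081]
```

Exponents with a real stage 2 already done (the third row) are simply skipped, matching the suggested priority order.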

Last fiddled with by lisanderke on 2022-03-21 at 10:03
Old 2022-03-21, 14:05   #64
petrw1
1976 Toyota Corona years forever!
 
 
"Wayne"
Nov 2006
Saskatchewan, Canada

143D₁₆ Posts

@lisanderke
Looks good to me.
Thanks for the explanation and the help.
Old 2022-03-23, 01:27   #65
DrobinsonPE
 
Aug 2020

137₁₀ Posts

Just because I like the number 42, I would like to claim the 4.2M range. It looks like there are currently 1992 unfactored exponents. I will be using an i3-9100 with 16GB of RAM, so it will take me a while to complete. According to the table, the stage 1 B1-Neat is 6,200,000, and mprime 30.8 can choose the stage 2 bounds. For now I will skip any exponents that already have a stage 1 B1 above 6,200,000.

I will start by running a test with the assignment below to make sure it works and see how long it takes and then start generating Pminus1 assignments for the rest of the range.

Pminus1=N/A,1,2,4200109,-1,6200000,0,72
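Generating the rest of the range could be sketched like this; the field layout simply mirrors the sample line above (k=1, b=2, c=-1, B2=0 so 30.8 picks stage 2, trial-factored to 72 bits), and the helper name and extra exponents are hypothetical:

```python
def pminus1_line(exponent: int, b1: int, tf_bits: int = 72) -> str:
    """Build a worktodo Pminus1 line in the format of the sample above.
    B2=0 lets mprime 30.8 choose stage 2 based on available RAM."""
    return f"Pminus1=N/A,1,2,{exponent},-1,{b1},0,{tf_bits}"

# Hypothetical candidate exponents in the 4.2M range:
for p in (4200109, 4200121, 4200253):
    print(pminus1_line(p, 6_200_000))
# First line printed: Pminus1=N/A,1,2,4200109,-1,6200000,0,72
```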

Let me know if someone else is already working here or if there is a better place to start.
Old 2022-03-23, 03:41   #66
petrw1
1976 Toyota Corona years forever!
 
 
"Wayne"
Nov 2006
Saskatchewan, Canada

3·11·157 Posts

Quote:
Originally Posted by DrobinsonPE View Post
Just because I like the number 42, I would like to claim the 4.2M range. It looks like there are currently 1992 unfactored exponents. I will be using an i3-9100 with 16GB of RAM, so it will take me a while to complete. According to the table, the stage 1 B1-Neat is 6,200,000, and mprime 30.8 can choose the stage 2 bounds. For now I will skip any exponents that already have a stage 1 B1 above 6,200,000.
Thanks, enjoy, keep us posted.
I have no problem with your proposed strategy.

Just some food for thought for you or others...
When I initially started analyzing these low exponents for candidates, I too thought I should skip any where the current B1 is more than half of the proposed B1.

Then I realized the new B2 is SO much higher than the current B2 that even rerunning exponents with the same B1 may have a 3% - 4% success rate.

So an alternative strategy might be to look for exponents where the current B2 is less than some percentage of the new B2. You may need to run a test to determine your new B2, since it is very RAM-sensitive.

According to the prob.php function at mersenne.ca a 10x increase in B2 adds about 2% to the success rate.

Thanks
