mersenneforum.org  

Old 2012-01-26, 16:11   #34
kladner
 
 
"Kieren"
Jul 2011
In My Own Galaxy!

27AE₁₆ Posts

Quote:
Originally Posted by kladner
Thanks. Yes, indeed I am already seeing "Looking for work that uses less memory." At the current rate of backlog it seems that catch-up sessions will have to be allowed every few days to a week.
While it falls behind with 2 of 4 workers allowed high memory, it seemed to keep up with 3 of 4 allowed.
Old 2012-01-26, 16:21   #35
James Heinrich
 
 
"James Heinrich"
May 2004
ex-Northern Ontario

1000010010100₂ Posts

Quote:
Originally Posted by kladner
While it falls behind with 2 of 4 workers allowed high memory, it seemed to keep up with 3 of 4 allowed.
Yes. On one system I have 3 of 4, on the other I have 2 of 3, and both setups allow stage 2 to keep up. Of course, if I have some extended sessions with Photoshop or something else that is on the LowMemWhileRunning list, it takes a little longer to catch up on the backlog.
Old 2012-01-26, 18:52   #36
Dubslow
Basketry That Evening!
 
 
"Bunslow the Bold"
Jun 2011
40<A<43 -89<O<-88

3·29·83 Posts

I think the necessary fraction of high-memory workers is equal to S2 time / (S1 + S2) time, but of course that is trivially obvious. So, James, you might know better, but somewhere around 60% is the key number (noting that 2/3 ≈ 67% and 3/4 = 75%).
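Back-of-the-envelope, as a minimal sketch (the stage timings below are hypothetical, purely to illustrate the arithmetic):

Code:
# Sketch: fraction of workers that must be allowed high memory for
# stage 2 to keep pace with stage 1.  Stage timings are assumptions.
s1_hours = 4.0  # hypothetical stage 1 time per assignment
s2_hours = 6.0  # hypothetical stage 2 time per assignment
fraction = s2_hours / (s1_hours + s2_hours)
print(f"required high-mem fraction: {fraction:.0%}")  # 60%
# With 4 workers: 2/4 = 50% falls behind, 3/4 = 75% keeps up.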
Old 2012-01-26, 21:29   #37
kladner
 
 
"Kieren"
Jul 2011
In My Own Galaxy!

23656₈ Posts

Quote:
Originally Posted by Dubslow
How do you get 900 relative primes then? What's your B2/B1 ratio?
I am running about a 20.25 ratio in the 45-55M range. Jerry posted an example for a 50M exponent (58967MB) which had a 22.75 ratio. James Heinrich posted an example of a 52M assignment (9497MB) with a 20.75 ratio.

On the other hand, Jerry's 400M run (20779MB) is showing a 32.75 ratio.

I can't really draw much of a conclusion from this, except that the ratio goes up substantially with huge exponents and tons of RAM. (Well, duh!)

I just went ahead and upped my RAM to 16GB. I'm looking forward to seeing how that affects things. Early results do show that 192 relative primes are being done in a single pass. I now have 8GB for day and 12GB for night. I'll probably tweak that when I see how other high-memory apps (Photoshop) perform.
Old 2012-01-26, 21:58   #38
firejuggler
 
 
"Vincent"
Apr 2010
Over the rainbow

5·11·53 Posts

Code:
[Jan 26 18:09] Setting affinity to run worker on any logical CPU.
[Jan 26 18:09] Optimal P-1 factoring of M55039799 using up to 5700MB of memory.
[Jan 26 18:09] Assuming no factors below 2^71 and 2 primality tests saved if a factor is found.
[Jan 26 18:09] Optimal bounds are B1=580000, B2=13340000
[Jan 26 18:09] Chance of finding a factor is an estimated 4.74%
[Jan 26 18:09] Setting affinity to run helper thread 1 on any logical CPU.
[Jan 26 18:09] Using Core2 type-2 FFT length 2880K, Pass1=640, Pass2=4608, 3 threads
[Jan 26 18:09] Setting affinity to run helper thread 2 on any logical CPU.
[Jan 26 18:09] Using 5697MB of memory.  Processing 240 relative primes (147 of 432 already processed).
Old 2012-01-26, 22:02   #39
kladner
 
 
"Kieren"
Jul 2011
In My Own Galaxy!

2×3×1,693 Posts

Thanks for the info, firejuggler!

On the first new assignment started with the new allocation (another "HighMemWorkers exceeded" case), the bounds ratio went up to 23, on an exponent just under 52M. The bounds are B1=545000 and B2=12535000.
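For what it's worth, the ratio checks out against those bounds:

Code:
# Quick check of the B2/B1 ratio from the bounds quoted above.
B1 = 545000
B2 = 12535000
print(B2 / B1)  # 23.0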

Last fiddled with by kladner on 2012-01-26 at 22:16
Old 2012-01-26, 22:21   #40
James Heinrich
 
 
"James Heinrich"
May 2004
ex-Northern Ontario

2²×1,061 Posts

Quote:
Originally Posted by kladner
I can't really draw much of a conclusion from this, except that the ratio goes up substantially with huge exponents and tons of RAM. (Well, duh!)
More available RAM will tend towards slightly higher ratios.
Not taking into account the efficiencies of more RAM, the most efficient (probability per effort) B2/B1 ratios tend to be around 20x-22x, as noted. However, if you have a specific B1 and/or B2 that you want to target for whatever reason, the optimal ratio may vary.

Quote:
Originally Posted by kladner View Post
I now have 8GB for day, and 12GB for night. I'll probably tweak that when I see how other high memory apps (Photoshop) perform.
I submit it would be better to keep a more constant RAM amount between day and night, and keep both Prime95 and Photoshop efficient with LowMemWhileRunning=photoshop in prime.txt.
If you have day=8/night=12, the bounds will be calculated assuming you have 12GB available, whereas in fact you may be running stage2 a large part of the time with 8/3=2.67GB per worker. If Prime95 knew that (it maybe should, but doesn't) it would've picked smaller, faster bounds to maintain the balance of efficiency. It would probably be best to put Memory=3500 under each worker section of local.txt -- this way Prime95 knows that it will have 3.5GB available for stage2, not think it "might" have 12GB, and it will calculate bounds accordingly. Remember that Prime95's bounds calculation is still based on the assumption that P-1 is a rare thing that happens before L-L, not a regular worktype.
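As a sketch, something like this (a three-worker example; the [Worker #n] section names follow the usual local.txt layout, so adjust to match your own file):

Code:
prime.txt:
    LowMemWhileRunning=photoshop

local.txt:
    [Worker #1]
    Memory=3500
    [Worker #2]
    Memory=3500
    [Worker #3]
    Memory=3500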
Old 2012-01-26, 22:40   #41
kladner
 
 
"Kieren"
Jul 2011
In My Own Galaxy!

10158₁₀ Posts

Quote:
Originally Posted by James Heinrich

I submit it would be better to keep a more constant RAM amount between day and night, and keep both Prime95 and Photoshop efficient with LowMemWhileRunning=photoshop in prime.txt.
This recommendation is one of the things I was looking for! The explanation is also greatly appreciated. I will put this into play at once.


Thanks VERY much James!

EDIT: A question: with the 3500 per worker, would I make the overall allocation equal to that amount times the number of workers allowed? Or does the LowMemWhileRunning=photoshop take care of that, so that I should set the overall to 14,000?

Thanks again for providing the fruits of your experience, and for your patience.

Last fiddled with by kladner on 2012-01-26 at 22:52
Old 2012-01-26, 23:13   #42
James Heinrich
 
 
"James Heinrich"
May 2004
ex-Northern Ontario

2²×1,061 Posts

Quote:
Originally Posted by kladner
with the 3500 per worker, would I make the overall allocation equal to that amount times the number of workers allowed?
Shouldn't really matter, but set it to 10500 (3*3500).
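In local.txt terms, that would be a global line matching the sum of the per-worker ones (again a sketch, assuming three workers):

Code:
Memory=10500
[Worker #1]
Memory=3500
[Worker #2]
Memory=3500
[Worker #3]
Memory=3500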
Old 2012-01-26, 23:27   #43
kladner
 
 
"Kieren"
Jul 2011
In My Own Galaxy!

2·3·1,693 Posts

Thanks!
Old 2012-01-27, 20:43   #44
kladner
 
 
"Kieren"
Jul 2011
In My Own Galaxy!

27AE₁₆ Posts

Quote:
with the 3500 per worker, would I make the overall allocation equal to that amount times the number of workers allowed?
Quote:
Originally Posted by James Heinrich
Shouldn't really matter, but set it to 10500 (3*3500).
It does seem that individual worker allocations override the global setting. I have 12000MB set globally in P95, but 5000MB set for each worker. When I allowed 3 HighMemWorkers, each took nearly 5000. I've got it set back to 2 workers now.

I also just saw the first E=12 come up on a 52M assignment with these settings.

Last fiddled with by kladner on 2012-01-27 at 20:44