mersenneforum.org  

2005-08-20, 22:16   #1
Unregistered

Quick p-1 question

I have started p-1 factoring a large exponent and have a quick question. Does changing the memory allocation midway through stage 2 of the test harm it badly?
I did the whole of stage 1 (52 hours) with a large amount of memory allocated (416MB) without a problem (no factor found), but found in stage 2 that the machine was a little overloaded (no pagefile swapping, but sluggish at times), so I reduced it to 384MB during the first 0.5% of stage 2.
The amounts of memory are near (probably above) the 'desirable' values, scaled in proportion to those in the documentation, but I worry that the test is totally ruined by the stage 2 limits being different from stage 1, thereby causing a problem with the precomputed values that come from the stage 1 residues.
I don't think there's anything to worry about, as the memory allocation is high and the software will probably just take care of the different limits, but I thought I'd check.
I found a couple of helpful posts describing the p-1 test, so I've linked them here in case they're of use.

http://www.mersenneforum.org/showthread.php?t=1084
http://www.mersenneforum.org/showthread.php?t=4168
2005-08-21, 08:53   #2
cheesehead

"Richard B. Woods"
Aug 2002
Wisconsin USA

It's okay! You can relax.

Quote:
Originally Posted by Unregistered
Does changing the memory allocation midway through stage 2 of the test harm it badly?
It only causes the P-1 to restart stage 2 with the new allocation. It has no effect on stage 1, except maybe if you changed allocation during the short GCD phase of stage 1 (but that's only the last couple of minutes of stage 1).

Quote:
I did the whole of stage 1 (52 hours) with a large amount of memory allocated (416MB) without a problem (no factor found),
... (but stage 1 never used that 416MB allocation except maybe during the GCD for a few minutes)

Last fiddled with by cheesehead on 2005-08-21 at 09:05
2005-08-21, 09:38   #3
cheesehead

"Richard B. Woods"
Aug 2002
Wisconsin USA

(continued)

Quote:
Originally Posted by Unregistered
but found in stage 2 that the machine was a little overloaded (no pagefile swapping but sluggish at times) so reduced it to 384MB during the first 0.5% of the test.
... so all you lost was 0.5% of stage 2, but none of stage 1. Your P-1 restarted stage 2 when you changed the allocation, but it didn't have to re-do stage 1 because stage 1 doesn't use that allocation.

Quote:
but I worry that the test is totally ruined by the stage 2 limits being different to stage 1, thereby causing a problem with the precomputed values that come from the stage 1 residues.
Relax.

Ease your mind.

The "precomputed values that come from the stage 1 residues" are (pre)computed in stage 2, not during stage 1. The stage 1 residues are computed during stage 1, but "the precomputed values that come from" those residues are (pre)computed only after stage 2 starts. The "pre" refers to their being computed at the beginning of stage 2 prior to their frequent use during the rest of stage 2, not to their having been computed during stage 1 (which they aren't).

Nothing got ruined.

Let go of your worries.

Ommmmm ...

Buried within that second thread, in ewmayer's first paragraph, is "Note that stage 1 of p-1 needs little more memory than an LL test ... ." But I don't blame you for missing that among all the other verbiage (and some posters other than ewmayer made some inaccurate remarks in that thread -- lots of folks get confused about the different parts of P-1 processing, which is more complicated than the L-L test).
2005-08-21, 18:05   #4
Unregistered

Thanks for taking the time to answer. When I changed the memory allocation, (from what I remember) the percentage complete of stage 2 did not start again from 0, as I would expect if it had restarted, but just carried on from where it was. That is why I asked: if it didn't restart, then it WOULD still be using the original precomputed values (from the beginning of stage 2) and there may yet be a problem. There are a few possibilities:

The test restarted without saying so.
The test restarted and I didn't see it!
The test did not restart and there may be (but probably isn't) a problem!

Also, when I gave the memory figures, there was no misunderstanding on my part about when the memory is used; they were just there as a guide to the magnitude of the change (and therefore the limits of the test) for any prospective helper. Maybe I should have said 'ruined stage 2' previously.
This is all probably academic, as the software most likely handles the change, but I'm interested in case there is a problem. I cannot test at the moment whether stage 2 does indeed restart or not, so I will have to wait and see.
Thanks again
2005-08-21, 18:46   #5
akruppa

"Nancy"
Aug 2002
Alexandria

The residue at the end of stage 1 is completely unaffected by the memory settings for stage 2, and it is the only "precomputed" value stage 2 needs as input. Stage 2 then precomputes a few tables of data and starts processing all primes between B1 and B2. How efficiently it can do that depends on how large those tables of data are, i.e. how much memory you gave it. If you interrupt and then restart, it will compute its tables again (maybe differently than before) and resume where it was. Importantly, being able to finish the remaining range does not require having the same tables as the first time. If you give it less memory the second time around, it will do the remaining range a little slower, but it gets done just the same.
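To make that concrete, here is a toy sketch of the two stages in Python -- nothing like Prime95's real code (no special arithmetic mod 2^p-1, no prime pairing, no precomputed tables), and the function names are just for this sketch. The point is only that stage 2's sole input is the stage 1 residue and that its prime-by-prime loop can be resumed wherever it stopped:

Code:
from math import gcd, isqrt

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, isqrt(n) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(range(i * i, n + 1, i)))
    return [i for i, is_p in enumerate(sieve) if is_p]

def stage1(N, B1, base=3):
    """Raise `base` to every prime power <= B1 (mod N); the residue returned
    here is the only thing stage 2 needs as input."""
    R = base
    for q in primes_up_to(B1):
        qe = q
        while qe * q <= B1:          # largest power of q not exceeding B1
            qe *= q
        R = pow(R, qe, N)
    return R

def stage2(N, R, B1, B2):
    """Textbook 'standard continuation': for each prime B1 < q <= B2, fold
    (R^q - 1) into a running product, then take one GCD.  Restarting with a
    different memory setting corresponds, roughly, to rebuilding helper
    tables and then carrying on this loop from where it left off."""
    acc = 1
    for q in primes_up_to(B2):
        if q > B1:
            acc = acc * (pow(R, q, N) - 1) % N
    return gcd(acc, N)

if __name__ == "__main__":
    N = 2**67 - 1                    # = 193707721 * 761838257287
    R = stage1(N, B1=1000)
    f = gcd(R - 1, N)                # the stage 1 GCD
    if f == 1:
        f = stage2(N, R, B1=1000, B2=5000)
    print(f)                         # expect 193707721

With these small bounds the factor 193707721 of 2^67-1 turns up: 193707721 - 1 = 2^3 * 3^3 * 5 * 67 * 2677, and its one prime factor above B1, namely 2677, is below B2, so (barring a lucky stage 1 hit) it is stage 2 that catches it.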

Alex
2005-08-22, 00:09   #6
Unregistered

Thanks, I think this pretty much covers everything now, and I have fingers crossed for finding a factor!!!
The curious thing for me about p-1 is that the time to do stage 2 is about the same whether or not it has a lot of memory allocated to it - only the chance of finding a factor changes with memory! This seems a little weird, especially on large jobs, but I suppose it is related to the maths of the method.
2005-08-22, 14:53   #7
cheesehead

"Richard B. Woods"
Aug 2002
Wisconsin USA

Quote:
Originally Posted by Unregistered
The curious thing for me about p-1 is that the time to do stage 2 is constant whether or not it has a lot of memory allocated to it - only the percentage of finding a factor changes with memory! This seems a little weird, and especially so on large jobs, but I suppose this is related to the Maths of the method.
Well, I did write "more complicated" above.

When you're doing P-1 as an automatic part of an L-L test assignment or by use of the Pfactor= command in worktodo.ini, the program selects B1 and B2 bounds by a method designed to optimize GIMPS throughput. The more memory you have allocated, the higher the B2 limit it chooses (and the higher its chance of finding a factor). But since the extra memory is used to make stage 2 run faster, the total time needed to reach the higher limit may not be noticeably longer than the total time needed when less memory is allocated and the B2 limit is lower.
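Roughly speaking (this is my paraphrase of the idea, not the program's literal formula, and the symbols are just labels), the bound selection is weighing something like

Code:
\max_{B_1,\,B_2}\ \Bigl[\, P_{\text{factor}}(B_1, B_2) \cdot (\text{tests saved}) \cdot t_{\text{LL}} \;-\; t_{\text{P-1}}(B_1, B_2, \text{memory}) \,\Bigr].

More allocated memory lowers the stage 2 cost for a given B2, so the optimum shifts toward larger bounds: the running time you observe stays about the same, while the probability term goes up.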

( Since I feel "picky", I'll add that this is due to the maths of GIMPS's implementation of the P-1 method, rather than the maths of the basic P-1 method itself. )

Last fiddled with by cheesehead on 2005-08-22 at 15:01
2005-09-22, 23:52   #8
wackyeh

Feb 2003

George even states in Prime95's README:

Quote:
SETTING AVAILABLE MEMORY
------------------------

The P-1 factoring step prior to running a Lucas-Lehmer test is more
effective if it is given more memory to work with. However, if you let
the program use too much memory then the performance of ALL programs will
suffer.
Quote:
So how do you intelligently choose the available memory settings? Below
are some steps you might take to figure this out:

1) Be conservative. It is better to set the available memory too low
than too high. Setting the value too high can cause thrashing which
slows down all programs. Remember, the program will only use the
extra memory in stage 2 of P-1 factoring.
From my own experience, using much more than about 5.5x the exponent as the 'desirable' memory (say 220MB for an exponent around 40M) can significantly increase the amount of time to do the test. For example, 3000 iterations on a 42M exponent went from 500+ secs down to around 450 secs simply by reducing the memory used from 320MB to 256MB (42 x 5.5 = 231MB).
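Just to spell the rule of thumb out (this is my reading of the figures above, not an official Prime95 number, and the function name is made up for the example):

Code:
def rough_desirable_mb(exponent):
    """Rule of thumb from this thread: about 5.5 MB of stage 2 memory
    per million of the exponent being tested."""
    return 5.5 * exponent / 1e6

print(rough_desirable_mb(40_000_000))   # ~220 MB
print(rough_desirable_mb(42_000_000))   # ~231 MB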

The change in memory used won't affect the current B1 and B2 limits once the test has started (even if it is still in stage 1), except maybe to prevent stage 2 from running. I think (but can't be sure) the limits are saved as part of the save file. Using less than the desirable memory can significantly change the limits that are chosen (and even prevent stage 2 if it's too low), but using more -- even a LOT more -- doesn't seem to have much of an impact. As I recall, even doubling the desirable memory caused only a minor change in limits (something like B1=600,000 to B1=598,000 and B2=16,000,000 to B2=16,200,000).

As for changing the amount of memory used in the middle of a test, the only thing that will happen is that the test is stopped and restarted (at the same point it stopped), using the new memory allocation. It won't change the limits, just the amount of memory used to compute tables, perform calculations, etc. All the extra memory allocated, though, is only used in stage 2. Stage 1 of P-1 doesn't use much more than the L-L test would. Even right now, according to Windows, the L-L test of a 42M exponent currently executing is using about 32MB (which includes Prime95 itself plus the memory for the L-L test) -- less than 40% of the minimum memory required to do a stage 2 P-1 on it.
2006-10-13, 23:35   #9
cheesehead

"Richard B. Woods"
Aug 2002
Wisconsin USA

wackyeh,

I didn't see your posting until today, or I'd have responded sooner.

Some of what you write may be correct as far as it goes, but your failure to mention an important consideration, the probability of finding a factor in P-1, invalidates your conclusions.

Changing the B1/B2 limits or Available Memory changes not only the running time, but also the chances of finding a factor. Higher limits = higher chance of finding a factor.

Prime95's algorithm for choosing B1/B2 takes several factors into account, including the probability of finding a factor, and its choice maximizes the overall benefit to GIMPS throughput! When it chooses higher limits that take a longer running time after you increase Available Memory, it has calculated that the longer running time is justified by the higher chance of finding a factor. So cutting down your Available Memory does GIMPS no favor unless you do it to avoid swapping, rather than just to minimize P-1 running time.

Also, I've posted on this subject at http://www.mersenneforum.org/showpos...03&postcount=2