[QUOTE=chalsall;484387]Keep in mind that when you're redoing P1 work you should expect a lower probability than what Prime95/mprime reports. As a rough guide, I subtract the previous run's probability from what is reported as expected.
[/QUOTE] Sure thing! No need to split hairs, since this is a filler job for me, but, using mersenne.ca's P-1 probability estimator, those candidates had former B1/B2 values corresponding to a probability around 1.5%, and I brought them to around 4%, hence my disappointment. As a more general point, I've been playing this "filler game" for quite some time now, and I've likewise averaged about a 3-4% success rate. For sure, when TF levels were lower, 'twas much more rewarding :) 
[QUOTE=ric;484392]No need to split hairs, since this is a filler job for me, but, using mersenne.ca's P-1 probability estimator, those candidates had former B1/B2 values corresponding to a probability around 1.5%, and I brought them to around 4%, hence my disappointment.[/QUOTE]
Ah... I now better understand your statement. Statistics has no memory. Run an infinite number of tests and you should see about a 2.5% success rate... :smile: 
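The "statistics has no memory" point above can be sketched as a one-line calculation. This is only a rough model, assuming the two reported probabilities are each the standalone chance of finding a factor below that run's bounds, so the redo's extra chance is approximately their difference:

```python
# Rough incremental-probability estimate for redoing P-1, following the
# rule of thumb above: subtract the prior run's reported probability
# from the new run's reported probability.

def incremental_p1_probability(new_prob: float, prior_prob: float) -> float:
    """Approximate extra chance of finding a factor when re-running P-1
    with larger bounds, given each run's standalone probability."""
    return max(new_prob - prior_prob, 0.0)

# Prior B1/B2 gave ~1.5%; the redo's bounds give ~4% (figures from the thread).
extra = incremental_p1_probability(0.04, 0.015)
print(f"{extra:.3%}")  # about 2.5%, matching the expected long-run success rate
```

Over a large number of such redos, the observed success rate should converge on that ~2.5% difference, not the 4% the client reports.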
[QUOTE=petrw1;483122]As you may have noticed I am on a P1 binge recently.
I have full-time P1 running on: 2 x 2-core PCs and 6 x 4-core PCs. Some have as much as 16GB of RAM; a couple only 4GB. I have noticed over the past years of doing P1 that more RAM makes a big difference in Stage 2. Simple observations have shown that running 480 relative primes in batches of 10 takes noticeably longer than the same run in batches of 120, for example. (I wouldn't be surprised if the following has been noted before and I missed it...) So that got me thinking that, especially for the PCs with 4GB or 8GB of RAM, I should complete more total tests per week by running 2 workers of 2 cores each rather than 4 workers with 1 core each. Stage 1 may be slightly slower, but Stage 2 should be enough faster that it more than makes up for it; faster because with only 2 workers fighting for RAM, each gets a lot more and can process more relative primes. Opinions?[/QUOTE]

So after a month of trying P1 with 2 workers of 2 cores each on a PC with 4.5GB available to Prime95, I measured that actual throughput dropped by about 5%... I'm surprised, but facts don't lie. But then I got to thinking: when I have 2 workers instead of 4, not only does each get twice the RAM per Stage 2, but (and I know almost nothing about this) I believe that at some point the extra Stage 2 RAM lets Brent-Suyama kick in, and an E=3, E=6, or E=12 extension increases the odds of finding a factor. Am I anywhere close on this, and can someone tell me how much of an increase E=3, E=6, or E=12 gives me? Maybe enough that I can swallow the 5% throughput loss. Thanks 
[QUOTE=chalsall;484387]Keep in mind that when you're redoing P1 work you should expect a lower probability than what Prime95/mprime reports. As a rough guide, I subtract the previous run's probability from what is reported as expected.
When I'm redoing poorly P1'ed work (read: no Stage 2), I tell mprime that the test will save four LL tests, and give it between 10GB and 12GB of RAM to use. Doesn't make sense, I know, but it's my kit and electrons. On average, in the 4xM range, I get about a 3% success rate.[/QUOTE] I have 28 cores doing full-time P1; all are redos in the 5xM range where the prior P1 was Stage 1 only (B1=B2). Subtracting the prior run's expected factor probability from my run's comes out close to 3%. After more than 5,100 such tests, my overall success rate is currently 2.98%. At the extremes, I once saw consecutive factors, and six times got 2 factors out of 3 attempts (on the low end), AND went 169 and 183 attempts between factors (on the high end). My per-PC success rates range from 2.12% to 3.39% among PCs that have done more than 600 tests. 
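The streaks reported above are consistent with plain chance. Treating each attempt as an independent trial with the measured success probability (a simplifying assumption), both the long droughts and the 2-of-3 clusters fall out of the geometric/binomial arithmetic:

```python
# Sanity-check the reported extremes, assuming independent attempts with
# success probability p = 0.0298 (the measured 2.98% rate above).
p = 0.0298

# Chance that a given stretch of 169 consecutive attempts is factorless:
gap_169 = (1 - p) ** 169

# Chance of exactly 2 successes in one specific window of 3 attempts:
two_of_three = 3 * p**2 * (1 - p)

print(f"P(169 factorless attempts in a row) = {gap_169:.4f}")   # roughly 0.006
print(f"P(2 factors in a given 3-attempt window) = {two_of_three:.5f}")
```

A ~0.6% chance per stretch sounds rare, but across 5,100+ tests (roughly 150 factors, hence ~150 inter-factor gaps) seeing one or two such droughts is about what you'd expect; likewise thousands of overlapping 3-attempt windows make a handful of 2-of-3 clusters unsurprising.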
I am now "reserving" P1 work in the following ranges:
48.6, 48.7, 46.7, 48.3, 47.2 
I'm redoing P1 on exponents in the [20m, 20.05m] range where B2 < 2,500,000.

Continuing P1 where B1=B2 in all 4xM ranges that have more than 2000 unfactored exponents.

I have another PC running P1 on exponents in the [57m, 57.1m] range.

I'm running P1 with larger bounds on exponents in the 2.6M and 2.8M ranges.

I'm currently redoing P1 on exponents from 14.4m to 14.5m with B2 < 2,000,000.

[QUOTE=masser;526678]I'm running P1 with larger bounds on exponents in the 2.6M and 2.8M ranges.[/QUOTE]
I will pause on the 2.8M range and resume on the 2.6M range. I'm losing a few credits here and there because a few people (probably not on the forum) are also taking exponents in these ranges for P1 with large bounds. After I complete my first pass over 2.6M, I plan to do a second pass over 2.6M and 2.8M. Anyone have a suggestion for how much to increase the bounds for a second pass of P1 factoring? I was thinking that I would use the probability calculator at mersenne.ca; currently I'm putting in about 0.5 GHz-days/exponent. For the second pass, I thought I would increase that to 0.75 GHz-days/exponent and use bounds that approximately maximize the probability of finding a factor. 
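The "pick bounds that maximize probability under a GHz-day budget" idea above can be sketched as a small grid search. The `prob()` and `cost_ghzdays()` functions here are crude stand-ins with roughly the right shape (diminishing returns in both bounds, Stage 2 cheaper per unit of bound); the real curves come from Prime95's internal estimator or mersenne.ca's probability calculator, and the constants below are invented for illustration:

```python
import math

# Hypothetical search for B1 and B2 that maximize the stand-in
# probability function subject to a 0.75 GHz-day/exponent budget.

def prob(b1: float, b2: float) -> float:
    # Stand-in: probability grows with the log of the bounds, Stage 2
    # contributing less per decade than Stage 1.  Constants are made up.
    return 0.01 * math.log10(b1) + 0.004 * math.log10(max(b2 / b1, 1))

def cost_ghzdays(b1: float, b2: float, exponent: int = 2_600_000) -> float:
    # Stand-in: Stage 1 cost scales with B1, Stage 2 with (B2 - B1),
    # Stage 2 being cheaper per unit of bound.  Constants are made up.
    return (1.44 * b1 + 0.06 * (b2 - b1)) * exponent / 1e13

def best_bounds(budget: float = 0.75):
    """Grid-search B1 and the B2/B1 ratio; return (prob, B1, B2) or None."""
    candidates = ((prob(b1, b1 * r), b1, b1 * r)
                  for b1 in range(500_000, 5_000_001, 250_000)
                  for r in (10, 20, 30, 40)
                  if cost_ghzdays(b1, b1 * r) <= budget)
    return max(candidates, default=None)

print(best_bounds())
```

Swapping in the real probability and cost models (per exponent, FFT size, and available RAM) would make this a usable bound picker; the grid-search skeleton stays the same.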