2011-04-12, 22:16  #56 
"Mark"
Apr 2003
Between here and the
3·31·71 Posts 
I have some questions and I'm certain that Gary or one of the other lurkers here can answer.
First, given the weight of one of these single-k conjectures and a range of n, is there a formula that could tell one how deep that range needs to be sieved? I'm not looking for an exact formula, but I suspect that generating an estimate shouldn't be too hard. Second, given the weight of one of these single-k conjectures and a range of n, how does one estimate how many n will be left after sieving? By combining the two answers I can then estimate (given my hardware) how long it would take me to sieve and PRP test a range of arbitrary size (presuming that no prime is found). 
2011-04-12, 22:30  #57 
Account Deleted
"Tim Sorbera"
Aug 2006
San Antonio, TX USA
10B5_{16} Posts 
Some more exact answers would be great, but here's what I've got:
I can't tell you how it is calculated, but if you run sr2sieve, it will tell you how many factors to expect in the range you tell it to search. Subtract some portion (I'm sure an exact formula exists and isn't too complicated; I don't have time at the moment to work it out) to account for multiple factors hitting the same candidate, then subtract that from the total, and you've got your estimated candidates remaining. This requires some amount of pre-sieving, not just knowing the weight. 
2011-04-12, 22:43  #58  
"Mark"
Apr 2003
Between here and the
6603_{10} Posts 
Quote:


2011-04-13, 00:30  #59 
May 2007
Kansas; USA
2·41·131 Posts 
Those are very tough questions and there is no exact answer. Personally I am very adamant about not sieving a range that would take more than 1-2 years to fully test, because sieving and testing software speeds continue to increase so quickly. For that reason, at these levels for a "personal project", I would generally sieve a 2X or 2.5X range, that is, where the high n-value is 2 to 2.5 times the low n-value. When doing that, you can test a candidate at ~60% of the n-range and then sieve until the factor removal rate is approximately the length of that test.
If you would like to sieve a range that gives about a 50% chance of prime, that can't be calculated very accurately by weight alone. I'd suggest sieving, using srsieve, perhaps a range of n=100K-500K to P=100M or 1G. Then plug everything into the "odds of prime" spreadsheet, which includes the # of candidates remaining. If the chance of prime is < 50%, then sieve a larger range. If > 50%, then sieve a smaller range. As for how deep to sieve it, that depends on nmax/nmin. If > 2.5X, then pieces need to be broken off and tested with sieving continuing on the remainder. If <= 2.5X, then refer to the first paragraph. Gary 
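Gary's stopping rule (sieve until the time per factor removed matches the time of one PRP test) can be turned into a rough depth estimate. If the removal rate falls roughly in proportion to 1/P as the depth P grows, then seconds-per-factor grows linearly in P, and a short trial sieve calibrates the constant. A minimal sketch; the function and parameter names are my own, and the linear model is a heuristic:

```python
def optimal_sieve_depth(trial_depth: float,
                        trial_secs_per_factor: float,
                        secs_per_prp_test: float) -> float:
    """Estimate the depth at which sieving stops paying for itself.

    Model: seconds-per-factor grows roughly linearly with sieve depth P
    (doubling the depth roughly halves the removal rate).  Calibrate the
    linear constant c from a short trial sieve, then solve
        c * P == secs_per_prp_test
    for P.
    """
    c = trial_secs_per_factor / trial_depth
    return secs_per_prp_test / c

if __name__ == "__main__":
    # Trial sieve at P=1e9 removes one factor every 100 s; a PRP test
    # (at ~60% of the n-range, per Gary's rule) takes 400 s:
    print(f"sieve to about P={optimal_sieve_depth(1e9, 100.0, 400.0):.2e}")
```

In practice the test time itself grows with n, which is why Gary suggests timing a candidate partway up the range rather than at the bottom.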
2011-04-13, 00:55  #60  
"Mark"
Apr 2003
Between here and the
14713_{8} Posts 
Quote:


2011-04-13, 01:59  #61 
"Mark"
Apr 2003
Between here and the
1100111001011_{2} Posts 
I have one more question, which I presume has an easy answer. I know that doubling the sieve depth reduces the number of candidates by about 30%. My related question is how does doubling the sieve depth impact removal rate? It is clearly not reduced by 30%. I would guess that it would be around 70%, i.e. if the removal rate is around 1 per second at 1e9, then it would be around 1.7 per second at 2e9. Does that sound right?

2011-04-13, 05:53  #62  
May 2007
Kansas; USA
2×41×131 Posts 
Quote:
On the second statement, you have it reversed, but I think I know what you mean. You're saying that if at a sieve depth of 1e9 the removal rate is 1 every 1 second, then at 2e9 the removal rate would be 1 every 1.7 secs (vs. 1.7 per 1 sec).

Anyway, no, it's almost a straight ratio. When you double the sieve depth, you halve the removal rate. When you triple the depth, you divide the removal rate by 3. Why? Because each candidate has half (or 1/3rd) as much chance of having a specific prime factor that is being tested. So at 2e9, it would be 1 every 2 secs.

Note that it would be an EXACT ratio of 2 to 1 or 3 to 1 if you left all candidates in the file to be sieved even after a factor was found for them. But since the programs do not continue to find factors for candidates that already have a factor (if not stopped midstream, sr2sieve knows to remove them in memory and not search them anymore), such ratios are more like 2.1 or 2.2 to 1, or 3.1 or 3.2 to 1.

I have one more thing to add to a statement that I previously made: Quote:
Gary

Last fiddled with by gd_barnes on 2011-04-13 at 06:22 

2011-04-13, 14:43  #63  
Jun 2009
1270_{8} Posts 
Quote:


2011-04-13, 19:10  #64 
May 2007
Kansas; USA
10100111110110_{2} Posts 
Yes, because in addition to removing factors from n=100K-250K, you are also removing factors from n=250K-500K at the same time, which saves much time in the future and is quite efficient, since the speed of sr(x)sieve scales approximately with the square root of the n-range. It's (almost) the fastest way to sieve and test the entire n=100K-500K range, regardless of whether a prime is found early or not.

The hard part is determining the optimum range to sieve in the first place, given that we stop testing when a prime is found. If you sieve too large a range, there is a great chance of finding a prime before you are done, resulting in a lot of wasted sieving. If you sieve too small a range, you lose a lot of sieving efficiency because of the speed scaling mentioned above. I generally recommend sieving an nmax/nmin ratio of 2X to 4X. You could go higher if the k or k's are extremely low weight, but I'd suggest almost never going lower.
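The square-root scaling is why sieving one big range beats sieving the same n-values in separate pieces. A toy comparison under that model (the cost constant is arbitrary; this is just the scaling argument, not a measurement of sr(x)sieve):

```python
import math

def sieve_cost(n_range_size: float, unit_cost: float = 1.0) -> float:
    """Model: time to sieve a range of n-values to a fixed depth
    scales with the square root of the size of the range."""
    return unit_cost * math.sqrt(n_range_size)

if __name__ == "__main__":
    whole = sieve_cost(400_000)          # n=100K-500K in one pass
    split = sieve_cost(200_000) * 2      # same n-values as two halves
    print(f"one pass:   {whole:.0f}")
    print(f"two passes: {split:.0f}")    # sqrt(2) ~ 41% more work
```

The flip side, as Gary notes, is that sieving work on n-values above an early prime is wasted, which is what caps the sensible nmax/nmin ratio.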
Last fiddled with by gd_barnes on 2011-04-13 at 19:14 
2012-01-10, 17:21  #65 
"Mark"
Apr 2003
Between here and the
3×31×71 Posts 
Should the weights listed in the first post account for the removal of algebraic factorizations? I know that they don't currently do so. For example, S182 has a weight of 389, but if one removes the n where 8*182^n+1 can be factored algebraically, that is reduced to 120. My point is that this conjecture is much harder to prove since so many n are removed for that reason.
These are the updated numbers (weight, then weight after removing algebraically factorable n): Code:
8*86^n+1     1017   571
32*87^n+1     342   257
8*182^n+1     389   120
27*220^n+1   1136     0
27*252^n+1   2164  1312
8*263^n+1     363   211
27*328^n+1    870   489
8*353^n+1     613     0
8*426^n+1    1288   486
8*428^n+1     655   258
8*497^n+1     943   570
9*724^n+1    1573  1573
8*758^n+1     549   281
8*785^n+1     588   178
8*828^n+1    1136   607
8*866^n+1     440     0
8*930^n+1    1645  1107
8*953^n+1    1155   757
8*983^n+1     853   425
4*72^n-1     1211   838
64*177^n-1   1607  1307
32*221^n-1   1317   988
4*275^n-1    1472  1472
8*321^n-1    1017   606
8*328^n-1     915   522
16*333^n-1   1737  1389
32*402^n-1   1126   842
64*425^n-1   1607  1607
9*636^n-1    2840  1758
4*650^n-1    1122  1122
8*665^n-1    1582   610
9*688^n-1    1252   641
32*702^n-1   2339  1780
64*741^n-1   2429  2429
8*761^n-1    1527  1142
4*812^n-1    1052  1052
8*815^n-1    1988   984
8*836^n-1    1446   768
221*850^n-1  1414  1414
8*866^n-1    1595   803
8*867^n-1     836   836
4*968^n-1     938   938
Could somebody take the time to verify my synopsis? If it is verified, then we have a problem: it is possible that the Sierpinski/Riesel number for other unproven conjectures is wrong. If I'm wrong, then I must have a bug in my release of srsieve. 
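The S182 example can be checked directly: since 8 = 2^3, whenever n = 3m the candidate is a sum of cubes, 8*182^(3m) + 1 = (2*182^m)^3 + 1, and a^3 + 1 = (a + 1)(a^2 - a + 1) gives an algebraic factor for every such n. A quick sketch verifying this one cube family (the drop from weight 389 to 120 is larger than the one-third of n this family alone accounts for, so other factorizations are evidently involved too):

```python
def algebraic_factor(b: int, n: int):
    """For 8*b^n+1 with 3 | n, return the algebraic factor 2*b^(n//3)+1.

    8*b^(3m)+1 = (2*b^m)^3 + 1 = (2*b^m + 1)((2*b^m)^2 - 2*b^m + 1),
    so 2*b^m + 1 always divides the candidate.  Returns None when n is
    not a multiple of 3 (no factor from this family).
    """
    if n % 3 != 0:
        return None
    return 2 * b ** (n // 3) + 1

if __name__ == "__main__":
    b = 182
    for n in (3, 6, 9):
        f = algebraic_factor(b, n)
        term = 8 * b ** n + 1
        print(n, f, term % f == 0)  # each algebraic factor divides exactly
```

Since the factor 2*182^m + 1 is strictly between 1 and the candidate for all m >= 1, every n divisible by 3 can be eliminated from S182 without any sieving or testing.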
2012-01-10, 17:32  #66 
"Serge"
Mar 2008
Phi(4,2^7658614+1)/2
2^{3}×1,229 Posts 
