Quote:
Originally Posted by KEP
In most cases it most likely will. However, for the SG conjecture there are only 32 k's remaining. So we would sieve a lot of unnecessary k's, and we would have to repeat the testing 100,000,000 times to cover all n. I have no means to test whether handling 100M n files and combining the remaining candidates into 1 sievefile, from which unneeded k's are removed, is actually faster than sieving all 32 k's using srsieve (tried srsieve2, but it crashed when switching to the generic sieve).

Send me the file that crashes srsieve2.
I'm trying to understand your goal. Are you trying to find the smallest n that yields an SG prime for each k? I was thinking of the search over at PrimeGrid, where they look for SG primes of a specific bit length (or a small range of lengths).
gfndsieve is the best sieve for b=2 and c=+1, as it sieves a range of n and a range of k at the same time. One could let it sieve to some prime depth to eliminate SG candidates because k*2^n+1 has a factor. One could then manipulate the output file with a script to convert k and n for the chain partner (2*(k*2^n+1)-1 = k*2^(n+1)+1), then rerun gfndsieve starting at p=3. When done, convert k and n back.
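The k/n conversion step could be scripted roughly like this. A minimal sketch, assuming the sieve output is one "k n" pair per line (the real gfndsieve file format may differ and carry a header line, which the sketch passes through untouched). Since 2*(k*2^n+1) - 1 = k*2^(n+1) + 1, checking the chain partner just means bumping n by 1, and converting back means subtracting 1:

```python
def shift_candidates(lines, delta=1):
    """Map each candidate k*2^n+1 to its chain partner.

    2*(k*2^n+1) - 1 = k*2^(n+1) + 1, so the partner of (k, n)
    is (k, n+1); use delta=+1 before the rerun, delta=-1 after.
    Non-numeric lines (e.g. a format header) are kept as-is.
    """
    out = []
    for line in lines:
        line = line.strip()
        if not line or not line[0].isdigit():
            out.append(line)  # pass headers/blank lines through
            continue
        k, n = map(int, line.split())
        out.append(f"{k} {n + delta}")
    return out

# Example with two hypothetical surviving candidates:
survivors = ["3 100", "5 102"]
shifted = shift_candidates(survivors)        # feed this to the rerun
restored = shift_candidates(shifted, -1)     # convert back when done
print(shifted, restored)
```

Only candidates surviving both sieve passes are then worth testing, since both elements of the pair must be free of small factors.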