Quote:
Originally Posted by clowns789
Interestingly, according to the following link, factoring to 91 bits would require over 150K GHz-days of computation, while an LL test requires only 91K:
http://www.mersenne.ca/exponent/3321930371
Perhaps I'm misreading it, but that seems to imply that 91 bits would be too high, or that one of the estimates is off, perhaps because the exponent is so far outside the normally assigned ranges.

The figure of 91K is wrong; I think the true cost is closer to 600K GHz-days. James's site doesn't have P95 timing data for an FFT large enough to handle that exponent, so it extrapolates from timings for a smaller FFT, hence the discrepancy. If 87 bits is the right depth for CPU TF, then 91 bits is correct for GPU TF.
EDIT: Compare the LL GHz-days for an exponent 1/10th the size:
http://www.mersenne.ca/exponent/332193019. An LL test is roughly p squarings mod 2^p - 1, each costing O(p log p) via FFT multiplication, so total effort grows slightly faster than p^2. An exponent 10 times the size should therefore take at least 100 times the effort, so 600K might actually be a conservative estimate.
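To illustrate the scaling argument, here is a minimal sketch that extrapolates LL effort using the p^2 * log p growth rate. The baseline of 5000 GHz-days for the smaller exponent is a placeholder I made up for illustration, not a figure taken from mersenne.ca:

```python
import math

def scale_ll_effort(base_effort, base_p, target_p):
    """Extrapolate LL-test effort from one exponent to another.

    An LL test performs ~p squarings mod 2^p - 1, each costing
    roughly O(p log p) via FFT, so total effort grows like
    p^2 * log p.
    """
    ratio = (target_p / base_p) ** 2 * (math.log(target_p) / math.log(base_p))
    return base_effort * ratio

# Placeholder baseline (NOT from mersenne.ca): assume the exponent
# 332193019 costs about 5000 GHz-days to LL test.
estimate = scale_ll_effort(5000, 332_193_019, 3_321_930_371)
print(f"{estimate:,.0f} GHz-days")  # roughly 110x the baseline
```

With any baseline, the multiplier for a 10x larger exponent comes out a bit above 100x (about 112x here), which is why "at least 100 times the effort" is a safe lower bound.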