2018-07-28, 06:56   #6
axn
 
Quote:
Originally Posted by clowns789
Interestingly, according to the following link, factoring to 91 bits would require over 150K GHz-days of computation, while an LL test requires only 91K:

http://www.mersenne.ca/exponent/3321930371

Perhaps I'm misreading it, but that seems to imply that 91 bits would be too high, or that one of the estimates is off, perhaps because this exponent is so far outside the normally assigned ranges.
The figure of 91K is wrong; I think it is closer to 600K. James's site doesn't have P95 timing data for an FFT big enough to handle that exponent, so it uses timing from a smaller FFT, hence the discrepancy. If 87 bits is good enough for CPU TF, then 91 bits is correct for GPU TF.
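
A rough sketch of the break-even logic behind those depths (the standard GIMPS heuristic, with throughput numbers that are my own illustrative assumptions, not measured values): the chance that M_p has a factor at bit level b is roughly 1/b, TF work doubles with every extra bit, and a found factor saves two primality tests. A GPU doing TF roughly 16x faster than a CPU pushes the break-even depth out by about 4 bits, which is the 87-vs-91 gap.

Code:
# Illustrative break-even sketch (assumed numbers, not mersenne.ca's model).
LL_COST = 600_000.0      # GHz-days for one primality test at this exponent (my estimate above)
TF_COST_AT_60 = 8e-5     # assumed GHz-days for CPU TF at bit level 60; doubles per bit
GPU_SPEEDUP = 16.0       # assumed GPU-vs-CPU throughput ratio for TF (~2^4)

def optimal_tf_depth(cost_at_60, speedup=1.0):
    """Deepest bit level where expected savings still exceed the TF cost."""
    b, cost = 60, cost_at_60 / speedup
    while cost < (1.0 / b) * 2 * LL_COST:   # ~1/b chance of a factor at level b;
        b, cost = b + 1, cost * 2           # finding one saves two primality tests
    return b - 1

print("CPU TF depth:", optimal_tf_depth(TF_COST_AT_60))               # -> 87
print("GPU TF depth:", optimal_tf_depth(TF_COST_AT_60, GPU_SPEEDUP))  # -> 91

With these inputs the CPU break-even lands at 87 bits and the GPU one at 91; the exact depths obviously depend on the real timing data.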

EDIT: Compare the LL GHz-days for an exponent 1/10th the size: http://www.mersenne.ca/exponent/332193019. An exponent 10 times as large needs about 10 times as many iterations, each on a number 10 times as big, so it takes at least 100 times the effort; 600K might actually be a conservative estimate.
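
Putting a number on that scaling: an LL test is roughly p squarings of a p-bit residue, and each FFT squaring costs about p*log(p), so the total effort grows like p^2 * log(p). A quick sketch (the GHz-days figure for the smaller exponent is a placeholder consistent with the ~600K estimate above, not a value from this thread; substitute the actual number from mersenne.ca):

Code:
import math

P_SMALL = 332_193_019     # the exponent 1/10th the size linked above
P_LARGE = 3_321_930_371

def ll_scale(p1, p2):
    """LL effort ~ p iterations x ~p*log(p) per FFT squaring => ~ p^2 * log(p)."""
    return (p2 / p1) ** 2 * (math.log(p2) / math.log(p1))

scale = ll_scale(P_SMALL, P_LARGE)   # a bit over 110x
ghz_days_small = 5_400.0             # placeholder, NOT mersenne.ca's figure; look it up on the page above
print(f"scale ~{scale:.0f}x, estimate ~{ghz_days_small * scale / 1000:.0f}K GHz-days")

The 10x in exponent alone gives the factor of 100; the log(p) term in the FFT cost adds roughly another 12%, which is why 600K is more likely a floor than a ceiling.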

Last fiddled with by axn on 2018-07-28 at 06:58