20150124, 11:27  #1 
Dec 2012
2·139 Posts 
How many bits does/did the server trial factor to?
I've long noticed that some numbers have added details and some show only the factor. I have been assuming that this is a factor that the server found. I have been unable to determine an upper bit limit to these factors, and I could not easily find an answer posted somewhere. Am I correct that the server is doing some trial factoring? If so, to how many bits, and does the bit limit depend on the size of the exponent?
Last fiddled with by Jayder on 20150124 at 11:30 Reason: You're handsome. 
20150124, 18:34  #2 
Dec 2002
3·269 Posts 
First of all, the server does not factor any exponent; the clients do, and they report their results to the server. The server hands out assignments to clients to factor exponents from one bit level up to a higher one. The standard upper bit level depends on the size of the exponent: larger exponents get higher limits. However, clients can be manually overridden to factor to a higher or lower bit level before an LL test is performed, and other clients may then be manually instructed to push the bit level even further.
Some factors have been found using non-standard methods and were added to the database manually. Those factors can be extremely large. Last fiddled with by tha on 20150124 at 18:38 
20150124, 21:45  #3 
Dec 2012
2·139 Posts 
Alright. Thank you. I thought I had read differently a long time ago, but perhaps not. I've also noticed that most numbers lack a "No factor from 2^1 to 2^x" line and only have a record of TF started at a higher bit level. Here it appears that TF was started at 63 bits. This is also what prompted my question. If the server didn't do that lower range, how do we know that it was done at all? Was there an earlier effort to do this work? What was the extent of it?

20150124, 22:33  #4 
"Curtis"
Feb 2005
Riverside, CA
5^{2}×11×17 Posts 
Mersenne factors have the form 2kp+1, where p is the exponent. Consider a candidate in the 60M range: what is the bit level of the smallest possible factor (k = 1)? Note that this is also why the same bit depth on a higher exponent goes slightly faster: there are fewer values of k to check per bit.
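A quick back-of-the-envelope sketch in Python (the exponents here are illustrative, just to show the arithmetic):

```python
# Mersenne factors have the form 2*k*p + 1, so for an exponent p in the
# 60M range the smallest candidate factor (k = 1) is 2p + 1, which is
# already close to 2^27 -- there is nothing to test below that bit level.
p = 60_000_000                 # illustrative exponent in the 60M range
smallest = 2 * 1 * p + 1       # k = 1
print(smallest.bit_length())   # -> 27

# The number of k values inside one bit level [2^b, 2^(b+1)) is roughly
# 2^b / (2p), so a larger exponent means fewer candidates per bit level
# and the same bit depth finishes a bit faster.
b = 70
for p in (60_000_000, 332_000_000):
    print(p, (2 ** b) // (2 * p))
```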
I believe the levels below the recorded starting point were done manually, and thus not recorded by the server. However, the TJAOI thread indicates there were severe problems with the completeness of that effort, so re-factoring is underway. 
20150124, 22:46  #5  
May 2013
East. Always East.
11·157 Posts 
Still, I know exactly what you mean about the missing records of older TF work. It was all taken care of a while ago, but the volume of results was so massive for its time that, as far as I can tell, the "no factor" results were simply never submitted. It was assumed that the organized effort covered everything up to the point where the ranges became nontrivial to do and results slowed down. 

20150125, 00:21  #6 
Dec 2012
2·139 Posts 
Thank you for your responses. I know that factors take the form of 2kp+1, and that you don't really start at 2^1. What I meant to imply was that the recorded starting bit levels are lower than they should be, which they are. But thank you for your explanations nonetheless.
Yep, the TJAOI thread inspired these questions. In any case, I've learned a bit. If there are no details, the factor was put into the database another way and/or is incredibly ancient. I guess TJAOI will be up to 2^64 in a few hundred years or so. 
20150125, 03:29  #7 
Romulan Interpreter
Jun 2011
Thailand
2^{3}×19×61 Posts 
That's only partially true. The list of exponents started with all the primes, but before it was handed to the server, many exponents were eliminated from the list using various results/theorems about Mersenne factors (for example: if p is prime, p ≡ 3 (mod 4), and q = 2p+1 is prime, then q | Mp). Then a pre-scan of candidate factors 2kp+1 was done with k going up to about 40000 (a little more than 15 bits). Whatever survived from the list of primes was handed to the server to be handed out for TF. We can say that the server pre-factored, even if technically the pre-factoring was done "before". If, for example, you found a NEW 42-bit factor for an exponent of (say) 27 bits, with no smaller factor known, you could call it a "server mistake" (practically, a mistake by whoever built the list), because a 27-bit exponent (somewhere around the 135M range) should already have had all factors 2kp+1 below 43 bits (1 + 15 + 27; the 15 comes from k) found in the pre-scan. That has NOT happened so far. All the "new" factors are at 53+ bits, clearly "client mistakes" (the tools, P95, etc., a cheating user, hardware errors, lost database entries, and so on).
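A small sketch (my own illustration, not the actual GIMPS code) checking both of the claims above: the q = 2p+1 theorem, and a pre-scan of 2kp+1 candidates with k up to about 40000:

```python
def is_prime(n):
    """Naive trial-division primality test (fine for small n)."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# Theorem: if p is prime, p = 3 (mod 4), and q = 2p + 1 is prime,
# then q divides Mp = 2^p - 1.  Verify it for small p:
for p in range(3, 200):
    q = 2 * p + 1
    if is_prime(p) and p % 4 == 3 and is_prime(q):
        assert pow(2, p, q) == 1      # q | 2^p - 1
        print(f"{q} divides 2^{p} - 1")

def small_factor(p, kmax=40_000):
    """Pre-scan candidate factors 2*k*p + 1 for k up to ~2^15."""
    for k in range(1, kmax + 1):
        q = 2 * k * p + 1
        if pow(2, p, q) == 1:         # q divides 2^p - 1
            return q
    return None

print(small_factor(11))   # -> 23 (23 divides 2047)
print(small_factor(31))   # -> None (2^31 - 1 is prime)
```

For a 27-bit p with k capped at 2^15, the largest candidate 2kp+1 stays below 2^(1+15+27) = 2^43, which is where the "all factors under 43 bits should already be known" bound comes from.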
