2007-02-14, 16:21  #1 
Nov 2003
2^{2}·5·373 Posts 
Anniversary
Hello All,
This is my 25th Anniversary of working on the Cunningham project. I wish that I had a new result to report, but I am currently lacking in CPU resources. I mailed my first factor (a p13 of 3,193 via P1) to Sam Wagstaff 25 years ago; about a week after I started working on the project. It was found on a 16bit Zilog Z8000 minicomputer. In that time we have progressed from where it was hard to factor numbers of 50 digits to where it is now hard to factor numbers of 250 digits. This comparison is a bit unfair however, because the former refers to general numbers (done in 1982 via CFRACSEA!!), while the latter refers to Cunningham numbers done with SNFS. Perhaps a better choice for upperbound would be about 170 digits (via GNFS). This improvement is due to roughly equal contributions from improved computer technology and to new algorithms. Consider, for example, that even if we had NFS available to us in 1982, we could not have implemented it on machines of that time owing primarily to space, rather than cpu limitations. It would have been a theoretical curiosity. Also, many of our improvements have come not only from improved computers, but also from improved computer availability. May the next 25 years bring a similar factor of 3 to 4 to our efforts!!!!!! It will not happen without new algorithms, but we can all hope. It's now been 18 years since our last new algorithm. It would certainly nice if we had some new talent or if a new generation of factorers came along. The difficulty with this notion is that the bar is set very high in terms of even the easiest remaining Cunningham numbers. It is hard to make a contribution with only a small number of computers. And the learning curve is very steep. For those who want to see something of the history of the project, Sam Wagstaff has put all of the old pages, and some of the cover letters at: http://homes.cerias.purdue.edu/~ssw/cun/oldp/index.html Bob 
2007-02-14, 17:26  #2 
∂^{2}ω=0
Sep 2002
República de California
3·3,877 Posts 
Happy quarter-century-of-factoring to you, Bob!
I also find it interesting that after the multiple breakthrough factoring algorithms of the 70s (Pollard's p-1, Williams' p+1, Shanks' SQUFOF, Brillhart/Morrison's CFRAC) and the 80s (Pomerance's QS and its multiple-polynomial extension by yourself and Montgomery, Lenstra's ECM, Pollard's NFS), virtually all the progress that has been made since (and it has been impressive, don't get me wrong) has been in refinements of the known methods: ever-increasing computational power, exploitation of parallelism, and so forth.

Perhaps the glory years of the aforementioned 2 decades spoiled us, and it is unreasonable to expect a brand-new-and-improved factoring algorithm every few years or so. But one can hope.

And of course the real breakthrough factoring algorithm (at least for integers "small" enough to be explicitly stored) is already known: with quantum computing we have the needed algorithm, we simply lack the needed hardware. But it would be nice if at least one or two qualitatively new-and-better algorithms for classical computers were found before QC-based hardware blows the doors off of everything. 
2007-02-14, 18:15  #3  
Tribal Bullet
Oct 2004
3538_{10} Posts 
Quote:
Many happy returns on being in for the long haul. And I'm working as fast as I can :)

jasonp 

2007-02-14, 18:18  #4 
Jan 2005
Transdniestr
503 Posts 
Here's to perseverance!

2007-02-14, 18:28  #5 
∂^{2}ω=0
Sep 2002
República de California
11631_{10} Posts 

2007-02-14, 18:34  #6 
Jan 2005
Transdniestr
1F7_{16} Posts 
Yes, he is due some perverse severance pay if he ever retires from the project.

2007-02-14, 19:11  #7  
Bamboozled!
"𒉺𒌌𒇷𒆷𒀭"
May 2003
Down not across
3^{2}·1,187 Posts 
Quote:
I don't remember when my first factor was sent in, but it must have been about 15 years ago. The bar seems to have been high since I've been contributing.

There are, of course, at least two ways around the problem. I've exploited both of them.

The first is to contribute to, or (preferably IMO) organise, a collaborative effort. There have been several such over the years, including 3 incarnations of NFSNET, The Cabal, and the MullFac group.

The other way is to snaffle the relatively easy factorizations when they become available. Some become available through ECM reducing a composite to a range where modest QS or NFS resources are sufficient. Others become available when the acquisition of new hardware allows the easier SNFS targets to become feasible. A case in point: my latest main machine, an AMD64 3500+, has enough memory (2.5GB) to let me complete the linear algebra on the first holes, and is fast enough to let me do so in a reasonably short time. Sieving is performed on that machine and a collection of antiques, going as far back as a PPro, sundry PII and PIII, and a Sparc 10. Even without the antiques, it will only take me a few months to complete a first hole.

The learning curve is indeed steep if you wish to write your software ab initio. However, that's not a requirement. Several excellent implementations of ECM and/or NFS are available as source code (including one of yours, I believe), and the learning curve can be as gentle as desired. Running black-box software in fire-and-forget mode is an excellent way to get started, in my opinion, and more ambitious activities can wait until resources (time, education, enthusiasm, etc.) permit. 

2007-02-15, 09:07  #8  
Aug 2003
Europe
2×97 Posts 
Quote:
Quantum computing might indeed be the next improvement we will see, and until the hardware becomes available we will need to stick with the current solutions. 

2007-02-15, 11:30  #9 
Aug 2003
Europe
2·97 Posts 
And my eye just stumbled on this article from D-Wave Systems:
http://www.dwavesys.com/index.php?ma...t01returnid=21

Not sure how this first Orion chip, with only 16 qubits, would be interesting for factoring large numbers, but they are planning a 512-qubit system for summer next year, with a possible 1024-qubit system before the end of 2008. Which would probably be available for rent to anyone with enough cash. 
2007-02-15, 12:40  #10  
Bamboozled!
"𒉺𒌌𒇷𒆷𒀭"
May 2003
Down not across
3^{2}×1,187 Posts 
Quote:
It could factor numbers as large as 127, assuming I remember properly how these things work. That is, a (2n+1)-qubit computer is required to factor an n-bit integer. A 1024-qubit computer is where it starts to get interesting, as far as integer factorization is concerned.

Paul 
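Paul's rule of thumb is simple enough to put in a couple of lines. This is only an illustration of the arithmetic in the post (the 2n+1 count is its stated assumption; practical Shor circuits need extra ancilla qubits, so take the numbers as a rough bound, not a hardware spec). The helper name `max_factorable` is invented for the example:

```python
def max_factorable(qubits):
    """Largest integer a machine with the given qubit count could target,
    under the rule of thumb that factoring an n-bit integer needs
    2n+1 qubits."""
    n = (qubits - 1) // 2   # largest n with 2n+1 <= qubits
    return 2**n - 1

# 16 qubits -> n = 7, so numbers up to 2^7 - 1 = 127,
# matching the "as large as 127" estimate above.
print(max_factorable(16))
```

By the same rule, a 1024-qubit machine could address 511-bit (roughly 154-digit) integers, which is indeed where it would start to matter for factoring.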

2007-02-15, 17:22  #11 
Oct 2006
vomit_frame_pointer
2^{3}×3^{2}×5 Posts 
Yes. Also, thanks for your theoretical contributions, as well as the time you spend plonking down interesting stuff around here.
I think Akruppa said something about the factoring itch: the compulsion to bust a composite down to primes. I have a chronic case of this, though less acute than the severe cases around here. For some reason, it isn't enough to know that some big honker is composite or not.

I'm a mildly-educated dilettante, but I check in every few years and skim a paper or two, and am astonished at how things have changed. I read about P-1, CFRAC, and SQUFOF as an undergraduate. P-1 seems like the biggest value for your money among these early ones, particularly since the wonderful ECM is a supercharged realization of the same idea.

MPQS took advantage of the falling price of memory, using little beyond quadratic reciprocity and iterative methods for solving quadratic congruences. "Wow! You mean that actually works?" was my first reaction. MPQS has several points at which the asymptotic behavior could be bad, but isn't, or where the fixed cost of solving n equations, where n is the size of the factor base, could be really large, but isn't.

It is surprising that NFS is still tops, 17-18 years after it first surfaced. I hope this changes, but not too abruptly. If QC technology makes all this obsolete, I'll be sorry to see some of these methods become relics.

Factorization is both glamorous and unloved. It doesn't seem to be a tenure-enhancing pursuit, perhaps because it doesn't fall within a narrow subdiscipline, but it's also in its infancy. I hope we have another wave of progress in sparse matrices or factorization of ideal classes, or maybe a departure from the ritual of seeking immense piles of smooth numbers as the cure to what ails us.

The weirdest thing about recent history is that we have gone from special multiprocessor machines to run CFRAC on 65-digit integers, to the point where 120-digit composites are fodder for personal computers, and a 95-digit job causes anxiety only because it forces a choice among three different methods. 
Put another way: in the early 90s, a generic 70-digit integer was near the frontier of what a single PC could do. These days we routinely pull factors of 70 digits or more from large composites. When I think about how much worse a 70-digit job is than a 65-digit job, this is shocking progress.

Last fiddled with by FactorEyes on 2007-02-15 at 17:26 
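As an aside for anyone who has never seen why P-1 is such good value: a stage-1 pass is only a few lines, because it just raises a base to a highly composite exponent and takes a gcd, dropping out any prime factor p for which p-1 is smooth. The sketch below is a hypothetical minimal illustration, not tuned code; the function name and the crude trial-division primality test are invented for the example:

```python
from math import gcd

def pollard_pm1(n, bound=100):
    """Pollard's P-1, stage 1 only: finds a prime factor p of n
    whenever every prime-power divisor of p-1 is <= bound."""
    a = 2
    for q in range(2, bound + 1):
        # crude trial-division test: only use prime q
        if all(q % d for d in range(2, int(q**0.5) + 1)):
            # raise q to the largest power not exceeding bound
            qk = q
            while qk * q <= bound:
                qk *= q
            a = pow(a, qk, n)  # accumulate a^(lcm of smooth numbers) mod n
    g = gcd(a - 1, n)
    return g if 1 < g < n else None

# 299 = 13 * 23: 13-1 = 12 = 2^2 * 3 is 10-smooth, so 13 drops out
print(pollard_pm1(299, bound=10))  # -> 13
```

The same structure, with the multiplicative group of Z/nZ swapped for the group of points on a random elliptic curve, is exactly the "supercharged realization" that ECM provides.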