2018-05-31, 22:44  #1574
May 2009
Russia, Moscow
2593_{10} Posts 
I've cracked ~400 of the 2100 C100s by ECM to t25, and I can confirm that almost all composites beginning with 1 or 2 are ECM-resistant.

2018-07-11, 16:57  #1575
"Nuri, the dragon :P"
Jul 2016
Good old Germany
811 Posts 
We just passed 1.2 billion numbers in the database.
Last fiddled with by MisterBitcoin on 2018-07-11 at 16:57
2018-07-24, 17:40  #1576
"Daniel Jackson"
May 2011
14285714285714285714
2^{3}·83 Posts 
DB size limit too small to add M77232917. Needs to be increased.
I know I've probably asked for this already, but I think it's time to increase the global DB limits, and sooner than you think. Given that the largest known prime (M77232917) is far beyond the 62-megabit limit, is there any chance that the DB size limit could be increased to 332 megabits, or even higher? It's extremely annoying to get the message "Error: Limit of about 10.000.000 digits exceeded" when I try to add 2^77232917-1 (the largest known prime) to the DB. That error is also inaccurate, since both 2^43112609-1 and 2^57885161-1 are greater than 10,000,000 digits, and I could add them to the DB without any problem. It should've said "Error: Limit of 18.663.860 digits (62.000.000 bits) exceeded".

The limits on factorials (450000!, 2,348,517 digits) and primorials (5000000#, 2,170,852 digits) are also way too low. They should be as high as the limit for all other numbers. Here's what the limits should be:

Factorials: n = 14842907.
Overall limit: 332192810 bits.
Primorials: whatever the largest primorial below 100,000,000 digits is.
N+1 (or N-1) tests for PRPs: 1,000,000 digits (put them in a worker queue, so as to reduce DB load).

Eventually any limit will be surpassed as larger and larger primes are discovered, so the DB should NOT have a maximum limit at all. When we do find the first billion-digit prime (within the next few decades), we SHOULD be allowed to add it to the DB without any errors.

@Syd: I miss the "Magnifying Glass" feature. It allowed me to run P-1 and ECM on composite numbers (up to a certain limit), and SIQS on numbers below 80 digits (I think it was 80; it's been so long since that feature was removed that I forget its maximum size), without using any of my own PC's CPU time (which I use for other things, such as Prime95 stress testing). Why was it removed in the first place? Could you PLEASE bring it back? That's been bothering me for quite a few years now.
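A quick sanity check on the digit arithmetic in this post (a minimal sketch; the thresholds are the ones quoted above, and the digit counts follow from log10(2) ≈ 0.30103):

```python
import math

def mersenne_digits(p):
    """Decimal digit count of 2^p - 1 (same as for 2^p when p > 1)."""
    return math.floor(p * math.log10(2)) + 1

def factorial_digits(n):
    """Approximate decimal digit count of n!, via lgamma."""
    return math.floor(math.lgamma(n + 1) / math.log(10)) + 1

# M77232917 has 23,249,425 digits, far above the 62-megabit limit.
print(mersenne_digits(77232917))                   # 23249425

# A 62,000,000-bit number has at most 18,663,860 digits,
# matching the error message suggested above.
print(math.floor(62_000_000 * math.log10(2)) + 1)  # 18663860

# 332,192,810 bits is almost exactly 100,000,000 digits,
# since log2(10) = 3.3219280948...
print(math.floor(332_192_810 * math.log10(2)))     # 100000000

# 14842907! likewise has roughly 100 million digits, which is
# presumably where the proposed factorial cap comes from.
print(factorial_digits(14_842_907))
```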
2018-07-24, 18:40  #1577
"Nuri, the dragon :P"
Jul 2016
Good old Germany
811 Posts 
Computing power might be a problem. Maybe a RasPi system with 16 (or 32) RasPis would be enough; each task is slow, but more tasks can be started (e.g. 5 for certificates, 4 for factoring below 85 digits, 4 for checking "U" entries below 100,000 digits, and so on).
The FDB also needs a better SSD, let's say a ~1 TB SSD plus a 4 TB HDD for backup. I can help on that front; I can throw in a few hundred bucks if needed.
2018-07-24, 19:26  #1578
Mar 2018
129_{10} Posts 
Why would you want M77232917 on FactorDB in the first place? What purpose would that serve?
I'd say the limits are actually higher than they need to be. And there is already a very visible lack of performance on large numbers with a large number of factors.
2018-08-03, 18:55  #1579
"Nuri, the dragon :P"
Jul 2016
Good old Germany
811 Posts 
ECM pretests and FDB.
I'd like to see a new option: ECM pretest depth reached for composite numbers (as reported by the Perl script for automated YAFU processing).
Is there any hope that things like that will be available at some point?
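For context, a "pretest depth reached" value like the t25 mentioned earlier in the thread is normally derived from a table of B1 bounds and expected curve counts. Here is a minimal sketch of how such a depth could be computed; the table entries are the commonly quoted approximate GMP-ECM figures, and the partial-credit rule is my own simplification, not FDB's or YAFU's actual formula:

```python
# Commonly quoted approximate figures for "one full t-level" of ECM:
# t-level -> (B1 bound, expected curve count). These are assumptions
# taken from the usual published tables, not FDB's own data.
T_LEVELS = {
    20: (11_000, 74),
    25: (50_000, 214),
    30: (250_000, 430),
    35: (1_000_000, 904),
    40: (3_000_000, 2350),
    45: (11_000_000, 4480),
    50: (43_000_000, 7553),
}

def pretest_depth(work):
    """Rough ECM pretest depth (t-level) reached.

    `work` maps B1 -> number of curves completed at that B1.
    Completing the expected curve count finishes a level; a partial
    count gets proportional credit toward the next 5-digit step.
    """
    depth = 15.0  # assume everything below t20 is trivially covered
    for t, (b1, needed) in sorted(T_LEVELS.items()):
        done = work.get(b1, 0)
        if done >= needed:
            depth = float(t)
        else:
            depth += 5.0 * done / needed
            break
    return depth
```

For example, pretest_depth({11_000: 74, 50_000: 214}) reports 25.0, the t25 depth mentioned a few posts up, while half the t25 curves would report 22.5.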
2018-08-10, 10:48  #1580
"Nuri, the dragon :P"
Jul 2016
Good old Germany
811 Posts 
Quite a lot in the range from 95 up to 99 digits; I'll start one worker on that range.

2018-08-10, 19:00  #1581
"Ed Hall"
Dec 2009
Adirondack Mtns
111011110001_{2} Posts 
I've been focused elsewhere the last few days. I'm sure that's allowed the buildup to be worse. I'm not sure when I may return to composites. Once we knocked the c100 group down a bit, I was less interested. Composite work is now more of a fallback for idle machines.

2018-08-10, 19:10  #1582
"Nuri, the dragon :P"
Jul 2016
Good old Germany
811 Posts 
Quote:
I can extend my range from 90 up to 99 digits if needed, and I can also run that range for a few weeks. My main focus will be searching for 2x- and 3x-digit factors of composite numbers with 121 digits.

2018-08-10, 20:09  #1583
"Ed Hall"
Dec 2009
Adirondack Mtns
3825_{10} Posts 
Let's see how it goes over the next few days. I'm working on an Aliquot sequence and playing with ecmpi and CADO-NFS. The linear algebra portion leaves all but one machine free; I hope to have them automatically run the DB composites during that free time.

2018-09-02, 12:43  #1584
"Nuri, the dragon :P"
Jul 2016
Good old Germany
811 Posts 
So many small composites from Aliquot sequences.
Trying to catch up. Still working 90-99 digits, and also starting 81-89 now.