2023-02-05, 00:33  #1 
Sep 2022
2^{2}·3·7 Posts 
Getting under 1k unfactored exponents in 0.0M
I had noticed that while the main factoring milestones have been to get under 20M unfactored for the 0G range, under 2M per 100M range, and so on, we are relatively close to getting under 1k unfactored in 0.0M. I think it would be worth some effort to reach this milestone; when it is reached, it will be quite the achievement. 32 factors remain to be found, which for a range with this much effort already expended will be quite the undertaking. Currently I am extending t40 from 0.05M to 0.06M, and will probably keep going beyond there. I'm not sure whether extending t40 to 0.1M will be enough on average to find the remaining 32 factors, but I suspect it won't be.
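As a back-of-the-envelope check on that suspicion: a standard heuristic says a Mersenne number has on average about ln(b/a) prime factors between a and b digits, so the expected yield from raising the t-level is easy to sketch (a rough estimate only; the 32-factor count is taken from this post):

```python
import math

def expected_factors(num_exponents, d_lo, d_hi):
    """Heuristic expected number of new factors from extending ECM
    coverage from t<d_lo> to t<d_hi> across num_exponents candidates.
    Uses the rule of thumb that a Mersenne number has ~ln(b/a) factors
    between a and b digits (the digit-to-bit conversion cancels)."""
    return num_exponents * math.log(d_hi / d_lo)

print(f"{expected_factors(32, 40, 45):.2f}")  # t40 -> t45: about 3.77
print(f"{expected_factors(32, 40, 50):.2f}")  # t40 -> t50: about 7.14
```

By this estimate, even a full t50 on all 32 candidates would be expected to yield well under half of the needed factors, consistent with the suspicion above.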
Anyway, yeah I probably can't do it alone so if anyone wants to help, I would appreciate it a lot. 
2023-02-05, 00:40  #2 
Jul 2003
Behind BB
2^{2}·3^{2}·5·11 Posts 
In the next few weeks, I plan to run P-1 on those exponents in 60k < p < 100k with B2 < 1e13 (only about 30 exponents, but there's a chance of a factor there).
Last fiddled with by masser on 2023-02-05 at 00:41 
2023-02-06, 04:24  #3 
"Curtis"
Feb 2005
Riverside, CA
2×3×953 Posts 
I haven't played with the modern P95 ECM speedup yet, so I'll use this thread's goal to familiarize myself with the new software. I plan to do a pass of not-too-many curves across a wide range of inputs, e.g. 200 @ B1=3e6 or 100 @ 5e6.
Once I run a few, I'll post the region where I'm playing; ideally I'll work behind masser's P-1 effort but above 60k, to avoid overlapping with the OP in the short run. 
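For scale on what such a pass buys: rough GMP-ECM-style tables put a full t40 at around 2350 curves at B1=3e6 (the curve counts below are approximate figures I am assuming, not numbers from this thread):

```python
# Approximate curves for one full t-level at that level's usual B1
# (rough GMP-ECM-style figures; treat the exact counts as assumptions).
CURVES_FOR_LEVEL = {35: 900, 40: 2350, 45: 4500}

def fraction_of_level(curves_run, level):
    """Fraction of a t-level completed by curves_run curves at the
    level's standard B1 (expected factor yield is linear in curves)."""
    return curves_run / CURVES_FOR_LEVEL[level]

print(f"{fraction_of_level(200, 40):.1%}")  # 200 @ B1=3e6: 8.5% of t40
```

So a pass like 200 @ 3e6 is a quick skim of each exponent rather than a meaningful bump to its t-level, which is fine for finding the occasional lucky factor.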
2023-02-06, 05:40  #4 
Sep 2022
2^{2}·3·7 Posts 
Okay. I recently raised my B2 to try to force Prime95 to pick a consistent B2, which extended the time taken per exponent per worker from ~55 hours to ~80 hours. It didn't quite work as expected, but at least my B2 is now always above the usual 99324315090 (B1=11e6).
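For anyone else pinning bounds rather than letting Prime95 choose them: P-1 bounds can be fixed per assignment in worktodo.txt (a sketch; the exponent here is a placeholder, and the exact field layout should be checked against the undoc.txt/readme shipped with your Prime95 version):

```
Pminus1=1,2,59999,-1,11000000,99324315090
```

The fields are k,b,n,c followed by B1 and B2, i.e. P-1 on k*b^n+c = 1*2^59999-1 with B1=11e6 and an explicit B2.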

2023-02-06, 14:34  #5 
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
5282_{10} Posts 
Be warned that Ryan Propper did a ton of ECM on low exponents a couple years ago.
But good luck 
2023-02-06, 16:51  #6 
Jul 2003
Behind BB
2^{2}×3^{2}×5×11 Posts 
FYI: I attached a chart of the current P-1 bounds for p < 100k. The visual representation helps me understand the problem space a little better. I wonder how difficult it would be to develop something similar for ECM t-levels?

2023-02-06, 18:33  #7 
"Oliver"
Sep 2017
Porta Westfalica, DE
1436_{10} Posts 
Easy, if you just scrape the t-levels from mersenne.org or mersenne.ca with a spider.
But really hard, among other reasons, because a lot of the completed work is not known, as Wayne pointed out. 
2023-02-06, 18:41  #8 
"Curtis"
Feb 2005
Riverside, CA
13126_{8} Posts 
I'm lazy, and other people are quite skilled with the database... what size factors have been reported from ECM on 50k-100k exponents? We don't need Ryan's detailed reports if we have a list of his factors; from that we can deduce roughly how much ECM he did and step up to the next-higher bound.

2023-02-06, 20:11  #9  
Dec 2021
2^{3}×7 Posts 
Quote:


2023-02-07, 13:34  #10 
Dec 2022
2·3^{2}·17 Posts 
This seems a feasible goal, and ECM alone would _eventually_ achieve it. As just pointed out, there are no Ryan Propper factors in the important part of the range, so the t-levels given here should be fairly accurate.

However, I must raise some issues about the P-1. This range has already seen a lot of redundant P-1 (e.g. M97429), and the factor probabilities must take that into account. Finding the exponents with the worst P-1 is not a terrible idea in itself, but we want a reasonable chance of finding a factor. As there are so few exponents down here, it is reasonable to spend more time on each, rather than doing a large number quickly as may be optimal in much higher ranges. The bounds masser has been using yield (per the calculator) factor probabilities always below 1%, even before accounting for the earlier P-1 with which they are significantly redundant.

Although perhaps surprising at first, B1, always more important than B2, becomes even more so as the ECM level increases. This can be seen on the calculator, and also through an intuitive argument: you need a factor p with p-1 smooth, and the larger the factor must be, the smoother. Compared to random integers, smooth numbers are more likely to have a small ratio between their largest and second-largest prime factors, which determines the B1 and B2 needed. Thus P-1 done with the old algorithm is not negligible if B1 was high, as it often was.

I would argue that the minimum B1 worth considering is 10G for unfactored exponents in this range (nordi and George found almost all of their factors here on already-factored exponents, which have had less ECM). That still takes less than a day on hardware most of you are likely to have, and I still consider it relatively short: my own record of relative timings shows the ECM required to reach t40 (including work already done) would be at least 50x that. B1=100G would be even better; if that were completed across 20k-100k it should find about 20 factors, while 10G may give 10 or fewer. Somewhere in between might be best. There is no hurry here, and it will likely take years to achieve the goal anyway. Rushed P-1 that has been, or soon will be, largely duplicated is not the best strategy. 
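The smoothness point can be made concrete with a toy stage-1 P-1 implementation (a sketch in Python, nothing like Prime95's actual code; the 2089 x 3803 product is contrived for illustration: 2089 divides 2^29 - 1, and 2089 - 1 = 2^3 * 3^2 * 29 is very smooth, while 3803 - 1 = 2 * 1901 contains a large prime):

```python
from math import gcd

def pminus1_stage1(N, B1, base=3):
    """Stage 1 of Pollard's p-1: compute base^E mod N, where E is the
    product of all prime powers <= B1, then take gcd(base^E - 1, N).
    This reveals any prime factor p of N for which p-1 is B1-smooth."""
    sieve = bytearray([1]) * (B1 + 1)   # simple sieve of Eratosthenes
    sieve[0] = sieve[1] = 0
    for i in range(2, int(B1 ** 0.5) + 1):
        if sieve[i]:
            for j in range(i * i, B1 + 1, i):
                sieve[j] = 0
    a = base
    for q in range(2, B1 + 1):
        if sieve[q]:
            qe = q                      # largest power of q still <= B1
            while qe * q <= B1:
                qe *= q
            a = pow(a, qe, N)
    return gcd(a - 1, N)

N = 2089 * 3803
print(pminus1_stage1(N, B1=100))  # -> 2089: 2088 is 29-smooth, 3802 is not
```

Raising B1 past 1901 makes both p-1 values smooth, so the gcd degenerates to N itself and nothing is learned; bound choice matters even in this small-scale analogue.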
2023-02-07, 18:45  #11 
P90 years forever!
Aug 2002
Yeehaw, FL
10000000000010_{2} Posts 
I have P-1 save files for exponents from 60k to 100k. For unfactored exponents, B1=3G in most cases. If anyone wants them, just ask.
I'm going to try taking some of the larger exponents to t40. It should be a good test of the new ECM stage 2 code. It is also a good way to use my old quad cores that are memory bandwidth limited. Putting the 4th core on a PRP test improves the iteration times only a little. Instead, I'll put the 4th core on small ECM which should use L2 cache for stage 1. More time is spent in stage 1 than stage 2, so it won't slow down the PRP test much (I hope) due to additional memory bandwidth. 
Similar Threads  
Thread  Thread Starter  Forum  Replies  Last Post 
Getting under 20k unfactored exponents in 229m  aperson1  Data  11  2023-02-22 01:36 
COMPLETE!!!! Thinking out loud about getting under 20M unfactored exponents  petrw1  Data  1416  2023-02-03 23:01 
Getting <2k unfactored exponents for 108.3M  Zhangrc  Data  63  2022-10-28 09:31 
Unreserving exponents (these exponents haven't been done)  jasong  Marin's Mersenne-aries  7  2006-12-22 21:59 
Question on unfactored numbers...  WraithX  GMP-ECM  1  2006-03-19 22:16