#1 | Sep 2022 | 2²×3×7 Posts
I've noticed that while the main factoring milestones have been things like getting under 20M unfactored in the 0G range, under 2M unfactored per 100M range, etc., we are relatively close to getting under 1k unfactored in the 0.0M range. I think it would be worth some effort to reach this milestone; it would be quite the achievement. 32 unfactored exponents remain, which, for a range with this much effort already expended, makes this quite the undertaking. Currently I am extending t40 from 0.05M to 0.06M, and I will probably keep extending beyond there. I'm not sure whether taking t40 all the way to 0.1M would be enough on average to produce the remaining 32 factors, but I suspect it won't be.
Anyway, I probably can't do this alone, so if anyone wants to help, I would appreciate it a lot.
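As a rough back-of-the-envelope in Python: using the standard heuristic that a Mersenne number has about a 1/n chance of a prime factor at each bit level n (so the expected count between sizes a and b is ~ln(b/a)), here is what extending ECM from t40 might yield. The t-level targets and the heuristic itself are my assumptions, not measured data:

```python
import math

REMAINING = 32   # unfactored exponents left below 100k (from the post above)

def expected_factors(t_from, t_to, candidates=REMAINING):
    """Expected factors from extending ECM from t_from to t_to digits.

    Heuristic: chance of a prime factor at bit level n is roughly 1/n, so the
    expected count between bit sizes a and b is ~ln(b/a).  Converting digits
    to bits multiplies both ends by log2(10), which cancels in the ratio.
    """
    return candidates * math.log(t_to / t_from)

for t in (45, 50, 55, 60):
    print(f"t40 -> t{t}: ~{expected_factors(40, t):.1f} expected factors")
```

If this heuristic is anywhere near right, even t60 only yields around 13 of the 32, which matches my suspicion that ECM depth alone won't finish the job.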
#2 | masser | Jul 2003 | Behind BB | 2²·3²·5·11 Posts
In the next few weeks, I plan to run P-1 on the exponents with 60k < p < 100k and B2 < 1e13. (Only about 30 exponents, but there's a chance of a factor there.)
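A minimal sketch of how that queue could be generated, assuming the usual Prime95 worktodo syntax `Pminus1=k,b,n,c,B1,B2`; the exponents and bounds below are placeholders, not the actual ~30 candidates:

```python
# Placeholder candidate list: in practice, the ~30 exponents in 60k < p < 100k
# whose current P-1 has B2 < 1e13 (pulled from mersenne.org/mersenne.ca reports).
candidates = [60089, 75323, 99767]          # hypothetical exponents, not the real list
B1, B2 = 3_000_000_000, 3_000_000_000_000   # sample bounds, pick to taste

# Assumed Prime95 worktodo.txt syntax: Pminus1=k,b,n,c,B1,B2 for k*b^n+c = 2^p-1.
with open("worktodo_add.txt", "w") as f:
    for p in candidates:
        f.write(f"Pminus1=1,2,{p},-1,{B1},{B2}\n")
```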
#3 | "Curtis" | Feb 2005 | Riverside, CA | 5²·229 Posts
I haven't played with the modern P95 ECM speedup yet, so I'll use this thread's goal to familiarize myself with the new software. I plan to do a pass of not-too-many curves across a wide range of inputs, e.g. 200 curves at B1=3e6 or 100 at B1=5e6.
Once I run a few, I'll post the region where I'm working; ideally I'll stay behind masser's P-1 effort but above 60k, to avoid overlapping with the OP in the short run.
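For scale, here's a rough Python estimate of how much a pass like that contributes toward a t-level, using the commonly quoted GMP-ECM-style curve counts; treat all the numbers as ballpark, since Prime95's new stage 2 shifts the optimal counts:

```python
# Commonly quoted (GMP-ECM style) full t-level efforts: (B1, curves).
# Ballpark figures only; Prime95 30.8+'s stage 2 changes the optima.
T_LEVELS = {
    "t40": (3_000_000, 2350),
    "t45": (11_000_000, 4500),
}

def fraction_of_level(level, b1_run, curves_run):
    """Very rough: credit curves in proportion to B1, capped at the level's B1."""
    b1_ref, curves_ref = T_LEVELS[level]
    return curves_run * min(b1_run / b1_ref, 1.0) / curves_ref

for b1, curves in [(3_000_000, 200), (5_000_000, 100)]:
    frac = fraction_of_level("t40", b1, curves)
    print(f"{curves} curves @ B1={b1:.0e}: ~{100*frac:.0f}% of a t40")
```

So a single pass like this is under 10% of a t40 per exponent, which is why it's meant as a wide sweep rather than a finishing effort.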
#4 | Sep 2022 | 2²×3×7 Posts
Okay. I recently raised my B2 to try to force Prime95 to pick a consistent B2, which had the side effect of extending the time per exponent per worker from ~55 hours to ~80 hours. It didn't quite work as expected, but at least my B2 is now always above the usual 99324315090 (B1=11e6).
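For anyone wanting to do the same, my understanding (hedged, I may have the syntax slightly off) is that Prime95 honors an explicit B2 given in the worktodo entry, `ECM2=k,b,n,c,B1,B2,curves`; the exponent and curve count here are placeholders:

```
ECM2=1,2,99991,-1,11000000,110000000000,100
```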
#5 | "Wayne" (1976 Toyota Corona years forever!) | Nov 2006 | Saskatchewan, Canada | 1010010101000₂ Posts
Be warned that Ryan Propper did a ton of ECM on low exponents a couple of years ago. But good luck!
#6 | masser | Jul 2003 | Behind BB | 2²·3²·5·11 Posts
FYI - I attached a chart of the current P-1 bounds for p < 100k. The visual representation helps me understand the problem space a little better. I wonder how difficult it would be to develop something similar for ECM t-levels?
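If it helps anyone reproduce the chart, here is a minimal matplotlib sketch, assuming a hand-exported CSV with columns `exponent,B1,B2` (the file name and format are my invention; database exports would need massaging into this shape). An ECM version could do the same with a `t_level` column, data permitting:

```python
import csv
import matplotlib.pyplot as plt

# Assumed hand-made input: one row per unfactored exponent below 100k,
# columns "exponent,B1,B2" holding the best P-1 bounds on record.
exps, b1s, b2s = [], [], []
with open("pm1_bounds_sub100k.csv") as f:
    for row in csv.DictReader(f):
        exps.append(int(row["exponent"]))
        b1s.append(float(row["B1"]))
        b2s.append(float(row["B2"]))

plt.scatter(exps, b1s, s=10, label="B1")
plt.scatter(exps, b2s, s=10, label="B2")
plt.yscale("log")                 # bounds span many orders of magnitude
plt.xlabel("exponent")
plt.ylabel("P-1 bound")
plt.legend()
plt.title("Best-known P-1 bounds, p < 100k")
plt.savefig("pm1_bounds.png", dpi=150)
```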
#7 | "Oliver" | Sep 2017 | Porta Westfalica, DE | 1447₁₀ Posts
Easy, if you only take the t-levels from mersenne.org or mersenne.ca with a spider.
But really hard because, among other reasons, a lot of the work done is not publicly recorded, as Wayne pointed out.
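For the "easy" version, a polite spider is only a few lines; the per-exponent URL pattern is real, but the extraction step is a placeholder since I haven't looked at what the t-level markup actually is:

```python
import time
import urllib.request

def fetch_exponent_page(p: int) -> str:
    # mersenne.ca has one page per exponent; be gentle and cache aggressively.
    url = f"https://www.mersenne.ca/exponent/{p}"
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

for p in (60089, 75323):          # hypothetical sample exponents
    html = fetch_exponent_page(p)
    # Placeholder: pulling out the reported t-level would require inspecting
    # the page markup, which this sketch does not do.
    print(p, len(html), "bytes")
    time.sleep(2)                 # rate-limit out of courtesy
```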
#8 | "Curtis" | Feb 2005 | Riverside, CA | 5²·229 Posts
I'm lazy, and other people are quite skilled with the database: what size factors have been reported from ECM on 50k-100k exponents? We don't need Ryan's detailed reports if we have a list of his factors; from that we can deduce the rough amount of ECM he did and step up to the next-higher bound.
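That deduction step could look something like this rough sketch; the factor list is made up, and the "completed depth ≈ largest ECM factor's digit count" rule plus the B1 table are just the usual rules of thumb:

```python
# Usual rule-of-thumb B1 per t-level (GMP-ECM style).
B1_FOR_T = {35: 1_000_000, 40: 3_000_000, 45: 11_000_000, 50: 43_000_000}

# Hypothetical ECM factor sizes reported in 50k-100k, in decimal digits.
found_digit_sizes = [38, 41, 43, 44]        # made-up data for illustration

# If factors up to d digits kept turning up, the range was likely tested
# to roughly t_d; step to the next standard level above that.
deepest = max(found_digit_sizes)
next_level = min(t for t in B1_FOR_T if t > deepest)
print(f"largest ECM factor: {deepest} digits -> assume ~t{deepest} done")
print(f"suggested next pass: t{next_level} at B1={B1_FOR_T[next_level]:,}")
```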
#9 | Dec 2021 | 3×19 Posts
Quote:
#10 | Dec 2022 | 139₁₆ Posts
This seems a feasible goal, and ECM alone would _eventually_ achieve it. As just pointed out, there are no Ryan Propper factors in the important part of the range, so the t-levels given here should be fairly close.

However, I must raise some issues with the P-1 plan. This range has already seen a lot of redundant P-1 (e.g. M97429), and the factor probabilities must take that into account. Targeting the exponents with the worst P-1 is not a terrible idea in itself, but we want a reasonable chance of actually finding a factor. With so few exponents down here, it is reasonable to spend more time on each rather than to run through a large number quickly, as may be optimal in much higher ranges. The bounds masser has been using yield (per the calculator) factor probabilities below 1% in every case, even before accounting for the earlier P-1 they largely duplicate.

Although perhaps surprising at first, B1 is always more important than B2, and it becomes more so as the ECM level increases. This can be seen on the calculator, and also via an intuitive argument: P-1 needs a factor p with p-1 smooth, and the larger the factor must be, the smoother p-1 has to be. Compared with random integers, smooth numbers are more likely to have a small ratio between their largest and second-largest prime factors; since stage 2 (B2) only has to pick up the single largest prime factor of p-1 while B1 must cover all the rest, a small ratio means raising B1 buys more than raising B2. P-1 done with the old algorithm is therefore not negligible if its B1 was high, as it often was.

I would argue that the minimum B1 worth considering for unfactored exponents in this range is 10G (nordi and George found almost all of their factors here on already-factored exponents, which have had less ECM). That still takes under a day on hardware most of you likely have, and I still consider it relatively short: my own record of relative timings shows that the ECM required to reach t40 (including what is already done) would be at least 50x that. B1=100G would be even better; if it were completed across 20K-100K it should find about 20 factors, whereas 10G may give 10 or fewer. Somewhere in between might be best. There is no hurry here, and it will likely take years to achieve the goal anyway. Rushed P-1 that has been, or soon will be, largely duplicated is not the best strategy.
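The smoothness claim above is easy to sanity-check empirically. This little Monte Carlo (my own illustration, not from any calculator) factors random ten-digit integers and compares the log-ratio of the largest to second-largest prime factor for smooth versus arbitrary numbers; a ratio near 1 means stage 1 does most of the work:

```python
import math
import random
from sympy import factorint

random.seed(1)
SMOOTH_BOUND = 10**4     # "smooth" here: largest prime factor below 1e4
ratios_all, ratios_smooth = [], []

for _ in range(2000):
    n = random.randrange(10**9, 10**10)
    primes = sorted(factorint(n, multiple=True))   # prime factors w/ multiplicity
    if len(primes) < 2:
        continue                                   # primes have no second factor
    p1, p2 = primes[-1], primes[-2]
    r = math.log(p1) / math.log(p2)                # ~ the B2/B1 exponent gap needed
    ratios_all.append(r)
    if p1 < SMOOTH_BOUND:
        ratios_smooth.append(r)

print(f"mean log-ratio, all numbers:  {sum(ratios_all)/len(ratios_all):.2f}")
if ratios_smooth:
    print(f"mean log-ratio, smooth only: {sum(ratios_smooth)/len(ratios_smooth):.2f}")
```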
#11 | P90 years forever! | Aug 2002 | Yeehaw, FL | 2×4,099 Posts
I have P-1 save files for exponents from 60K to 100K; for unfactored exponents, B1=3G in most cases. If anyone wants them, just ask.
I'm going to try taking some of the larger exponents to t40. It should be a good test of the new ECM stage 2 code. It is also a good way to use my old quad-cores, which are memory-bandwidth limited: putting the 4th core on a PRP test improves the overall throughput only a little. Instead, I'll put the 4th core on small ECM, whose stage 1 should run out of L2 cache. More time is spent in stage 1 than stage 2, so the extra memory-bandwidth demand shouldn't slow the PRP test down much (I hope).
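A quick size check of why stage 1 on tiny exponents stays in cache; the bits-per-word constant is my own rough assumption, so treat the outputs as order-of-magnitude only:

```python
BYTES_PER_DOUBLE = 8
BITS_PER_WORD = 18          # rough bits packed per double at small FFT sizes

def fft_bytes(exponent: int) -> int:
    """Very rough main-array size for a transform working modulo 2^p-1."""
    fft_len = exponent / BITS_PER_WORD
    return int(fft_len * BYTES_PER_DOUBLE)

for p, what in [(90_000, "small ECM (this thread)"), (110_000_000, "wavefront PRP")]:
    kb = fft_bytes(p) / 1024
    print(f"p={p:>11,}: ~{kb:,.0f} KB main array  ({what})")
```

Roughly 40 KB for a sub-100k exponent sits comfortably in L2, while a wavefront PRP's working set of tens of MB has to stream from RAM, which is consistent with the bandwidth reasoning above.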
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post
Getting under 20k unfactored exponents in 229m | aperson1 | Data | 11 | 2023-02-22 01:36
COMPLETE!!!! Thinking out loud about getting under 20M unfactored exponents | petrw1 | Data | 1416 | 2023-02-03 23:01
Getting <2k unfactored exponents for 108.3M | Zhangrc | Data | 63 | 2022-10-28 09:31
Unreserving exponents (these exponents haven't been done) | jasong | Marin's Mersenne-aries | 7 | 2006-12-22 21:59
Question on unfactored numbers... | WraithX | GMP-ECM | 1 | 2006-03-19 22:16