mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Lone Mersenne Hunters (https://www.mersenneforum.org/forumdisplay.php?f=12)
-   -   Coordination thread for redoing P-1 factoring (https://www.mersenneforum.org/showthread.php?t=23152)

petrw1 2019-11-13 15:53

[QUOTE=masser;530466]I will pause on the 2.8M range and resume on the 2.6M range. I'm losing a few credits here and there because a few people (probably not on the forum) are also taking exponents in these ranges for P-1 with large bounds.

After I complete my first pass over 2.6M, I plan to do a second pass over 2.6M and 2.8M. Anyone have a suggestion for how much to increase the bounds for a second pass of P-1 factoring? I was thinking that I would use the probability calculator at mersenne.ca; currently I'm putting in about 0.5 GhzD/exponent. For the second pass, I thought I would increase that to 0.75 GhzD/exponent and use bounds that approx. maximize the probability of finding a factor.[/QUOTE]

I rely a lot on the probability calculator.
If you are a programmer, consider getting the source and coding it yourself, so that it can process large batches of potential P-1 assignments and calculate the expected success rate and GhzDays of effort for each.

If you compute a few different success percentages vs. effort and graph them, you will notice that at some point it takes a LOT more work for a small improvement in percentage.
I try to find a good balance between the expected factors found and the GhzDays of effort to find them.
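A minimal sketch of that batch calculation, assuming you already have (effort, probability) estimates for several candidate bounds; the figures below are hypothetical placeholders in the style of the mersenne.ca calculator's output. Ranking the choices by marginal factors per GhzDay makes the knee of the curve easy to spot:

[CODE]# Toy illustration of diminishing returns when raising P-1 bounds.
# The (effort, probability) pairs are hypothetical placeholders; in
# practice they would come from the mersenne.ca probability calculator.
choices = [
    (0.25, 0.030),   # (GhzDays per exponent, P(find a factor))
    (0.50, 0.042),
    (0.75, 0.049),
    (1.00, 0.053),
    (1.50, 0.057),
]

prev_e, prev_p = 0.0, 0.0
for effort, prob in choices:
    # Extra factors expected per extra GhzDay spent, relative to the
    # previous (cheaper) choice of bounds.
    marginal = (prob - prev_p) / (effort - prev_e)
    print(f"{effort:5.2f} GhzD  P={prob:.3f}  marginal={marginal:.4f}")
    prev_e, prev_p = effort, prob
[/CODE]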

In my experience, multiple passes with increasing bounds are inefficient.
If you only raise B1, each successive Stage 1 run will continue where the last one left off (assuming you keep the save files).
However, any time you increase B2, the entire Stage 2 is rerun from the start, whether or not you also changed B1.

I prefer to calculate the bounds that get the number of factors I want and use those bounds in a single pass, starting with the exponents that have the best odds.
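For concreteness, here is what two such passes look like in the worktodo file, assuming the Prime95/mprime syntax Pminus1=k,b,n,c,B1,B2 and borrowing masser's example exponent. The second line keeps B1 and raises only B2, yet its Stage 2 is redone from scratch:

[CODE]Pminus1=1,2,2693501,-1,670000,6700000
Pminus1=1,2,2693501,-1,670000,50000000
[/CODE]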

masser 2019-11-13 17:55

Thanks, Wayne. That confirms what I was beginning to suspect about P-1 strategy. I think 2.6M will be very close to the 1999 unfactored goal after my initial pass is complete. If necessary, finishing that range with a second pass shouldn't be too difficult. I should be finished with 2.6M by the end of 2019.

2.8M will take a little longer with my measly i5, but I'll keep plugging away at it. The next pass over that range will be a little more strategic.

petrw1 2019-11-13 19:14

[QUOTE=masser;530494]Thanks, Wayne. That confirms what I was beginning to suspect about P-1 strategy. I think 2.6M will be very close to the 1999 unfactored goal after my initial pass is complete. If necessary, finishing that range with a second pass shouldn't be too difficult. I should be finished with 2.6M by the end of 2019.

2.8M will take a little longer with my measly i5, but I'll keep plugging away at it. The next pass over that range will be a little more strategic.[/QUOTE]

Cool!

:thumbs-up:

R.D. Silverman 2019-11-13 22:58

[QUOTE=masser;526678]I'm running P-1 with larger bounds on exponents in the 2.6M and 2.8M ranges.[/QUOTE]

It is, almost certainly, [b]not[/b] worth the effort. Compute, e.g., the conditional
probability of finding a factor given that the method has already failed with the prior B1.

Extending P-1 just isn't worth the effort. It would be much better to run ECM
instead.

masser 2019-11-14 17:39

Dr. Silverman, thank you for the recommendation. I considered running ECM curves.

Here is an exponent I considered: [URL="https://www.mersenne.ca/exponent/2693501"]2693501[/URL]

P-1 has already been performed on this exponent with bounds, B1= 670000 and B2 = 6700000. The probability of finding a factor with those bounds was 0.03856. If I complete another P-1 calculation, with bounds, B1=2000000, B2=50000000, the probability of finding a factor (given current trial factoring limit, but assuming no prior P-1) will be 0.06294. I used the probability calculator [URL="https://www.mersenne.ca/prob.php?exponent=2693501&b1=2000000&b2=50000000&guess_saved_tests=&factorbits=68&K=1&C=-1"]here[/URL]. I estimate the conditional probability using subtraction (how bad is this?): 0.06294-0.03856 = 0.02438. The P-1 calculation will take 21 minutes on my machine and I expect to find a factor for every 41 (1/0.02438) exponents that I test with similar bounds. So, I expect to find 1 factor every 14.35 hours via P-1.

Each ECM curve for M2693501, at B1=50000 and B2=5000000, takes about 5 minutes on my machine. Completing 214 such curves will take about 18 hours. Would completing the 214 ECM curves at B1=50000 be comparable to trial factoring to 83 bits? That's how I came up with an estimate for the probability of finding a factor via ECM: 0.1139. With these crude estimates of the probability, I anticipate finding a factor via ECM in this range about once a week.
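Restating that back-of-envelope comparison as a sketch, using only the figures quoted above:

[CODE]# Rate comparison using masser's own numbers; nothing new is assumed.
p1_prob    = 0.06294 - 0.03856    # crude incremental P-1 probability (0.02438)
p1_minutes = 21                   # per exponent
print(f"P-1: one factor every {p1_minutes / p1_prob / 60:.2f} hours")  # ~14.35

ecm_prob  = 0.1139                # crude estimate for 214 curves at B1=50000
ecm_hours = 214 * 5 / 60          # ~18 hours per exponent
print(f"ECM: one factor every {ecm_hours / ecm_prob:.0f} hours")       # ~157
[/CODE]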

Experience has not borne out the P-1 estimate above of one factor every 14 hours; it has been more like one factor every day. I note that some ECM has already been performed on these exponents and that I'm using very crude estimates. Please correct me if I'm doing anything wildly incorrect. At this point, however, it appears that additional P-1 in the 2.6M range is more efficient at finding factors than ECM.

henryzz 2019-11-15 10:54

I believe that the conditional probability for the P-1 rerun should be (P(second) - P(first))/(1 - P(first)). This assumes that the search space of the first run is completely contained within that of the second (and is also quite possibly complicated by TF considerations); the conditioning accounts for the part of the search space already eliminated. Plugging the numbers above into the formula gives about 0.0254 for your example 2693501.
These numbers do not take into account any ECM that has already been done. I am not sure what the average amount of ECM done so far on numbers this size is, but 2693501 has had 33 curves done. That is 33/280 of a t25, so there is a 1-e^(-33/280) = 11.1% chance that any 25-digit factor would already have been found by ECM. The chance is higher for smaller factors, of course.

Please feel free to correct my hastily done maths. It would be interesting to build a calculator that took all known work into account, as well as the expected size of the factors, and graphically showed the probability of any new work finding a factor of x digits. I might look into doing this if I can work out the formulas for the P-1 and ECM probabilities.
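A sketch of the two formulas above, assuming (as noted) that the first run's search space is wholly contained in the second's; the figure of 280 curves for a t25 is taken from the post:

[CODE]import math

def p1_rerun_prob(p_first, p_second):
    """Conditional probability that the rerun finds a factor, given that
    the first P-1 run (a subset of the second's search space) failed."""
    return (p_second - p_first) / (1 - p_first)

def ecm_already_found(curves_done, curves_for_t):
    """Chance that the ECM already performed would have found a factor
    at the corresponding digit level."""
    return 1 - math.exp(-curves_done / curves_for_t)

print(p1_rerun_prob(0.03856, 0.06294))  # ~0.0254 for M2693501
print(ecm_already_found(33, 280))       # ~0.111 for 25-digit factors
[/CODE]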

R.D. Silverman 2019-11-15 18:04

[QUOTE=masser;530591]Dr. Silverman, thank you for the recommendation. I considered running ECM curves.

Here is an exponent I considered: [URL="https://www.mersenne.ca/exponent/2693501"]2693501[/URL]

P-1 has already been performed on this exponent with bounds, B1= 670000 and B2 = 6700000. The probability of finding a factor with those bounds was 0.03856. If I complete another P-1 calculation, with bounds, B1=2000000, B2=50000000, the probability of finding a factor (given current trial factoring limit, but assuming no prior P-1) will be 0.06294. I used the probability calculator [URL="https://www.mersenne.ca/prob.php?exponent=2693501&b1=2000000&b2=50000000&guess_saved_tests=&factorbits=68&K=1&C=-1"]here[/URL]. I estimate the conditional probability using subtraction (how bad is this?): 0.06294-0.03856 = 0.02438. The P-1 calculation will take 21 minutes on my machine and I expect to find a factor for every 41 (1/0.02438) exponents that I test with similar bounds. So, I expect to find 1 factor every 14.35 hours via P-1.

Each ECM curve for M2693501, at B1=50000 and B2=5000000, takes about 5 minutes on my machine. Completing 214 such curves will take about 18 hours. Would completing the 214 ECM curves at B1=50000 be comparable to trial factoring to 83 bits?[/QUOTE]


You are comparing the wrong things.

I will accept your probabilities as correct. [Your conditional probability computation
is not correct, however: you need to divide by (1 - P1).]

Running P-1 with limits B1, B2 is the same as running a single elliptic curve
[although the computations are simpler and hence faster]. (One does need to adjust
the size of the candidate by log(exponent), because P-1 is always divisible by
the exponent.)

Running a second elliptic curve with the same limits will double the probability of
success, since each curve is an independent trial: two curves succeed with probability
1 - (1-p)^2, which is about 2p for small p. Increasing B1, B2 for P-1 does not double
the probability of success unless B1, B2 are [b]greatly[/b] increased.

You are comparing running P-1 with B1, B2 against running another ECM
curve with much SMALLER limits. And running ECM gives multiple independent
chances. P-1 does not.

Read my joint paper with Sam Wagstaff: [i]A Practical Analysis of the Elliptic Curve Factoring Algorithm[/i].

VBCurtis 2019-11-15 18:20

[QUOTE=R.D. Silverman;530678]You are comparing the wrong things.

You are comparing running P-1 with B1, B2 against running another ECM
curve with much SMALLER limits. And running ECM gives multiple independent
chances. P-1 does not.[/QUOTE]

Could you elaborate, specifically about which bounds for ECM will yield a better expected time per found factor than his P-1 work? The quoted part above seems to claim his choice of ECM bounds is too small to be a fair comparison, but your case to him is that he is wasting time doing P-1 when he should be doing ECM instead. What size ECM should he do that is more efficient than P-1?

Masser is comparing the rate of found factors by P-1 (at large bounds as stated) to the rate by ECM (at small bounds). If that's the wrong thing to compare, what is the right thing?

R.D. Silverman 2019-11-15 23:38

[QUOTE=VBCurtis;530683]Could you elaborate, specifically about which bounds for ECM will yield a better expected time per found factor than his P-1 work? The quoted part above seems to claim his choice of ECM bounds is too small to be a fair comparison, but your case to him is that he is wasting time doing P-1 when he should be doing ECM instead. What size ECM should he do that is more efficient than P-1?

Masser is comparing the rate of found factors by P-1 (at large bounds as stated) to the rate by ECM (at small bounds). If that's the wrong thing to compare, what is the right thing?[/QUOTE]

The data is all in my paper. Consider the following: P-1 to limits B1, B2 has been run.

Suppose we have the choice of (say) running P-1 again with limits kB1, kB2 [for some k],
or running an elliptic curve with limits B1, B2.

The latter choice will double the probability of success; the former will not. The former
will take [b]less time[/b], but the latter will give the greater likelihood of success. The
former will allow more numbers to be tested in a fixed amount of time.

If we choose k so that extending P-1 costs the same amount of time as running the extra
curve, the latter should be more efficient.

OTOH, if one places a BOUND on the time to be spent, i.e. one wants to spend
time T either running more curves or extending B1, B2 for P-1, it may be more
effective to extend P-1, because (as already noted) one must lower B1, B2 for the
elliptic curves: point addition on curves is much more expensive than simple
exponentiation.

I have not computed the effect that attempting 2^p - 1 has on P-1. The form of the
numbers guarantees that P-1 is always divisible by p (any factor q of 2^p - 1 satisfies
q ≡ 1 mod 2p). This effectively reduces the size of the numbers that "must be smooth"
by log(p), which may give a sufficient advantage to make P-1 more effective when
bounding the run time. Note that this advantage disappears if one were attempting to
factor numbers of no particular form.

If one is willing to spend an arbitrary amount of time on each candidate, it is clear
that running multiple elliptic curves will be far more effective than raising the P-1
bounds, especially as the factors get larger.

masser 2019-11-16 18:22

Thank you all for the feedback. I have downloaded the Silverman-Wagstaff paper and will read it, but I will probably need to brush up on some of the background before I fully comprehend it. Thanks again.

masser 2019-11-28 06:16

New (to me) result feedback:

Splitting composite factor 72115741492408141371057158919540730748664584042639 into:
* 57330969015562354090032601
* 1257884573219624043651239

[URL="https://www.mersenne.org/report_exponent/?exp_lo=2645329&full=1"]https://www.mersenne.org/report_exponent/?exp_lo=2645329&full=1[/URL]

:huh:

