#45
Dec 2004
13×23 Posts

Quote:
I suggest you try to stick with that recommendation and use the following technique to circumvent the memory limitation. You can set up the B1/B2 bounds as Bob suggested, so that stage 1 takes slightly more time than stage 2, say 2%. Using the command line, save the residue from stage 1 and start a second stage 1 curve on the first processor. With the second processor, run the stage 2 curve on the stage 1 residue. After several days you may build up a few extra stage 1 residues, but I'm sure you can deal with that. I would think that most 4-processor machines have at least 4 GB of memory, so curves taking more than 2 GB??? Those must be some high bounds...
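A minimal sketch of that two-process split, assuming GMP-ECM's -save/-resume options (the resume file name, the B1 value, and $N are placeholders; check your version's README for the exact syntax):

Code:
# CPU 1: stage 1 only (giving a B2 below B1 skips stage 2); the stage-1
# residue is appended to a resume file, and the next curve can start at once.
# $N stands for the number being factored (placeholder).
echo "$N" | ecm -save stage1.resume 11e6 1

# CPU 2: run stage 2 on a saved stage-1 residue, with ecm's default B2.
ecm -resume stage1.resume 11e6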
#46
Jul 2004
Potsdam, Germany
3·277 Posts
I tried the mingw version of gwnum - and failed. Unfortunately, I just realized that the output got corrupted. I hope this piece of it is enough to help; I currently don't have access to a build system:
Quote:
#47
Jul 2004
Potsdam, Germany
1477₈ Posts
Compiling gwnum also did not work:
Quote:
#48
P90 years forever!
Aug 2002
Yeehaw, FL
1FD7₁₆ Posts

Quote:
#49
"Bo Chen"
Oct 2005
Wuhan, China
267₈ Posts
I want to know why the default B2 always changes when a new version comes out, and how many curves should be run with this B1 and the default B2.

Some test data:

Code:
GMP-ECM 6.0.1 [powered by GMP 4.1.4] [ECM]
Input number is 162385812809900583261295597372983559948698484946321658540735656202018003505845104737134070357362269119992033891923896707742660065148483572244656969761256208962241824944638737217461295395400432531624931235045560966776497496089919 (228 digits)
Using B1=11000000, B2=25577181640, polynomial Dickson(12), sigma=894268038
Step 1 took 556908ms
Step 2 took 194815ms

BotXXX P4 - GMP-ECM 6.1-beta2 [powered by GMP 4.2] [ECM]
Input number is (10^239-1)/9/479/142847911 (228 digits)
Using B1=11000000, B2=35133391030, polynomial Dickson(12), sigma=2918134401
Step 1 took 492485ms
Step 2 took 204422ms
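As for the expected number of curves: GMP-ECM will estimate it for you in verbose mode, e.g. (a sketch; the -v output format differs between versions):

Code:
# -v prints, among other diagnostics, a table with the expected number of
# curves needed to find factors of various sizes at the given bounds.
echo "(10^239-1)/9/479/142847911" | ecm -v 11e6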
#50
Dec 2004
299₁₀ Posts
Well, your result looks pretty good nonetheless: stage 1 is about 12% faster, with a B2 increase of about 37%, for pretty much the same total time.
#51
Jul 2004
Potsdam, Germany
3×277 Posts
Quote:

Quote:
#52
"Bob Silverman"
Nov 2003
North of Boston
2×3³×139 Posts

Quote:
There is a point I have been trying to get across. The response surface (for probability of success as a function of B1, B2, #curves) is VERY shallow in the neighborhood of the optimum. It DOES NOT REALLY MATTER if one varies B2 from the 'true' optimum.

One can spend CPU time by increasing B2. Or one can reduce B2 and run more curves in the same amount of time. The probability of overall success is relatively INSENSITIVE to this choice [as long as you are not too far off in the selection of B1].

Select B1. Determine how much time you want to spend. Period. Don't worry about B2.

My paper with Sam Wagstaff, "A Practical Analysis of the Elliptic Curve Factoring Algorithm", should be mandatory reading for anyone who wants to worry about how to select the parameters.
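To make the trade-off explicit (my own sketch, using the thread's notation): let p(B1, B2) be the per-curve probability of success and t(B1, B2) the per-curve time. A fixed time budget T buys N = T / t(B1, B2) curves, so the overall probability of success is

1 - (1 - p(B1, B2))^N ≈ 1 - exp(-T · p(B1, B2) / t(B1, B2)).

For fixed T this depends on B2 only through the ratio p/t, and the flatness of the response surface says that this ratio barely moves as B2 varies around its optimum.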
#53
Aug 2002
3×5²×7 Posts

Quote:
http://www.mersenneforum.org/showthr...urve#post77078 |
#54
Jul 2004
Potsdam, Germany
3·277 Posts

Quote:
We could promote the mersennewiki and enhance its "Choosing the best parameters for ECM" section. It is important to remind oneself that the parameter choices for ECM are not as critical as they are for e.g. NFS.

On the other hand, I found non-negligible speedups, especially for base-2 factorizations, when stage 1 is performed with prime95/mprime. For B1=11e7 (which is a rather extreme case), the optimal B2 parameter gives me a performance increase of over 30%. For B1=250K, it went down to ~7.5%. Using only gmp-ecm, the improvement is much smaller. In my opinion, this only makes sense for "bigger" factorizations (e.g. B1 >= 11M), where even a speedup of a few percent adds up to hours or even days.
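For reference, a sketch of that prime95-stage-1 / gmp-ecm-stage-2 workflow (the GmpEcmHook option name is from Prime95's undoc.txt as I remember it; treat it as an assumption and verify it for your version):

Code:
# prime.txt (Prime95/mprime): write ECM stage-1 residues in a format that
# gmp-ecm can resume; they end up in results.txt. Option name per undoc.txt
# (an assumption - verify for your version):
#   GmpEcmHook=1
#
# Then run stage 2 over those residues with gmp-ecm:
ecm -resume results.txt 11e7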
#55
"Bob Silverman"
Nov 2003
North of Boston
2×3³×139 Posts

Quote:
This is gibberish at worst and misleading at best. The term "performance increase" is not what is being measured here.

When you say that for B1 = 110M the optimal B2 improves "performance" by 30%, do you mean that your probability of finding a 55-digit factor increased by 30% over choosing a smaller B2 but with more curves [i.e. with the time you spend held constant]? 30% is much too large. The response surface is too flat. I do not believe this number. Where did you get it? I doubt this is what you mean. We have not FOUND enough 55-digit factors to even MAKE this kind of measurement.

You say that a speedup "of some percent adds up to hours or even days." This is MEANINGLESS because:

(1) You certainly do not have enough data to be able to say that an optimally chosen B2 will save "days" in finding a p55. How many p55's have you found? The time it takes to find even one will be a random variable with fairly high variance. It will take a lot of successes to determine a difference between the mean times for two different B2 values. The entire effort of the entire world has not collected enough data.

(2) What does "days" mean? How many machines, etc.? You are being very imprecise in your terminology, and it will be highly misleading to the naive user.

Certainly changing B2 will affect the run time for each curve. But for a fixed amount of time, reducing B2 allows more curves to be run. Let N be the number of curves you run with optimal B2 and let N' be the number of curves you run with sub-optimal B2'. Let P(B1, B2) be the probability of success *per curve*. Then the respective overall probabilities will be 1 - (1 - P(B1, B2))^N and 1 - (1 - P(B1, B2'))^N'.

If you spend the same amount of time with each set of parameters, the ratio of these probabilities will be very close to 1 if B1 is chosen appropriately. They will certainly NOT differ by 30%, and you can't possibly have collected enough data to justify your claim of 30%. How many 55-digit factors have you found?
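To put purely hypothetical numbers on this argument (illustrative, not measured data): suppose the optimal B2 gives P(B1, B2) = 1/1000 per curve, while a cheaper B2' gives P(B1, B2') = 0.92/1000 per curve but allows 10% more curves in the same time (N' = 1.1 N). With N = 1000:

1 - (1 - 0.001)^1000 ≈ 1 - e^-1 ≈ 0.632
1 - (1 - 0.00092)^1100 ≈ 1 - e^-1.012 ≈ 0.636

The overall probabilities differ by well under 1%, nowhere near 30%.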