Quote:
Originally Posted by SethTro
Again thanks for the detailed report, why the "7 (or more)"?

I used 7 because I was able to run that many jobs on a machine with 128 GB of RAM. You'll need to calculate how much RAM a job will take (with or without -maxmem) and then figure out how many jobs you can run in parallel.
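The arithmetic is just integer division of available RAM by per-job peak usage. A minimal sketch (the 18 GB per-job figure is a made-up placeholder; measure your own jobs, e.g. with top or the -v output):

```python
# Hypothetical figures: 128 GB machine, ~18 GB peak stage-2 memory per job.
total_ram_gb = 128
ram_per_job_gb = 18

parallel_jobs = total_ram_gb // ram_per_job_gb
print(parallel_jobs)  # with these made-up numbers, 7 jobs fit
```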
Quote:
Originally Posted by SethTro

I don't really know of a write-up on the performance impact of using -maxmem. The only thing I know is that it will reduce the B2 value and increase the k value (an internal GMP-ECM variable), which will slightly decrease the odds of finding a factor and somewhat increase the stage 2 runtime. You can use the -v option to see the impact of various -maxmem settings, i.e., look for the "Expected number of curves to find a factor of n digits" table. This tells you how many curves to run at the given B1 value to have a \(1 - 1/e^x\) chance (where x = curves_run/recommended_curves) of finding a factor of the specified size (if one exists).
For example, using B1=11e6 without maxmem will look like:
Code:
Using B1=11000000, B2=35133391030, polynomial Dickson(12), sigma=1:2580904028
dF=32768, k=3, d=324870, d2=11, i0=23
Expected number of curves to find a factor of n digits:
      35       40       45       50       55       60       65
     138      788     5208    39497   336066  3167410 3.2e+007
And with -maxmem 20 it will look like:
Code:
Using B1=11000000, B2=28545931060, polynomial Dickson(12), sigma=1:3439101351
dF=8192, k=40, d=79170, d2=11, i0=128
Expected number of curves to find a factor of n digits:
      35       40       45       50       55       60       65
     142      813     5420    40914   348634  3290145 3.4e+007
And with -maxmem 40 it will look like:
Code:
Using B1=11000000, B2=28544268490, polynomial Dickson(12), sigma=1:2669738918
dF=16384, k=10, d=158340, d2=11, i0=59
Expected number of curves to find a factor of n digits:
      35       40       45       50       55       60       65
     142      813     5420    40914   348634  3290145 3.4e+007
You can see that -maxmem decreases the B2 value and increases the k value, so you get slightly lower odds of finding a factor and somewhat longer runtimes.
In this case, if you ran 10000 curves at B1=11e6 with no -maxmem, or with -maxmem 20 or 40, your odds of finding various sized factors would be:
Code:
Chance to find factor, of size d, = 1 - 1/e^x, where x = 10000/recommended_curves
d       35        40       45       50       55       60
no    ~100%  99.9996%  85.341%  22.367%   2.931%   0.315%
20    ~100%  99.9995%  84.197%  21.683%   2.827%   0.303%
40    ~100%  99.9995%  84.197%  21.683%   2.827%   0.303%
Chance to miss factor, of size d, = 1/e^x, where x = 10000/recommended_curves
d       35         40       45       50       55       60
no     ~0%  0.000308%  14.658%  77.632%  97.068%  99.684%
20     ~0%  0.000455%  15.802%  78.316%  97.172%  99.696%
40     ~0%  0.000455%  15.802%  78.316%  97.172%  99.696%
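As a sanity check, those percentages can be reproduced directly from the curve counts in the -v output. A minimal Python sketch (the recommended-curve counts are copied from the no-maxmem table above):

```python
import math

# Recommended curve counts from the no-maxmem -v output above
recommended = {35: 138, 40: 788, 45: 5208, 50: 39497, 55: 336066, 60: 3167410}
curves_run = 10000

for d, needed in sorted(recommended.items()):
    x = curves_run / needed
    p_miss = math.exp(-x)   # 1/e^x: chance all 10000 curves miss a d-digit factor
    p_find = 1 - p_miss     # 1 - 1/e^x: chance at least one curve finds it
    print(f"d={d}: find {p_find:8.3%}  miss {p_miss:8.3%}")
```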
Quote:
Originally Posted by SethTro
I'm already using your wonderful script. I did have to make a couple of modifications to make it work with python3 (I'll try and post a patch later).

Thanks. I've been patching over the years, but haven't posted an update in a while. I'll post the update, with some of your changes, here in a bit.
Quote:
Originally Posted by SethTro
Since you seem to be actively working on this, what curves would be most useful for me to work on? I'm willing to spend a 1080 GPU-month plus a couple of core-years with high memory.

"Active" is subjective. I haven't really worked on this since Nov 2017. I think it was the 1080 that could complete 9600 stage1 curves with B1=3e9 in ~30 days. My computers are busy, so you'll have to verify this on your own machine. I'd recommend running with B1=3e9 or higher.