mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Factoring (https://www.mersenneforum.org/forumdisplay.php?f=19)
-   -   Python script for search for factors of M1277 using random k-intervals (https://www.mersenneforum.org/showthread.php?t=25941)

firejuggler 2020-09-11 13:49

From M1 to M500, 83 of the M(prime) values are fully factored, not counting those that are themselves prime. 30 have 2 factors and 17 have 3 factors, so the remaining 83 - 30 - 17 = 36 have 4 or more factors.

Uncwilly 2020-09-11 14:04

Ok. Based on the ECM data, we are looking at 5 or fewer factors for M[M]1277[/M].

storm5510 2020-09-23 02:04

[URL="https://www.mersenne.org/report_exponent/?exp_lo=1277&exp_hi=&full=1"]M1277[/URL] is 385 decimal digits long. If it were "smooth", as some here like to say, this might have been done and over with a long time ago. Four LL tests say it is composite. A P-1 test back in 2017 used a B1 bound of 5 trillion + 3 and a B2 of 400 trillion + 241.

One individual I am aware of spent months running Stage 1 ECM curves with [I]Prime95[/I] and Stage 2 with [I]GMP-ECM[/I]. Unless somebody could coax Google into trying this on their quantum supercomputer, M1277 will likely remain an enigma for quite some time to come. It must have one or more really large factors which we do not have the tech to reach currently.

VBCurtis 2020-09-23 03:40

[QUOTE=storm5510;557611]It must have one or more really large factors which we do not have the tech to reach currently.[/QUOTE]

It has little to do with tech, lots to do with patience and willingness to pay the power bill.

There is still plenty of ECM to do on this number before it's "ready" for SNFS. Anyone can fire up curves at, say, B1 = 6e9 or bigger and have a go, and few of us would be surprised if someone found a factor in just that way. If another quarter million or so such curves (I didn't actually calculate how many) fail to find a factor, then we head off to SNFS when someone feels like starting it.

We have the tech to do SNFS on it right now, but not the patience. It would take a cluster to solve the matrix, but those exist too. The sieving is a really long task, which is why nobody has bothered to try (and also why more ECM is worth the effort), but CADO can do it.

So, no, it's not true that we don't have the tech.
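For anyone tempted to fire up such curves, a minimal GMP-ECM invocation might look like the following (a sketch only: the curve count is illustrative, B1 is given in GMP-ECM's scientific notation, and B2 is left to GMP-ECM's default for that B1):

[CODE]echo "2^1277-1" | ecm -c 10 6e9[/CODE]

Each curve takes a long time at this size; [c]-c 10[/c] just runs ten curves back to back on the same input.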

storm5510 2020-09-23 13:55

[QUOTE=VBCurtis;557614][COLOR=Gray]It has little to do with tech, lots to do with patience and willingness to pay the power bill.

There is still plenty of ECM to do on this number before it's "ready" for SNFS.[/COLOR] Anyone can fire up curves at, say, B1 = 6e9 or bigger and have a go, and few of us would be surprised if someone found a factor in just that way. [COLOR=Gray]If another quarter million or so such curves (I didn't actually calculate how many) fail to find a factor, then we head off to SNFS when someone feels like starting it.[/COLOR]

[COLOR=gray]We have the tech to do SNFS on it right now, but not the patience. It would take a cluster to solve the matrix, but those exist too. The sieving is a really long task, which is why nobody has bothered to try (and also why more ECM is worth the effort), but CADO can do it.

[/COLOR][COLOR=gray]So, no, it's not true that we don't have the tech.[/COLOR][/QUOTE]


6,000,000,000 for B1. I believe the rule of thumb is B2 = B1 × 100. Of course, that is not set in stone. A person could go higher if they choose.

I have an older machine sitting in a corner that I could do this with. It is not fast or elegant, but it gets the job done. It still has the 29.x version of [I]Prime95[/I] on it. I could let this run for months, even years. I will give this a go. The only requirement is to not let it run out of work. :smile:
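Under that rule of thumb, B1 = 6e9 would give B2 = 6e11. Assuming Prime95's ECM2 worktodo format (fields k, b, n, c, B1, B2, curves to run), a single-curve work line for M1277 might look like this (bounds illustrative):

[CODE]ECM2=1,2,1277,-1,6000000000,600000000000,1[/CODE]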

VBCurtis 2020-09-23 15:02

Actually, I assumed GMP-ECM for the runs, as it is dramatically faster than P95 for this very small (by GIMPS standards) number. You'll find that B2 is much, much more than 100 × B1 when using GMP-ECM. Memory use is also higher, though.

There's another M1277 thread where the procedure for using P95 for stage 1 and GMP-ECM for stage 2 is laid out; that's the fastest way for this number. If you're doing more than a handful of curves, I strongly suggest you use GMP-ECM (Windows or Linux) for stage 2.

I think I ran 500 curves in just this way a couple years back; I left an old Core 2 Quad on it for a few months.

storm5510 2020-09-23 15:27

[QUOTE=VBCurtis;557648]Actually, I assumed GMP-ECM for the runs, as it is dramatically faster than P95 for this very small (by GIMPS standards) number. You'll find that B2 is much, much more than 100 × B1 when using GMP-ECM. Memory use is also higher, though.

There's another M1277 thread where the procedure for using P95 for stage 1 and GMP-ECM for stage 2 is laid out; that's the fastest way for this number. If you're doing more than a handful of curves, I strongly suggest you use GMP-ECM (Windows or Linux) for stage 2.

I think I ran 500 curves in just this way a couple years back; I left an old Core 2 Quad on it for a few months.[/QUOTE]

In my case, it is a 3 GHz Core 2 Duo. I am familiar with the GMP-ECM procedure. A line in [I]prime.txt[/I] needs to read [C]GmpEcmHook=1[/C] to generate the files needed for stage 2. This old machine only has 4 GB of RAM; I am not sure if that would be enough. I would simply have to try it and see how it behaves. I would not necessarily have to run GMP-ECM there. I could do it on another machine with a lot more RAM. I probably have the entire setup in an archive here somewhere.
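For reference, a sketch of that handoff, assuming Prime95 (with [C]GmpEcmHook=1[/C] set) saves the stage 1 residues in GMP-ECM's resume format to its results file; the filename and bound here are illustrative:

[CODE]ecm -resume results.txt 6e9[/CODE]

GMP-ECM then skips stage 1 for each resumed residue and runs only stage 2, with its default B2 for that B1. For a stage-1-only Prime95 run, B2 is typically set equal to B1 in the worktodo line so Prime95 does not start stage 2 itself.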

This will give me something to do and to think about. I hit the big 65 in 13 days. [U]Thank you for the feedback[/U]. :smile:

kruoli 2020-09-23 15:34

If memory is not sufficient, you could try [c]ecm -maxmem 3072[/c]. That will limit it to 3 GB of RAM usage.

For me, GMP-ECM reports [c]Estimated memory usage: 8.08GB[/c] with B2 = 51,985,969,455,438 (the ECM default at B1 = 2e9). With [c]ecm -maxmem 3072[/c] it says [c]Estimated memory usage: 1.93GB[/c], so this should be totally fine for that system.

kruoli 2020-09-23 15:42

Sorry, I went for the wrong B1.

For me, GMP-ECM reports [c]Estimated memory usage: 16.50GB[/c] with B2 = 262,752,699,834,252 (the ECM default at B1 = [B]6e9[/B]). With [c]ecm -maxmem 3072[/c] it says [c]Estimated memory usage: 1.93GB[/c], so this should be totally fine for that system.
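Those estimates come from GMP-ECM's verbose output. Putting the pieces together, a capped-memory run might look like this (a sketch; curve count and bound illustrative):

[CODE]echo "2^1277-1" | ecm -maxmem 3072 -v -c 1 6e9[/CODE]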

storm5510 2020-09-23 17:49

[QUOTE=kruoli;557653]Sorry, I went for the wrong B1.

For me, GMP-ECM reports [c]Estimated memory usage: 16.50GB[/c] with B2 = 262,752,699,834,252 (the ECM default at B1 = [B]6e9[/B]). With [c]ecm -maxmem 3072[/c] it says [c]Estimated memory usage: 1.93GB[/c], so this should be totally fine for that system.[/QUOTE]

I needed to go back and look through the parameters again anyway. I was unaware RAM usage could be restricted. Thanks!

14 hours for each stage 1 curve on that machine with B1 = 6e9. I am loading it in groups of 10, a single curve in each work line, so about 5.8 days for the group. Then I will go to GMP-ECM. Once finished, repeat the process.

kruoli 2020-09-23 20:35

Since you have a dual-core CPU, you should be able to increase efficiency by doing stage 1 of a set [$]A[/$] and stage 2 of a set [$]B[/$] in parallel, if you like! :smile:

Prime95's parallelization is not efficient at all at such tiny FFTs. The OpenMP functionality of GMP-ECM currently only works in stage 2 (I'd like to be corrected if I'm wrong), and even there it only helps in certain sub-steps.
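As a sketch of that pipelining on two cores (filenames and bounds illustrative, and assuming Prime95 is configured to use a single worker), one arrangement is to let Prime95 chew through stage 1 of set [$]B[/$] via worktodo.txt while GMP-ECM finishes stage 2 on the residues already saved for set [$]A[/$]:

[CODE]ecm -maxmem 3072 -resume setA_residues.txt 6e9[/CODE]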

