mersenneforum.org thread: Python script to search for factors of M1277 using random k-intervals

2020-09-11, 13:49   #12
firejuggler

Apr 2010
Over the rainbow

2·1,217 Posts

From M1 to M500, 83 M(prime) are fully factored, not counting the Mersenne primes themselves. Of those, 30 have 2 factors and 17 have 3 factors, so 36 have 4 or more factors.
2020-09-11, 14:04   #13
Uncwilly

Aug 2003

8,737 Posts

OK. Based upon the ECM data, we are looking at 5 or fewer factors for M1277.
2020-09-23, 02:04   #14
storm5510
Random Account

Aug 2009
U.S.A.

2^2·3^4·5 Posts

M1277 is 385 decimal digits long, if memory serves. If it were "smooth," as some here like to say, this might have been done and over with a long time ago. Four LL tests say it is composite. A P-1 test back in 2017 used a B1 bound of 5 trillion + 3; the B2 was 400 trillion + 241. One individual I am aware of spent months running stage 1 ECM with Prime95 and stage 2 with GMP-ECM. Unless somebody could coax Google into trying this on their quantum computer, M1277 will likely remain an enigma for quite some time to come. It must have one or more really large factors which we do not have the tech to reach currently.
2020-09-23, 03:40   #15
VBCurtis

"Curtis"
Feb 2005
Riverside, CA

104528 Posts

Quote:
 Originally Posted by storm5510: It must have one or more really large factors which we do not have the tech to reach currently.
It has little to do with tech, lots to do with patience and willingness to pay the power bill.

There is still plenty of ECM to do on this number before it's "ready" for SNFS. Anyone can fire up curves at, say, B1 = 6e9 or bigger and have a go, and few of us would be surprised if someone found a factor in just that way. If another quarter-million or so such curves (I didn't actually calculate how many) fail to find a factor, then we head off to SNFS when someone feels like starting it.

We have the tech to do SNFS on it right now, but not the patience. It would take a cluster to solve the matrix, but those exist too. The sieving is a really long task, which is why nobody has bothered to try (and also why more ECM is worth the effort), but CADO can do it.

So, no, it's not true that we don't have the tech.

2020-09-23, 13:55   #16
storm5510
Random Account

Aug 2009
U.S.A.

31248 Posts

Quote:
 Originally Posted by VBCurtis: It has little to do with tech, lots to do with patience and willingness to pay the power bill. [...] Anyone can fire up curves at, say, B1 = 6e9 or bigger and have a go [...] So, no, it's not true that we don't have the tech.

6,000,000,000 for B1. I believe the rule of thumb is B2 = B1 * 100. Of course, that is not set in stone; a person could go higher if they choose.
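As a quick check of that rule of thumb (the 100× multiplier is the poster's heuristic, not a GMP-ECM default):

```python
# Rule-of-thumb stage-2 bound from the post: B2 = 100 * B1.
b1 = 6_000_000_000        # B1 = 6e9, as proposed above
b2 = 100 * b1             # heuristic multiplier, not a GMP-ECM default
print(b2)                 # 600000000000, i.e. B2 = 6e11
```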

I have an older machine sitting in a corner that I could use for this. It is not fast or elegant, but it gets the job done. It still has the 29.x version of Prime95 on it. I could let this run for months, even years. I will give this a go. The only requirement is to not let it run out of work.

2020-09-23, 15:02   #17
VBCurtis

"Curtis"
Feb 2005
Riverside, CA

104528 Posts

Actually, I assumed GMP-ECM for the runs, as it is dramatically faster than Prime95 for this very small (by GIMPS standards) number. You'll find that B2 is much, much more than 100 * B1 when using GMP-ECM; memory use is also higher, though. There's another M1277 thread where the procedure for using Prime95 for stage 1 and GMP-ECM for stage 2 is laid out; that's the fastest way for this number. If you're doing more than a handful of curves, I strongly suggest you use GMP-ECM (Windows or Linux) for stage 2. I think I ran 500 curves in just this way a couple of years back; I left an old Core 2 Quad on it for a few months.
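The hybrid procedure is laid out in the other thread; as a rough sketch only (the worktodo line and file names here are my assumptions, not the canonical recipe):

```shell
# Hypothetical sketch of the Prime95 stage 1 + GMP-ECM stage 2 workflow.
# File names and the worktodo line are assumptions; see the dedicated
# M1277 thread for the exact procedure.

# prime.txt:     GmpEcmHook=1
#                (makes Prime95 write stage-1 residues in GMP-ECM
#                 resume format)
# worktodo.txt:  ECM2=1,2,1277,-1,6000000000,6000000000,1
#                (setting B2 = B1 tells Prime95 to stop after stage 1)

# Then hand the saved residues to GMP-ECM for stage 2:
ecm -resume results.txt 6e9
```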
2020-09-23, 15:27   #18
storm5510
Random Account

Aug 2009
U.S.A.

2^2×3^4×5 Posts

Quote:
 Originally Posted by VBCurtis: [...] There's another M1277 thread where the procedure for using P95 for stage 1 and GMP-ECM for stage 2 is laid out; that's the fastest way for this number. If you're doing more than a handful of curves, I strongly suggest you use GMP-ECM (windows or linux) for stage 2. [...]
In my case, it is a 3 GHz Core 2 Duo. I am familiar with the GMP-ECM procedure: a line in prime.txt needs to be GmpEcmHook=1 to generate the files needed for stage 2. This old machine only has 4 GB of RAM; I am not sure that would be enough. I would simply have to try it and see how it behaves. I would not necessarily have to run GMP-ECM there; I could do it on another machine with a lot more RAM. I probably have the entire setup in an archive here somewhere.

This will give me something to do and to think about. I hit the big 65 in 13 days. Thank you for the feedback.

2020-09-23, 15:34   #19
kruoli

"Oliver"
Sep 2017
Porta Westfalica, DE

3^2·37 Posts

If memory is not sufficient, you could try ecm -maxmem 3072; that will limit it to 3 GB of RAM usage. For me, GMP-ECM reports "Estimated memory usage: 8.08GB" with B2 = 51,985,969,455,438 (the ECM default at B1 = 2e9). With ecm -maxmem 3072 it says "Estimated memory usage: 1.93GB", so this should be totally fine for that system.
2020-09-23, 15:42   #20
kruoli

"Oliver"
Sep 2017
Porta Westfalica, DE

3^2×37 Posts

Sorry, I went for the wrong B1. For me, GMP-ECM reports "Estimated memory usage: 16.50GB" with B2 = 262,752,699,834,252 (the ECM default at B1 = 6e9). With ecm -maxmem 3072 it says "Estimated memory usage: 1.93GB", so this should be totally fine for that system.
2020-09-23, 17:49   #21
storm5510
Random Account

Aug 2009
U.S.A.

2^2·3^4·5 Posts

Quote:
 Originally Posted by kruoli: Sorry, I went for the wrong B1. [...] For ecm -maxmem 3072 it says Estimated memory usage: 1.93GB, so this should be totally fine for that system.
I needed to go back and look through the parameters again anyway. I was unaware RAM usage could be restricted. Thanks!

14 hours for each stage 1 curve on that machine with B1 = 6e9. I am loading it in groups of 10, with a single curve in each work line, so 5.8 days for the group. Then I will go to GMP-ECM. Once finished, repeat the process.
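The batch arithmetic above works out as follows:

```python
# Timing check for the stage-1 batch described above.
hours_per_curve = 14          # observed stage-1 time at B1 = 6e9
curves_per_batch = 10         # one curve per work line
total_hours = hours_per_curve * curves_per_batch
print(round(total_hours / 24, 2))   # 5.83 days per batch
```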

2020-09-23, 20:35   #22
kruoli

"Oliver"
Sep 2017
Porta Westfalica, DE

333 Posts

Since you have a dual-core CPU, you should be able to increase efficiency by doing stage 1 of a set A and stage 2 of a set B in parallel, if you like! Prime95's parallelization is not efficient at all at those tiny FFTs. The OpenMP functionality of GMP-ECM currently only works in stage 2 (correct me if I'm wrong), and even there it only helps in certain sub-steps.
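A minimal sketch of that overlap, assuming mprime/Prime95 for stage 1 and GMP-ECM for stage 2 (the batch file names are hypothetical):

```shell
# Hypothetical overlap of the two stages on a dual-core machine.
# Batch file names are made up for illustration.

# Core 1: Prime95/mprime grinds stage 1 of batch B in the background.
./mprime -d > stage1_batchB.log 2>&1 &

# Core 2: GMP-ECM runs stage 2 on the residues saved from batch A,
# capped at 3 GB of RAM as suggested earlier in the thread.
ecm -maxmem 3072 -resume results_batchA.txt 6e9

wait    # let the stage-1 run finish before queueing the next batch
```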

