[QUOTE=LaurV;560498]Unless you consider yourself extremely lucky guy (like we Romanians would say, you stepped on a shit or put your hand in it, or one shit fell on your head, or [URL="https://www.premierhotels.co.za/7_goodlucksuperstitions/"]something[/URL]).[/QUOTE]
We Canucks say: "Stepped in a pile of :poop: and came out smelling like a Rose" 
[QUOTE=LaurV;560561]I downloaded it and deleted it without launch (why the exe only, and not the sources?  rhetoric question, no need answer).[/QUOTE]
One word: [U]Windows[/U]. There appears to be a Linux variant [URL="https://download.mersenne.ca/factor5/factorj_1.01.zip"]here[/URL]. After some pondering, I remembered Luigi wrote this for [B]James Heinrich[/B] for use in his TF>1000M project. [I]mfaktc[/I] is limited to exponents no larger than 2^32-1. James wanted something to filter out composites beyond that. I believe his target was somewhere around 10 billion. 
[QUOTE=storm5510;560554]I believe M1277 has been trial-factored to 2^67. I have both the old and the new [I]factor5[/I] here somewhere. The new one doesn't work in terms of 2^x; it uses [I]k[/I] only. Somewhere in the notebook I keep, there is a conversion formula I scratched down a long time ago. Someone here presented the formula. Who or when, I cannot remember.
After doing a little searching in my notes, I believe I found it. It looks like so: [B]k = 2[SUP]x[/SUP] / 2p[/B]. The x would be replaced with 67, then 68. 2p is 2*1277. If this is correct, the lower [I]k[/I] would be 57,781,500,622,426,160, and the upper is 115,563,001,244,852,320. I do not believe it multithreads, so it could take decades, if not centuries, to run. I found the archive. It is attached below if anyone wants to mess with it. It produces a results file, but no interim screen output. [U]Luigi Morelli[/U] wrote this in 2018. I believe many here know who Luigi is.[/QUOTE] Nah, no decades, and certainly not centuries. The range 67-68 bits for M1277 represents 23407 GHz-days, which, if mfaktc supported exponents this small, could be done in under a week on a 2080 Ti (possibly; it depends on what the throughput would be). Unless we are talking specifically about this program, then maybe. 
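As a sanity check on the quoted conversion, a few lines of Python (a quick sketch, not from the thread) reproduce those k bounds:

```python
# Candidate factors of M(p) have the form q = 2*k*p + 1, so a trial-factoring
# depth of x bits corresponds to k up to roughly 2^x / (2*p).
def bits_to_k(x: int, p: int) -> int:
    return (1 << x) // (2 * p)

p = 1277
print(bits_to_k(67, p))  # 57781500622426160  (the lower bound quoted above)
print(bits_to_k(68, p))  # 115563001244852320 (the upper bound quoted above)
```

Both values agree with the figures in the quoted post, so the scratched-down formula checks out.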
It really doesn't matter how long it takes any program to get to 68 or 69 bits, when ECM has ruled out factors below ~67-68 DIGITS.
LaurV is just trolling. 
[QUOTE=VBCurtis;560605]... when ECM has ruled out factors below ~67-68 DIGITS.
[/QUOTE] Do we know a specific biggest factor digit-size that is for sure ruled out? Or at least one with only a 1-in-too-much-to-matter probability that there is an unnoticed factor? 
[QUOTE=Viliam Furik;560608]Do we know a specific biggest factor digit-size that is for sure ruled out? Or at least one with only a 1-in-too-much-to-matter probability that there is an unnoticed factor?[/QUOTE]
It's ECM; that number is always fuzzy. When a T-level has been completed, there is a 1/e chance of missing a factor of that size (if one exists). So, when T65 is done, a 65-digit factor is missed 1/e of the time. By the time 2*T65 is complete, the chance of a missed 65-digit factor is (1/e)^2. We have done far more than T65 on M1277; I don't have the exact count, but I imagine somewhere around half a T70 = 3*T65. So, a 65-digit factor or smaller can be ruled out with something like 1-(1/e)^3 certainty, and a 67-digit factor is unlikely. Ryan Propper doesn't always report his ECM work, so I would not be surprised to learn a full T70 or more has been completed. Similarly, I would be quite surprised if a factor below 69 digits turns up for this number. 
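The certainty arithmetic above can be sketched numerically (a quick illustration of the standard 1/e-per-T-level heuristic, not a thread-supplied calculation):

```python
import math

# Heuristic: each completed T-level at a given digit size leaves an
# independent 1/e chance of missing a factor of that size, so after
# t levels of work the miss probability is (1/e)^t = exp(-t).
def miss_probability(t_levels: float) -> float:
    return math.exp(-t_levels)

print(miss_probability(1))  # ~0.368 after one T65
print(miss_probability(3))  # ~0.0498 after 3*T65 (about half a T70)
```

With three T65-equivalents of work done, a surviving 65-digit factor has slipped through with under 5% probability, which is the ~1-(1/e)^3 certainty quoted above.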
M1277 is 385 decimal digits long. It may have a factor less than 100 digits in length, then again, it may not. Anyone could stab at it with ECM for a very long time and find nothing. Someone here made mention of using SNFS. [I]YAFU[/I] does SNFS, but it may not be capable of handling an input this large. So, for now, we wait.

[QUOTE=storm5510;560628] So, for now, we wait.[/QUOTE]
Wait for what? We haven't done nearly enough ECM to justify the time SNFS would take. You can wait for others to do the ECM, or you can contribute if you wish. I've done both myself: I ran a couple of CPU-years of ECM, and have waited since. I have a large-memory machine available now, so perhaps I'll restart a little large-bound ECM and contribute more than just posts. 
[QUOTE=VBCurtis;560605]LaurV is just trolling.[/QUOTE]
I am not. hihi. :w00t: I said in every post, sometimes twice, that TF is not indicated/recommended/wanted. I just did a comparison run between factor5 (a compiled, reasonably optimized toy that could, in theory, factor this exponent) and the OP's script (isn't this what he requested?). 
[QUOTE=VBCurtis;560649]I've done both, myself[/QUOTE]
Me too, for M[M]1061[/M]. When the SNFS factors came out, I saw how futile my ECM and "tremendously high P-1" were (I don't remember the exact B1/B2 values, but they should be in the PrimeNet DB). So, now I am cured of considering myself lucky (well... in this domain :smile:). Probabilistically, the smallest factor is somewhere at 140+ digits, sooooo... 
[QUOTE=VBCurtis;560649]Wait for what?
We haven't done nearly enough ECM to justify the time SNFS would take. You can wait for others to do the ECM, or you can contribute if you wish. I've done both myself: I ran a couple of CPU-years of ECM, and have waited since. I have a large-memory machine available now, so perhaps I'll restart a little large-bound ECM and contribute more than just posts.[/QUOTE]You've got to ask yourself one question. Do I feel lucky? Well, do ya, punk? 