It says "couldn't open stx for reading" for me when I redirect output to a text file and open it in Notepad++.
[QUOTE=kotenok2000;610641]It says "couldn't open stx for reading" for me when I redirect output to a text file and open it in Notepad++.[/QUOTE]Not in my case: the redirected output is indeed an empty-string filename between two spaces.
This is in 2.07 so may have already been fixed, but I noticed something odd this morning:[quote]Starting factorization of 120429365390603351499490655344463033901953027604028644082960952575362981787169556144813176219675060180597674664423811465923470403
input indicated to have been pretested to t43.01
current ECM pretesting depth: 43.01
scheduled 2178 curves at [color=red]B1=0[/color] toward target pretesting depth of 43.01[/quote]
[QUOTE=James Heinrich;610686]This is in 2.07 so may have already been fixed, but I noticed something odd this morning:[/QUOTE]
That's a rounding problem: try telling it that the input has been pretested to t43.02 or t44.
[QUOTE=BudgieJane;610716]That's a rounding problem: try telling it that the input has been pretested to t43.02 or t44[/QUOTE]It might be a rounding issue if it just wants to run some more curves. My issue is with the curves being run at [c]B1=0[/c].
[QUOTE=bsquared;605289]I'm liking this quite a bit, hopefully it can be useful![/QUOTE]Since the default YAFU [c]factor.log[/c] output is both quite verbose and hard to parse, and the upcoming [c]factor.json[/c] format is compact and easy to parse, I wrote a PHP script to convert [c]factor.log[/c] to [c]factor.json[/c].
Usage: [c]php yafu-log2json.php factor.log > factor.json[/c]
Please let me know if you come across a log sample where this fails.
[code]<?php
// YAFU factor.log to JSON format converter
// written by: James Heinrich <james@mersenne.ca>
// v0.1.0-202208041146 - initial release
// v0.1.1-202208061639 - zero-width-assertions not working correctly in some cases
// usage: php yafu-log2json.php factor.log > factor.json

if ((count($_SERVER['argv']) != 2) || !preg_match('#\\.log$#i', $_SERVER['argv'][1])) {
    trigger_error('Usage: php '.basename(__FILE__).' factor.log'."\n", E_USER_ERROR);
} elseif (!file_exists($_SERVER['argv'][1])) {
    trigger_error($_SERVER['argv'][1].' does not exist'."\n", E_USER_ERROR);
}
$rawlog = file_get_contents($_SERVER['argv'][1]);

function YAFUfactorLogParse(&$rawlog) {
    // first look for nested factorizations of composites, and replace those blocks with the prime factors only
    $composites = array();
    if (preg_match_all('#^(.+), c([0-9]+) = ([0-9]+)[ \r\n]#im', $rawlog, $matchset, PREG_PATTERN_ORDER)) {
        foreach ($matchset[3] as $composite) {
            $composites[$composite] = $composite;
        }
        ksort($composites);
        foreach ($composites as $composite) {
            if (preg_match('#^([^\\n]+)Starting factorization of '.$composite.'(.+)Total factoring time = [0-9\\.]+ seconds#imsU', $rawlog, $matches)) {
                if (preg_match_all('# p(rp)?([0-9]+) = ([0-9]+)#i', $matches[2], $matches2)) {
                    list($dummy, $dummy, $dummy, $prime_factors) = $matches2;
                    if (FactorsAddUp($composite, $prime_factors)) {
                        $composites[$composite] = implode('*', $prime_factors);
                        $rawlog = str_replace($matches[0], '', $rawlog);
                        if (preg_match('#^([^\\n]+), c'.strlen($composite).' = '.$composite.'#im', $rawlog, $matches3)) {
                            $prime_replacements = array();
                            foreach ($prime_factors as $prime_factor) {
                                $prime_replacements[] = $matches3[1].', prp'.strlen($prime_factor).' = '.$prime_factor;
                            }
                            $rawlog = str_replace($matches3[0], implode(PHP_EOL, $prime_replacements), $rawlog);
                        }
                    } else {
                        trigger_error('!FactorsAddUp('.$composite.', '.implode('*', $prime_factors).')'."\n", E_USER_ERROR);
                    }
                }
            }
        }
    }
    foreach ($composites as $composite => $composite_factored) {
        if ($composite == $composite_factored) {
            trigger_error('Unfactored Composite: '.$composite."\n", E_USER_ERROR);
        }
    }

    // split out each factorization attempt, skipping any aborted attempts
    if (preg_match_all('#^([^\\n]+)Starting factorization of ([0-9]+)((?!Starting factorization of).*)Total factoring time = ([0-9\\.]+) seconds#imsU', $rawlog, $matchset, PREG_PATTERN_ORDER)) { // negative-string-match https://stackoverflow.com/questions/406230/
        unset($matchset[1], $matchset[2], $matchset[3]); // save memory
        $JSON = array();
        foreach ($matchset[0] as $matchnum => $logtext) {
            if (preg_match('#^(.+), Starting factorization of ([0-9]+)#i', $logtext, $matches)) {
                list($dummy, $timestamp_computer, $composite) = $matches;
                $datetime_start = YAFUtimestampParse($timestamp_computer);
                // div: found prime factor = 3
                // prp16 = 6490148442004687 (curve 6 stg2 B1=11000 sigma=3972849619 thread=0)
                // prp32 = 15434893707426995319538878760499
                $prime_factors = array();
                if (preg_match_all('# div: found prime factor = ([0-9]+)#i', $logtext, $matches2)) {
                    foreach ($matches2[1] as $prime_factor) {
                        $prime_factors[] = $prime_factor;
                    }
                }
                if (preg_match_all('# p(rp)?([0-9]+) = ([0-9]+)#i', $logtext, $matches2)) {
                    foreach ($matches2[3] as $prime_factor) {
                        $prime_factors[] = $prime_factor;
                    }
                }
                if (!FactorsAddUp($composite, $prime_factors)) {
                    trigger_error('!FactorsAddUp('.$composite.', '.implode('*', $prime_factors).')'."\n", E_USER_ERROR);
                }
                /*
                // https://www.mersenneforum.org/showthread.php?p=605289#post605289
                {
                    "input-expression":"factor(2^523-1)",
                    "input-decimal":"27459190640522438859927603196325572869077741200573221637577853836742172733590624208490238562645818219909185245565923432148487951998866575250296113164460228607",
                    "input-argument-string":"factor(2^523-1) -v -threads 32 -snfs_xover 70 -plan light ",
                    "factors-prime":["160188778313202118610543685368878688932828701136501444932217468039063","171417691861249198128317096534322116476165056718630345094896620367860006486977101859504089"],
                    "pm1-curves" : {"150000":1,"3750000":1},
                    "ecm-curves" : {"2000":256,"11000":256,"50000":256,"250000":256},
                    "ecm-levels" : {"t15":117.64,"t20":47.95,"t25":6.49,"t30":0.68,"t35":0.06},
                    "runtime" : {"total":1729.3813, "ecm":3.6256, "pm1":1.1629, "pp1":0.0000, "siqs":0.0000, "nfs-total":1720.8274, "nfs-poly":0.0000, "nfs-sieve":989.4496, "nfs-filter":405.3591, "nfs-la":175.4860, "nfs-sqrt":52.9520},
                    "time-start" : "2022-05-05 09:20:43",
                    "time-end" : "2022-05-05 09:49:33",
                    "info":{"compiler":"INTEL 2021","ECM-version":"7.0.4","GMP-version":"6.2.0","yafu-version":"2.08"}
                }
                */
                $data = array(
                    'input-decimal' => $composite,
                    'factors-prime' => $prime_factors,
                    'runtime'       => array('total' => $matchset[4][$matchnum]),
                    'time-start'    => $datetime_start,
                );
                if (preg_match('#^.+, Total factoring time = ([0-9\\.]+) seconds#im', $logtext, $matches2)) {
                    $data['time-end'] = YAFUtimestampParse($matches2[0]);
                }
                $elapsedTimes = array(
                    'siqs'     => 'SIQS', // SIQS elapsed time = 5.3933 seconds.
                    'nfs-sqrt' => 'Sqrt', // Sqrt elapsed time = 0.0060 seconds.
                    'nfs'      => 'NFS',  // NFS elapsed time = 54998.4025 seconds.
                    // Lanczos elapsed time = 0.6300 seconds.
                );
                foreach ($elapsedTimes as $json_key => $text_string) {
                    if (preg_match('#^.+, '.$text_string.' elapsed time = ([0-9\\.]+) seconds\\.#im', $logtext, $matches2)) {
                        $data['runtime'][$json_key] = $matches2[1];
                    }
                }
                $JSON[] = json_encode($data);
            } else {
                trigger_error('Failed to match start of factorization in logtext #'.$matchnum."\n\n".$logtext."\n", E_USER_ERROR);
            }
        }
        return implode("\n", $JSON);
    } else {
        trigger_error('Failed to preg_match_all "Starting factorization .. Total factoring time" in $rawlog ('.number_format(strlen($rawlog)).' bytes)'."\n");
    }
    return false;
}

function YAFUtimestampParse($line) {
    list($rawdate, $rawtime) = explode(' ', $line);
    return date('Y-m-d H:i:s', strtotime($rawdate.' '.$rawtime));
}

function FactorsAddUp($bignumber, $factors) {
    $composite = 1;
    foreach ($factors as $factor) {
        $composite = gmp_mul($composite, $factor);
    }
    $composite = gmp_strval($composite);
    return ($composite == $bignumber);
}

echo YAFUfactorLogParse($rawlog)."\n";[/code]
Caveat: I discovered a quirk in YAFU v1.34.5 that double-outputs prp factor lines when it discovers a "perfect power":[quote]Starting factorization of 3827960579805422062343214616570866403
****************************
prp13 = 1564308789787
prp13 = 1564308789787
prp13 = 1564308789787
input is a perfect power
prp13 = 1564308789787
prp13 = 1564308789787
prp13 = 1564308789787[/quote]This case is not handled by my script since it appears YAFU v2.x no longer does this; you'll just get a factors-don't-match error if it is encountered. (A possible workaround is sketched below.)
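If anyone does need to handle such old v1.34.5 logs, here is a minimal pre-processing sketch (hypothetical, not part of the script above) that could be applied to [c]$logtext[/c] before the factor lines are collected. It assumes the copies printed [i]after[/i] the "input is a perfect power" marker carry the correct multiplicity, as in the quoted sample, where the input is the cube of the prp13:
[code]<?php
// Hypothetical workaround for the v1.34.5 perfect-power quirk: drop the
// duplicated factor lines printed before the "input is a perfect power"
// marker, keeping the copies printed after it (three copies in the quoted
// sample, matching the input being a perfect cube).
function stripPerfectPowerDuplicates($logtext) {
    if (($pos = stripos($logtext, 'input is a perfect power')) !== false) {
        // remove whole factor lines from the pre-marker portion only
        $before  = preg_replace('#^.*\bp(rp)?[0-9]+ = [0-9]+.*$#im', '', substr($logtext, 0, $pos));
        $logtext = $before.substr($logtext, $pos);
    }
    return $logtext;
}[/code]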
v. 2.09 available
Changes/fixes since 2.08 have been compiled into a new version 2.09 windows executable.
This includes the request to print NFS filtering ETA more verbosely, and also include the ETA in the logfile.

The issue with B1=0 happens because that is apparently printed before yafu has figured out what curves to use when given an input -work level. Curves will not actually be run at B1=0. There are some other things I need to clean up when using -work, so I'm going to hold off on fixing that for now until I can address those things. In the meantime it is safe to ignore the B1=0 message.

By the way, the Visual Studio version 1922 thing is real... that is the actual value of _MSC_VER. See [URL="https://docs.microsoft.com/en-us/cpp/preprocessor/predefined-macros?view=msvc-170"]Predefined Macros[/URL]. I've upgraded to VS2022 so that is now version 1931 :smile:
[QUOTE=bsquared;611078]Changes/fixes since 2.08 have been compiled into a new version 2.09 windows executable.[/QUOTE]Thanks Ben! :cool:
Download link for those who don't want to dig: [url]https://github.com/bbuhrow/yafu/raw/master/bin/x64/Release/yafu-x64.exe[/url]

Seems to be behaving as expected so far, running it through [c]tune[/c] on my i3 right now. (Unfortunately my main PC still runs Win7, and I haven't yet found the courage to try building it on my CentOS-8 server.)
[QUOTE=James Heinrich;610639]I tried to get batchfile processing working with YAFU 2.08 and it's not playing nice for me.[/QUOTE]This is still broken for me in 2.09:[code]C:\Users\User\Desktop\factordb>yafu-x64.exe -batchfile random_composites.txt
Applying tune_info entry for WIN64 - Intel(R) Core(TM) i3-8100 CPU @ 3.60GHz
no variable indicator (@): interpreting batchfile lines as input expressions


YAFU Version 2.09
Built with Microsoft Visual Studio 1931
Using GMP-ECM 7.0.4, Powered by MPIR 3.0.0
Detected Intel(R) Core(TM) i3-8100 CPU @ 3.60GHz
Detected L1 = 32768 bytes, L2 = 6291456 bytes, CL = 64 bytes
Using 1 random witness for Rabin-Miller PRP checks
Cached 664579 primes; max prime is 9999991
Parsed yafu.ini from C:\Users\User\Desktop\factordb

===============================================================
======= Welcome to YAFU (Yet Another Factoring Utility) =======
=======             bbuhrow@gmail.com                   =======
=======     Type help at any time, or quit to quit     =======
===============================================================

>> fopen error: No such file or directory
couldn't open 5 for reading[/code]It's a [i]little[/i] more specific about what it can't open now, but I'm not sure why it's trying to open [c]5[/c]. The error message still doesn't include information about what [i]kind[/i] of file it's trying (and failing) to open -- is that the batchfile input, or something else like the ini file?
Also, I see factor.json is output now, which is fantastic. But would it be possible, at least as an ini option (although I think it should be the default), to put each JSON result on one line rather than pretty-printed? It makes parsing a whole lot easier if you can assume that one line = one result. Both forms are valid JSON, but one-object-per-line is easier to computer-parse, and easier to human-parse too when you're scanning down a long list of results and fields are at least somewhat column-aligned between results, rather than separated by lots of formatted whitespace.
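To illustrate the parsing convenience, here is a minimal sketch of a hypothetical consumer (reusing the [c]input-decimal[/c] and [c]factors-prime[/c] key names from the factor.json sample quoted earlier in the thread):
[code]<?php
// Minimal sketch: with one JSON object per line, a consumer needs no
// streaming JSON parser -- each line decodes independently.
foreach (file('factor.json', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    $result = json_decode($line, true);
    if (is_array($result)) {
        echo $result['input-decimal'].' = '.implode(' * ', $result['factors-prime'])."\n";
    }
}[/code]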
We have run again into that issue where some parts/bytes/relations of a [U]compressed[/U] rels.dat file are screwed up and cannot be extracted. Factoring the C156 from aliquot 181428, index 1032, something like 44M relations were needed, but we got suspicious because we expected this factorization to end before last weekend. We looked inside the log files and found out that every time the number of needed relations was reached, yafu (2.0) tried filtering and came out empty-handed, with "zero relations and zero ideals". It seemed it could only extract some relations from the beginning of the file, but not all of them, so it thought there were not enough relations and therefore kept "extending" the usual 5% again and again for the last 3 days. (!!)
It took us half the night with gzip and a hex editor to "fix" it. Actually, only 3 or 4 hours of handwork, because the other hour or two before that were spent searching the web for a hex editor that was not expensive, could open and edit that almost 9 GB (!!!) monster of a file, and, last but not least, had reasonable search/replace functions. We installed Hex Editor Neo, which was free and worked like a charm. The process went like this:

1. Decompress the file with gzip. (Caution! gzip deletes the input file if you don't use the -k option or so; we didn't know that, and our faces fell when we saw the 9 GB of relations Puff! disappear from the folder! Luckily, we had a copy. Never work on the original file! So we learned the gzip switches on this occasion :razz: In fact, we are boasting a bit, we were not even so clever, but gzip only wanted to decompress a "gz" file, so that forced us to make a copy first and call it "rels.gz", hehe.)
2. gzip will create a clear-text file of relations, decompressing the gz file for as long as it can, and then give a "we found some rubbish after the end of the file, which was ignored" error message. That is not rubbish! Man, that's just the [U]rest (most)[/U] of the relations! The first ~5M relations were recovered this way.
3. Copy the resulting file into another folder and zip it. This creates a zip file which you don't need, but you need its [U]size[/U].
4. Look into the rels.dat (well, now renamed rels.gz) with the hex editor around that offset, to find the rubbish that stopped gzip (and yafu) when it tried to unpack the relations. You are looking for something like "1B 00 00 1F" or so (I can find the exact string if anybody is interested) around the position where the chunk terminates, which marks the start of a new gzip chunk. These bytes can appear anywhere in the file, but they are not always the start of a new chunk; that is why you need the approximate size of the compressed relations you just decompressed, i.e. that is why you need step 3. The way gzip (as used in yafu) works, a new chunk is appended to the rels.dat file after every range of q's is searched, so your rels.dat can contain many such chunks (like 50, or a hundred, or more), and if one is damaged you need to delete it from the file; the rest can still be unpacked, and the relations are still usable. (A sketch of automating this search appears below.)
5. Delete everything from the beginning of the file to that offset, and save the file as rels1.gz.
6. Repeat from step 1 (use a copy of rels1.gz, or use the gzip switch to keep the input file) as long as you still have relations to recover, i.e. as long as gzip tells you that some "rubbish" is present at the end of the file, meaning it did not extract to the end.

We successfully recovered [U]over 75M relations[/U]!! (Only 44M were needed, but that "extending 5%" process went on for a few days.) Luckily, it only took us 3 loops, so the file was damaged in only 2 places. We recovered about 40M relations in the second loop and another ~30M in the third loop. We concatenated the files with copy /b and renamed the uncompressed result "rels.dat". We launched yafu on this file and it immediately moved to filtering and, after some more minutes, to LA, which said it would take about 2-3 hours. We let it run and went to bed; it was already past 3:00 AM. In the morning, the beast was factored, and another few terms of the sequence were solved by ECM.
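For anyone who wants to automate step 4 instead of eyeballing a hex editor, here is a minimal sketch (hypothetical, not what we actually ran; the filename and offset are placeholders). It searches for the standard RFC 1952 gzip member magic, bytes 1F 8B followed by the deflate method byte 08, near the expected offset:
[code]<?php
// Hypothetical sketch of step 4: scan a damaged multi-member gzip file for
// member headers near a known offset. Per RFC 1952, every gzip member starts
// with the magic bytes 1F 8B, almost always followed by compression method
// 08 (deflate). The same byte pattern can occur by chance inside compressed
// data, which is why the search is narrowed to the region near the size of
// the part already recovered (step 3).
$file   = 'rels.gz';         // hypothetical filename from the procedure above
$offset = 5000000000;        // approximate size of the already-recovered part
$window = 16 * 1024 * 1024;  // search 16 MB on either side of that offset

$fp = fopen($file, 'rb');
if (!$fp) {
    die('cannot open '.$file."\n");
}
fseek($fp, max(0, $offset - $window));
$chunk = fread($fp, 2 * $window);
fclose($fp);

$pos = 0;
while (($pos = strpos($chunk, "\x1f\x8b\x08", $pos)) !== false) {
    printf("possible gzip member header at file offset %d\n", max(0, $offset - $window) + $pos);
    $pos += 1;
}[/code]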
Now that computer is working at index 1038, which lost the 3 and two powers of 2 (i.e. index 1038 is 2^2*blah_blah*C145, currently in ECM), and the sequence decreased (yeah!). We still have the nfs.log if anybody is interested. So, again, [B]what was that trick to run yafu in such a way that it doesn't compress the rels.dat file?[/B] People nowadays have a lot of space, and fast SSDs. Also, we would sometimes like to add relations by hand (in case files for specific cores, i.e. rels0, rels1, etc., remain hanging), which we cannot do with a compressed file...