[QUOTE=swellman;578880][QUOTE]What target_density did you specify when starting msieve? Or did you just go with the default value (TD=90)?[/QUOTE]
There appears to be plenty of relations for a job this size but who knows with the recent tsunami resulting from the pentathlon.[/QUOTE]
I initially tried with TD 110, then 100, and then your suggestion of 90, but still no luck. Please see the complete log below.
[CODE]Sun May 23 08:56:52 2021 Msieve v. 1.54 (SVN Unversioned directory)
Sun May 23 08:56:52 2021 random seeds: 28b757c0 17553340
Sun May 23 08:56:52 2021 factoring 2554492351743203553466465047163590561279014056908519070000000685201516774421702793327336915507089727958089217742945779099282127536202744456448278526479422651 (157 digits)
Sun May 23 08:56:52 2021 searching for 15-digit factors
Sun May 23 08:56:52 2021 commencing number field sieve (157-digit input)
Sun May 23 08:56:52 2021 R0: -2530893774868872527835187432854
Sun May 23 08:56:52 2021 R1: 11653837865472407
Sun May 23 08:56:52 2021 A0: -39861548823757403003701316883289116185
Sun May 23 08:56:52 2021 A1: -57577200590567430222475767684207
Sun May 23 08:56:52 2021 A2: 49851357607445266131094441
Sun May 23 08:56:52 2021 A3: -11633836605299727549
Sun May 23 08:56:52 2021 A4: -1087030755260
Sun May 23 08:56:52 2021 A5: 24600
Sun May 23 08:56:52 2021 skew 1.00, size 2.933e-015, alpha -6.955, combined = 1.490e-014 rroots = 3
Sun May 23 08:56:52 2021 
Sun May 23 08:56:52 2021 commencing relation filtering
Sun May 23 08:56:52 2021 setting target matrix density to 90.0
Sun May 23 08:56:52 2021 estimated available RAM is 32684.7 MB
Sun May 23 08:56:52 2021 commencing duplicate removal, pass 1
Sun May 23 08:57:50 2021 error -9 reading relation 8703777
Sun May 23 08:59:06 2021 error -15 reading relation 20287478
Sun May 23 08:59:06 2021 error -1 reading relation 20287479
Sun May 23 08:59:12 2021 error -15 reading relation 21183124
Sun May 23 08:59:23 2021 error -15 reading relation 22894559
Sun May 23 08:59:47 2021 error -1 reading relation 26549788
Sun May 23 09:00:00 2021 error -15 reading relation 28544066
Sun May 23 09:00:01 2021 error -5 reading relation 28689508
Sun May 23 09:00:22 2021 error -15 reading relation 31884952
Sun May 23 09:00:28 2021 error -5 reading relation 32762194
Sun May 23 09:00:36 2021 error -15 reading relation 34019853
Sun May 23 09:00:39 2021 error -9 reading relation 34464854
Sun May 23 09:00:55 2021 error -15 reading relation 36910851
Sun May 23 09:00:58 2021 error -1 reading relation 37299447
Sun May 23 09:01:01 2021 error -9 reading relation 37751586
Sun May 23 09:01:05 2021 error -9 reading relation 38346762
Sun May 23 09:01:05 2021 error -15 reading relation 38426769
Sun May 23 09:01:07 2021 error -15 reading relation 38581842
Sun May 23 09:01:18 2021 error -15 reading relation 40360219
Sun May 23 09:01:19 2021 error -15 reading relation 40438362
Sun May 23 09:01:24 2021 error -5 reading relation 41275013
Sun May 23 09:01:31 2021 error -15 reading relation 42357120
Sun May 23 09:01:40 2021 error -9 reading relation 43652773
Sun May 23 09:02:02 2021 error -9 reading relation 46922792
Sun May 23 09:02:05 2021 error -15 reading relation 47458791
Sun May 23 09:02:09 2021 error -15 reading relation 48082790
Sun May 23 09:02:10 2021 error -9 reading relation 48232176
Sun May 23 09:02:11 2021 error -9 reading relation 48300469
Sun May 23 09:02:12 2021 error -15 reading relation 48527723
Sun May 23 09:02:15 2021 error -15 reading relation 48980222
Sun May 23 09:02:18 2021 error -15 reading relation 49355006
Sun May 23 09:02:19 2021 error -9 reading relation 49628507
Sun May 23 09:02:37 2021 skipped 6 relations with composite factors
Sun May 23 09:02:37 2021 found 11080480 hash collisions in 52182521 relations
Sun May 23 09:03:02 2021 commencing duplicate removal, pass 2
Sun May 23 09:03:51 2021 found 11910833 duplicates and 40271688 unique relations
Sun May 23 09:03:51 2021 memory use: 330.4 MB
Sun May 23 09:03:51 2021 reading ideals above 720000
Sun May 23 09:03:51 2021 commencing singleton removal, initial pass
Sun May 23 09:08:02 2021 memory use: 1378.0 MB
Sun May 23 09:08:02 2021 reading all ideals from disk
Sun May 23 09:08:02 2021 memory use: 1465.2 MB
Sun May 23 09:08:04 2021 keeping 44409030 ideals with weight <= 200, target excess is 207753
Sun May 23 09:08:07 2021 commencing in-memory singleton removal
Sun May 23 09:08:09 2021 begin with 40271688 relations and 44409030 unique ideals
Sun May 23 09:08:31 2021 reduce to 17871104 relations and 18866792 ideals in 23 passes
Sun May 23 09:08:31 2021 max relations containing the same ideal: 120
Sun May 23 09:08:32 2021 filtering wants 1000000 more relations
Sun May 23 09:08:32 2021 elapsed time 00:11:40[/CODE]
87777_231
Jarod -
I bumped Q up to 50M. Let us know if that’s not enough.
[QUOTE=swellman;578887]Jarod -
I bumped Q up to 50M. Let us know if that’s not enough.[/QUOTE] Thanks for adding another 50M to my number. I will let you know if I need more added to it. I have to agree the pentathlon has had noticeable effects, e.g. results for smaller numbers not being sent out for a day or more. I hope it was successful for everybody involved.
Taking 192__959_7m1
165__521_7m1 factored
1 Attachment(s)
[CODE]p85 factor: 3409575893253912524635682174516651564138258850111394793193161673694663599562990587431
p96 factor: 663214323305973943211842400136876607237065311218851734774123576155158122100768458305305159371947[/CODE]
Approximately 7 hours on 6 threads of a Core i7-10510U with 12 GB memory for a 3.16M matrix at TD=100. Log attached and at [URL="https://pastebin.com/pnSL8EUc"]https://pastebin.com/pnSL8EUc[/URL]. Factors added to factordb.
[QUOTE=Jarod;578886]
[CODE]Sun May 23 09:02:37 2021 skipped 6 relations with composite factors
Sun May 23 09:02:37 2021 found 11080480 hash collisions in [B][COLOR="Red"]52182521[/COLOR][/B] relations
Sun May 23 09:03:02 2021 commencing duplicate removal, pass 2
Sun May 23 09:03:51 2021 found 11910833 duplicates and 40271688 unique relations
Sun May 23 09:03:51 2021 memory use: 330.4 MB[/CODE][/QUOTE]
Looks like you are missing part of the dataset file. At the time, the server was reporting about 65M rels but Msieve only read in 52M. If you re-download the new dataset it will have about 75M records. This will cause an over-sieve condition, which will need to be addressed in a different way.
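One quick way to confirm a truncated download is to count the relations in the local dataset and compare against the server's reported total. A rough sketch, assuming the usual msieve layout (the first line of the .dat file holds the number being factored, then one relation per line; the filename here is an assumption):
[CODE]# rough relation count for the local dataset (filename assumed);
# subtract 1 for the header line containing the number itself
wc -l < msieve.dat[/CODE]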
[QUOTE=RichD;578933]Looks like you are missing part of the dataset file. At the time, the server was reporting about 65M rels but Msieve only read in 52M. If you re-download the new dataset it will have about 75M records. This will cause an over-sieve condition, which will need to be addressed in a different way.[/QUOTE]
I suspect a portion of the generated relations never got into the data file during hurricane Pentathlon, but you might be right. Jarod - just watch for a lot of relations (i.e. an oversieved condition) when you next attempt this job. If so, use the filter_maxrels= flag when invoking msieve:
[CODE]msieve -v -nc "target_density=110 filter_maxrels=65000000" -t 4[/CODE]
Or similar (if needed).
1 Attachment(s)
[QUOTE=swellman;578887]Jarod -
I bumped Q up to 50M. Let us know if that’s not enough.[/QUOTE] I have a feeling more rels needed to be added. There are 582,937 rels remaining to come in. Please see the attached zip of several failed runs, most of them at TD 90. Thanks
1 Attachment(s)
[QUOTE=Jarod;579017]I have a feeling more rels needed to be added. There are 582,937 rels remaining to come in. Please see the attached zip of several failed runs, most of them at TD 90.
Thanks[/QUOTE] Jarod - I just downloaded and got this job into LA. See attached log file. I have stopped the job now - it is yours to factor. But there are sufficient relations now. Suggest you delete the current .dat file from your local machine and redownload it. You may want to limit the number of rels handled in filtering (as discussed earlier in this thread) to avoid an oversieved condition.
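For reference, msieve can run the post-sieving stages separately, which makes a restart after redownloading cheaper: -nc1 runs filtering only, -nc2 the linear algebra, and -nc3 the square root. A sketch of the sequence (the thread count and filter_maxrels value here are placeholders, not recommendations):
[CODE]# after deleting the old .dat file and redownloading:
msieve -v -nc1 "target_density=100 filter_maxrels=65000000" -t 4
msieve -v -nc2 -t 4
msieve -v -nc3 -t 4[/CODE]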
[QUOTE=swellman;579053]Jarod -
I just downloaded and got this job into LA. See attached log file. I have stopped the job now - it is yours to factor. But there are sufficient relations now. Suggest you delete the current .dat file from your local machine and redownload it. You may want to limit the number of rels handled in filtering (as discussed earlier in this thread) to avoid an oversieved condition.[/QUOTE] Thanks for the information. I will give it a go again in 6 or 7 hours. I wasn't able to find any commands on how to "limit the number of rels handled in filtering" (in the discussion earlier in this thread) if easier feel free to send me a PM thanks |
[QUOTE=Jarod;579090]Thanks for the information. I will give it a go again in 6 or 7 hours. I wasn't able to find any commands on how to "limit the number of rels handled in filtering" in the discussion earlier in this thread. If it's easier, feel free to send me a PM.
Thanks[/QUOTE] It’s contained [URL="https://www.mersenneforum.org/showpost.php?p=578937&postcount=117"]here[/URL].