mersenneforum.org > Factoring Projects > Msieve
2017-01-26, 09:31   #1
cardmaker

filtering wants 1000000 more relations

Hello, my log always says "filtering wants 1000000 more relations". It has reached 106% of the estimated minimum; any idea how many relations it will need to complete?

Found 60439089 relations, 106.0% of the estimated minimum (57029236).
-> ./msieve -s /root/ggnfs2/example.dat -l /root/ggnfs2/example.log -i /root/ggnfs2/example.ini -nf /root/ggnfs2/example.fb -t 6 -nc1
Msieve v. 1.52 (SVN unversioned directory)
random seeds: cb5f51c8 ad2d4743
factoring 9410261707528438928682935893154405638512521601548766718693569524548541134957386922116578082219341332079709851929749166485431038792833629490066386450067973 (154 digits)
no P-1/P+1/ECM available, skipping
commencing number field sieve (154-digit input)
R0: -2569588102508675989901913592056
R1: 59967557186607037
A0: 68567737430015540290801154102634743946145
A1: 2387684910178899757671107843137621
A2: -227045675501690108768510143
A3: 811554831654864911
A4: 40211154894
A5: 84
skew 76359603.48, size 4.749e-15, alpha -6.976, combined = 2.791e-12 rroots = 5

commencing relation filtering
estimated available RAM is 8192.0 MB
commencing duplicate removal, pass 1
found 23505935 hash collisions in 60439088 relations
added 5 free relations
commencing duplicate removal, pass 2
found 33829455 duplicates and 26609638 unique relations
memory use: 362.4 MB
reading ideals above 31916032
commencing singleton removal, initial pass
memory use: 753.0 MB
reading all ideals from disk
memory use: 460.5 MB
commencing in-memory singleton removal
begin with 26609638 relations and 33013473 unique ideals
reduce to 2004622 relations and 1022642 ideals in 26 passes
max relations containing the same ideal: 9
reading ideals above 100000
commencing singleton removal, initial pass
memory use: 94.1 MB
reading all ideals from disk
memory use: 84.3 MB
commencing in-memory singleton removal
begin with 2004648 relations and 4327199 unique ideals
reduce to 70 relations and 11 ideals in 5 passes
max relations containing the same ideal: 2
filtering wants 1000000 more relations
elapsed time 00:11:51
LatSieveTime: 20723.4
-> making sieve job for q = 32000000 in 32000000 .. 32025000 as file /root/ggnfs2/example.job.T0
-> making sieve job for q = 32000000 in 32000000 .. 32050000 as file /root/ggnfs2/example.job.T1
-> making sieve job for q = 31950000 in 31950000 .. 31962500 as file /root/ggnfs2/example.job.T2
-> making sieve job for q = 31950000 in 31950000 .. 31975000 as file /root/ggnfs2/example.job.T3
-> making sieve job for q = 31950000 in 31950000 .. 31987500 as file /root/ggnfs2/example.job.T4
-> making sieve job for q = 31950000 in 31950000 .. 32000000 as file /root/ggnfs2/example.job.T5
-> Lattice sieving algebraic q from 31950000 to 32050000.
-> gnfs-lasieve4I14e -k -o spairs.out.T0 -v -n0 -a /root/ggnfs2/example.job.T0
-> gnfs-lasieve4I14e -k -o spairs.out.T1 -v -n1 -a /root/ggnfs2/example.job.T1
-> gnfs-lasieve4I14e -k -o spairs.out.T2 -v -n2 -a /root/ggnfs2/example.job.T2
-> gnfs-lasieve4I14e -k -o spairs.out.T3 -v -n3 -a /root/ggnfs2/example.job.T3
-> gnfs-lasieve4I14e -k -o spairs.out.T4 -v -n4 -a /root/ggnfs2/example.job.T4
-> gnfs-lasieve4I14e -k -o spairs.out.T5 -v -n5 -a /root/ggnfs2/example.job.T5
2017-01-26, 11:01   #2
axn

Quote:
Originally Posted by cardmaker
found 33829455 duplicates and 26609638 unique relations
Something has gone wrong with your run. The expected number of duplicates is on the order of 20% of the total, so for the 60M relations collected, about 10-15M duplicates wouldn't be unusual. But in your case it is 34M, and you have only 26M unique. You'll probably need about 40M unique relations to build the matrix.

Keep going, and keep an eye on the unique relations count. Eventually you'll succeed.
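
Roughly, the arithmetic works out like this (a sketch only: the 40M unique target and the 20% duplicate rate are estimates, not exact figures):
Code:
# Back-of-the-envelope estimate from the numbers above; the 40M unique
# target and the 20% "normal" duplicate rate are rough guesses.
unique = 26_609_638          # unique relations after duplicate removal
target_unique = 40_000_000   # rough guess at what filtering will need
dup_rate = 0.20              # typical duplicate fraction in a healthy run

missing_unique = target_unique - unique
# Once the overlapping ranges are fixed, divide by (1 - dup_rate) to
# estimate the raw relations still to be sieved.
raw_needed = missing_unique / (1 - dup_rate)
print(f"unique still needed: {missing_unique:,}")   # about 13.4M
print(f"raw still to sieve:  {raw_needed:,.0f}")    # about 16.7M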

EDIT:-
Quote:
-> making sieve job for q = 32000000 in 32000000 .. 32025000 as file /root/ggnfs2/example.job.T0
-> making sieve job for q = 32000000 in 32000000 .. 32050000 as file /root/ggnfs2/example.job.T1
-> making sieve job for q = 31950000 in 31950000 .. 31962500 as file /root/ggnfs2/example.job.T2
-> making sieve job for q = 31950000 in 31950000 .. 31975000 as file /root/ggnfs2/example.job.T3
-> making sieve job for q = 31950000 in 31950000 .. 31987500 as file /root/ggnfs2/example.job.T4
-> making sieve job for q = 31950000 in 31950000 .. 32000000 as file /root/ggnfs2/example.job.T5
T0 & T1 are sieving the same (overlapping) range.
T2, T3, T4 & T5 are sieving the same (overlapping) range.
No wonder you're getting all these duplicates.
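For what it's worth, a few lines of Python make the overlaps explicit (the q ranges are copied from your job files above; this is just an illustration, not part of any script):
Code:
from itertools import combinations

# (start, end) q ranges from the quoted job files
jobs = {
    "T0": (32_000_000, 32_025_000),
    "T1": (32_000_000, 32_050_000),
    "T2": (31_950_000, 31_962_500),
    "T3": (31_950_000, 31_975_000),
    "T4": (31_950_000, 31_987_500),
    "T5": (31_950_000, 32_000_000),
}
for (a, ra), (b, rb) in combinations(jobs.items(), 2):
    # two ranges overlap when each one starts before the other ends
    if ra[0] < rb[1] and rb[0] < ra[1]:
        print(a, "overlaps", b)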
Someone more familiar with the factorization script will have to guide you on how to escape from this SNAFU.

2017-01-26, 16:58   #3
VBCurtis

OP restarted the script with a different number of cores than the original run used. The script does not adjust for this, so since the restart the tasks have been duplicating effort (as axn showed you).

The cure is to rewrite the checkpoint file by hand; it's called something like resume.job. Stop the script, then edit the file so that each of the 6 q-ranges is separate but they still cover the same total range of q (it looks like T1 and T5, with their 50k blocks, together cover the entire region the script intended to sieve, so the total q range is 100k?). I believe you originally started it with two threads and then changed to 6, but that doesn't really matter.
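
I don't have the exact resume-file syntax in front of me, so treat this as a sketch of the arithmetic only: six disjoint blocks covering 31950000..32050000, one per thread, something like:
Code:
# Sketch only: split the intended q region into one disjoint block per
# thread (treat each block as half-open, so the next block starts where
# the previous one ends). Check your own resume.job for the exact syntax;
# these are just the numbers to put into it.
q_start, q_end, threads = 31_950_000, 32_050_000, 6

step = (q_end - q_start) // threads
for t in range(threads):
    lo = q_start + t * step
    hi = q_end if t == threads - 1 else lo + step
    print(f"T{t}: {lo} .. {hi}")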

Msieve filtering doesn't estimate how many more relations are needed; if filtering fails, the message you got is simply what gets printed ("filtering needs 1000000 more relations"). Given your mistake, you're probably about two-thirds of the way through the factorization.
2017-01-26, 21:32   #4
cardmaker

Hello, many thanks for both replies.
Yes, I started with 2 cores and changed to 6. Would it be better to let it continue, or to make the changes like you said?
2017-01-26, 21:35   #5
wombatman

Make the changes. Otherwise, you're doing the computational equivalent of spinning your tires in mud.
2017-01-30, 16:52   #6
chris2be8

Also, you could remove most of the duplicates with:
Code:
# Sort the relation file, keeping only unique lines (-u drops exact duplicates).
sort -ur example.dat > example.sorted
# Keep the original as a backup and swap in the deduplicated file.
mv example.dat example.dat.original
mv example.sorted example.dat
But only do this while the script is not running.
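(Note that sort -u only removes byte-identical lines, so a duplicate relation printed with its factors in a different order would survive; msieve's own duplicate-removal pass will still catch anything left over.)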

Chris
2017-01-30, 23:58   #7
EdH

There's also a program called remdups4, which removes duplicates and bad relations; I use it, but I no longer remember where I got it from.