mersenneforum.org > Factoring Projects > CADO-NFS
2022-04-25, 21:48   #1
mathwiz

CADO params and q range

I thought I would try factoring a Homogeneous Cunningham number with CADO. Using yafu, I was able to generate this poly:

Code:
n: 201495490924117819350545278401521200186475080661991546785061151586481536921427801875903114757056584082618659771965549878871442628101978737724895837344401302384130776054773518400806340305333533348590615262877320461
# 11^290+9^290, difficulty: 241.60, anorm: 1.00e+24, rnorm: 2.52e+66
# scaled difficulty: 248.67, suggest sieving rational side
# size = 1.415e-24, alpha = 0.000, combined = 1.208e-14, rroots = 0
type: snfs
size: 241
skew: 1.0000
c4: 1
c3: -1
c2: 1
c1: -1
c0: 1
Y1: -22185312344622607535965183080365494317672538611578408721
Y0: 2516377186292711566730985912068419625116019959228909823321881
m: 101290116629115338085345720418305585355139554774185662713457738677742471190636591982077468281736585643907704579944243463029692128379469347600167114231320549388999388243465493646139160978801797490689940967854821899

rlim: 66313648
alim: 45537270
lpbr: 32
lpba: 30
mfbr: 92
mfba: 60
rlambda: 3.6
alambda: 2.6
What special-q range should I use to sieve with the CADO las binary? What would a sample command look like? I am trying, e.g.:

./c168.poly -q0 70020000 -q1 70030000 -I 15 -lim0 66313648 -lim1 45537270 -lpb0 32 -lpb1 30 -mfb0 92 -mfb1 60 -ncurves0 16 -ncurves1 10 -sqside 0 -fb1 ./c168.roots.gz -out out.1137320000.c168.gz -t 4 -stats-stderr
2022-04-26, 00:52   #2
VBCurtis

I haven't yet had a reason to invoke the siever directly; I just use the python wrapper, ./cado-nfs.py, to launch.

Crafting an input file isn't too tough; see the demo file in the parameters/factor folder. I think the SNFS example is F9.

params.c90 has all the settings & explanations.

You likely already know all that, in which case I am of no use for the command line, sorry.

I can suggest that unlike ggnfs sievers, there isn't much memory penalty for larger lims, so I'd consider making them up to 30% larger than the settings you'd choose for an nfs@home submission. And since CADO sieves below the factor base, a starting Q around half the size you'd pick for nfs@home is likely about right; I'd give a specific suggestion, but I have very very little experience with quartics.
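For illustration, a launch through the wrapper might look something like the sketch below, modeled loosely on the F9 demo. This is only a sketch: the file name hc_11_9_290.poly and the qmin value are made-up placeholders, and the parameter names (tasks.polyselect.import and friends) should be double-checked against params.c90 and the files in parameters/factor for the CADO version in use.

Code:
# hc_11_9_290.poly -- the yafu polynomial rewritten with CADO/msieve-style keys
# (n: is the full 213-digit composite from the job file, abbreviated here)
n: 201495...320461
skew: 1.0
c4: 1
c3: -1
c2: 1
c1: -1
c0: 1
Y1: -22185312344622607535965183080365494317672538611578408721
Y0: 2516377186292711566730985912068419625116019959228909823321881

# launch via the wrapper, importing the poly instead of running poly selection
./cado-nfs.py 201495...320461 \
    tasks.polyselect.import=hc_11_9_290.poly \
    tasks.qmin=20000000 tasks.I=15 \
    tasks.lim0=66313648 tasks.lim1=45537270 \
    tasks.lpb0=32 tasks.lpb1=30 \
    tasks.sieve.mfb0=92 tasks.sieve.mfb1=60 \
    tasks.sieve.sqside=0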

2022-04-26, 03:08   #3
charybdis

Are you aware of the scale of this job? This is a quartic of difficulty 241. I would guess it is roughly comparable to an "ordinary" SNFS (i.e. a sextic) of difficulty in the high 270s, or GNFS around 190 digits. In other words, in the 5-10 CPU-year range. It would be a large job for the NFS@Home 15e queue.

The las invocation looks correct, as long as it is preceded by "las -poly " of course. The file names are a bit weird given that this is not a c168 and the output file name has nothing to do with the Q range. But I would highly recommend using the cado-nfs.py script as long as your setup allows it.

As for the parameters: test-sieving is essential for a job this size. My first guess would be lpb0=33, lpb1=32, mfb0=96, mfb1=64, lim0=200M, lim1=300M. Your ncurves1=10 may also be too low.
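As an illustration, test-sieving those values with a direct las call could look something like the sketch below, reusing the flags from post #1 and the hypothetical poly file name from above. The Q range, thread count and ncurves values are placeholders, and the side-1 factor base file would need to be regenerated (with CADO's makefb) to cover the larger lim1.

Code:
./las -poly hc_11_9_290.poly -q0 40000000 -q1 40010000 -I 15 \
      -lim0 200000000 -lim1 300000000 \
      -lpb0 33 -lpb1 32 -mfb0 96 -mfb1 64 \
      -ncurves0 20 -ncurves1 16 \
      -sqside 0 -fb1 hc_11_9_290.roots.gz -t 4 -stats-stderr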
2022-04-26, 22:55   #4
mathwiz

Quote:
Originally Posted by charybdis
Are you aware of the scale of this job? This is a quartic of difficulty 241. I would guess it is roughly comparable to an "ordinary" SNFS (i.e. a sextic) of difficulty in the high 270s, or GNFS around 190 digits. In other words, in the 5-10 CPU-year range. It would be a large job for the NFS@Home 15e queue.

The las invocation looks correct, as long as it is preceded by "las -poly " of course. The file names are a bit weird given that this is not a c168 and the output file name has nothing to do with the Q range. But I would highly recommend using the cado-nfs.py script as long as your setup allows it.

As for the parameters: test-sieving is essential for a job this size. My first guess would be lpb0=33, lpb1=32, mfb0=96, mfb1=64, lim0=200M, lim1=300M. Your ncurves1=10 may also be too low.
Thank you for your help. Two followup questions.

(1) Is there a better rule for SNFS difficulty in the case of a quartic? The Homogeneous Cunningham page lists this as roughly SNFS-241. Presumably it's calculating that as if a deg-5 or deg-6 poly were used?

(2) Is there a better poly for this composite than a quartic? YAFU spits that out by default, and I didn't see an option in the docfile to force a poly of a different degree. Specifically, it prints:

Code:
nfs: number in job file does not match input
nfs: checking for poly file - no poly file found
nfs: commencing nfs on c303: 100897102794373146986330305183155343412663648077278012567849478322943160344321028382268159833546189882514680027235972756720459933503177145892099767412807649890883988121891682135812853596878468970915546421537329858142181661638574802731650404480828260882567990986429430237330187207954355271779816601181002
nfs: searching for brent special forms...
nfs: searching for homogeneous cunningham special forms...
nfs: input divides 11^290 + 9^290
nfs: guessing snfs difficulty 241 is roughly equal to gnfs difficulty 165
nfs: creating ggnfs job parameters for input of size 165
nfs: guessing snfs difficulty 241 is roughly equal to gnfs difficulty 165
nfs: creating ggnfs job parameters for input of size 165

gen: ========================================================
gen: selected polynomial:
gen: ========================================================

n: 201495490924117819350545278401521200186475080661991546785061151586481536921427801875903114757056584082618659771965549878871442628101978737724895837344401302384130776054773518400806340305333533348590615262877320461
# 11^290+9^290, difficulty: 241.60, anorm: 1.00e+24, rnorm: 2.52e+66
# scaled difficulty: 248.67, suggest sieving rational side
# size = 1.415e-24, alpha = 1.694, combined = 1.208e-14, rroots = 0
type: snfs
size: 241
skew: 1.0000
c4: 1
c3: -1
c2: 1
c1: -1
c0: 1
Y1: -22185312344622607535965183080365494317672538611578408721
Y0: 2516377186292711566730985912068419625116019959228909823321881
m: 101290116629115338085345720418305585355139554774185662713457738677742471190636591982077468281736585643907704579944243463029692128379469347600167114231320549388999388243465493646139160978801797490689940967854821899

2022-04-26, 23:36   #5
VBCurtis

Quote:
Originally Posted by mathwiz
Thank you for your help. Two followup questions.

(1) Is there a better rule for SNFS difficulty in the case of a quartic? The Homogeneous Cunningham page lists this as roughly SNFS-241. Presumably it's calculating that as if a deg-5 or deg-6 poly were used?
I don't understand your question: 241 *is* the difficulty of this poly. It's just that quartics are much, much harder than 5th- or 6th-degree polys at that size. "241" isn't a rating of how long the job will take; it's the actual length of the input when using the SNFS poly. There's no heuristic or estimation involved.
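For concreteness, the 241.6 is just the size of the number the quartic actually represents:

Code:
11^290 + 9^290                ~ 10^302    (the full 303-digit number)
11^58  + 9^58                 ~ 10^60.4   (the algebraic factor)
(11^290+9^290)/(11^58+9^58)   ~ 10^241.6  -> SNFS difficulty 241.6
(equivalently, about 232*log10(11) = 241.6)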

If you browse the nfs@home listing for each queue, you'll see the largest quartic done with 15e is around this size. I didn't see any ~245-difficulty ones in the results, but I did see that 250-difficulty quartics are sent to f-small. All the f-small jobs and many of the 15e jobs are artificially limited in lim choice by memory requirements on BOINC; we would prefer larger lims like those Charybdis suggested. I'd use 32/34-bit LPs; quartics get really unbalanced norms at this size. If you did so, you'd need 1000-1050M raw relations. On CADO, you might choose to start with A=30 to improve yield on small Q, changing to I=15 after maybe 25% of the job is done.
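If the job is driven by the wrapper, that A-to-I switch is a one-line parameter change, along these lines (a sketch; tasks.A / tasks.I as described in params.c90, with editing the saved snapshot parameters before resuming being one way people make the change mid-job):

Code:
# early in the job: larger sieve area for better yield at small Q
tasks.A = 30
# after roughly a quarter of the Q range has been sieved, resume with
tasks.I = 15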
2022-04-27, 01:30   #6
charybdis

Curtis has answered question 1, so I'll answer question 2:

Quote:
Originally Posted by mathwiz
(2) Is there a better poly for this composite than a quartic? YAFU spits that out by default, and I didn't see an option in the docfile to force a poly of a different degree.
The quartic polynomial for this number arises because the exponent 290 is divisible by 5, so 11^290+9^290 has an algebraic factor 11^58+9^58. Dividing out this algebraic factor leaves you with the quartic that YAFU gave you; to see why it's degree 4, think about what happens when you divide x^5+y^5 by x+y. Unfortunately there is no way to turn this into a degree-5 or degree-6 polynomial; it can be turned into one of degree 8, but at this size that's even worse than degree 4.
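Spelling that division out, with the pieces matched to the yafu output above:

Code:
(x^5 + y^5)/(x + y) = x^4 - x^3*y + x^2*y^2 - x*y^3 + y^4

with x = 11^58 and y = 9^58:
  algebraic side: coefficients 1, -1, 1, -1, 1   (yafu's c4..c0, skew 1)
  rational side:  Y1 = -(9^58), Y0 = 11^58, i.e. the common root is 11^58/9^58 mod N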

If you really wanted to use a sextic you'd have to factor the full 11^290+9^290 without the algebraic factor divided out. This would have difficulty 303, making the quartic look like a piece of cake in comparison.

There are lots of available Homogeneous Cunninghams that are much easier than 11+9_290. The smallest sextics are difficulty 262 and will take ~2 CPU-years, which is still large for an individual but much more reasonable than a difficulty-241 quartic. If you're still interested, I'd recommend getting experience with smaller jobs first, as everyone runs into unexpected issues and you'd much rather discover those on tasks that take hours rather than weeks or months. Try a GNFS-120, then a GNFS-140...
2022-04-27, 15:46   #7
chris2be8

The combined difficulty, 1.208e-14, is the best indicator of how hard a job will be: run time is roughly inversely proportional to it. But the constant varies with degree, so you can't directly compare polys of different degrees that way. And there's a bit of noise in run-times.
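As a toy illustration of that proportionality (same degree, all else equal):

Code:
poly A: combined score 2.4e-14
poly B: combined score 1.2e-14
expected sieve time for A ~ (1.2e-14 / 2.4e-14) * time for B = half the time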

NB. alpha = 0.000 looks wrong. Putting the poly through msieve I got alpha 1.694, which looks more plausible.

PS. The combined difficulty is also called the e-score or Murphy e-score in other places, but they all refer to the same thing.

2022-04-27, 19:57   #8
swellman

Another issue not discussed is the fact that 11+9,290 has not received enough ECM to meet due diligence. According to the ECMNet site, it’s seen only 6441 curves @B1=260e6. Probably needs another 10-15,000. ECM is not required per se, but running a long factorization through NFS only to produce a p51 would be awkward/embarrassing.

I’ll add a now-old observation from NFS moderator and forum member @debroulx:

Quote:
* SNFS polynomials should have leading coefficients < 10^5
* SNFS tasks with degree 6 polynomials (preferred) should have 225 <= difficulty <= 250, ECM to 2/9 of SNFS difficulty
* SNFS degree 5 tasks should have 210 <= difficulty <= 235, ECM to 2/9 of (15+SNFS difficulty)
* SNFS degree 4 tasks should have 195 <= difficulty <= 220, ECM to 2/9 of (30+SNFS difficulty)
* GNFS tasks should have 155 <= difficulty <= 170, ECM to 2/7 of GNFS difficulty (a bit more than that for difficulty close to 170)
These are guidelines for posting on the 14d siever, not hard rules of course, and I would argue the sieving difficulty of a quartic or quintic vs a sextic is perhaps more nuanced than a flat SNFS+30 or +15, but it conveys the right idea: quartics are hard!
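Plugging this number into the degree-4 line above, just to make the guideline concrete:

Code:
ECM depth ~ 2/9 * (30 + 241.6) ~ 60 digits
i.e. the guideline would suggest roughly a full t60 of ECM before committing to sieving the quartic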
2022-04-27, 20:22   #9
charybdis

Quote:
Originally Posted by chris2be8
The combined difficulty, 1.208e-14, is the best indicator of how hard a job will be: run time is roughly inversely proportional to it. But the constant varies with degree, so you can't directly compare polys of different degrees that way. And there's a bit of noise in run-times.
Yeah, quartics tend to run substantially faster than their score would suggest when compared to higher degree polys. But when the score is about the same as the record for 194-digit GNFS, that isn't saying much.
charybdis is offline   Reply With Quote