#1
Mar 2019
11·31 Posts
I thought I would give factorization of a Homogeneous Cunningham number a shot using CADO. With yafu, I was able to generate this poly:
Code:
n: 201495490924117819350545278401521200186475080661991546785061151586481536921427801875903114757056584082618659771965549878871442628101978737724895837344401302384130776054773518400806340305333533348590615262877320461
# 11^290+9^290, difficulty: 241.60, anorm: 1.00e+24, rnorm: 2.52e+66
# scaled difficulty: 248.67, suggest sieving rational side
# size = 1.415e-24, alpha = 0.000, combined = 1.208e-14, rroots = 0
type: snfs
size: 241
skew: 1.0000
c4: 1
c3: -1
c2: 1
c1: -1
c0: 1
Y1: -22185312344622607535965183080365494317672538611578408721
Y0: 2516377186292711566730985912068419625116019959228909823321881
m: 101290116629115338085345720418305585355139554774185662713457738677742471190636591982077468281736585643907704579944243463029692128379469347600167114231320549388999388243465493646139160978801797490689940967854821899
rlim: 66313648
alim: 45537270
lpbr: 32
lpba: 30
mfbr: 92
mfba: 60
rlambda: 3.6
alambda: 2.6

./c168.poly -q0 70020000 -q1 70030000 -I 15 -lim0 66313648 -lim1 45537270 -lpb0 32 -lpb1 30 -mfb0 92 -mfb1 60 -ncurves0 16 -ncurves1 10 -sqside 0 -fb1 ./c168.roots.gz -out out.1137320000.c168.gz -t 4 -stats-stderr
#2
"Curtis"
Feb 2005
Riverside, CA
7·829 Posts
I haven't yet had a reason to invoke the siever directly; I just use the python wrapper, ./cado-nfs.py, to launch.
Crafting an input file isn't too tough; see the demo file in the parameters/factor folder. I think the SNFS example is F9, and params.c90 has all the settings & explanations. You likely already know all that, in which case I am of no use for the command line; sorry.

I can suggest that, unlike the ggnfs sievers, there isn't much memory penalty for larger lims, so I'd consider making them up to 30% larger than the settings you'd choose for an nfs@home submission. And since CADO sieves below the factor base, a starting Q around half the size you'd pick for nfs@home is likely about right. I'd give a specific suggestion, but I have very little experience with quartics.

Last fiddled with by VBCurtis on 2022-04-26 at 00:52
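For reference, a minimal sketch of launching such an SNFS job through the wrapper. The parameter names follow the shipped params files (params.c90 and the SNFS demo mentioned above); the poly file name, qmin value, and the N placeholder are illustrative, not tested settings:

```
# hc241.poly holds the n/skew/c4..c0/Y1/Y0 lines from the yafu output above
# (hypothetical file name; $N is the 211-digit composite)
./cado-nfs.py $N \
    tasks.polyselect.import=hc241.poly \
    tasks.I=15 \
    tasks.qmin=20000000 \
    tasks.lim0=66000000 tasks.lim1=45000000 \
    tasks.lpb0=32 tasks.lpb1=30 \
    tasks.sieve.mfb0=92 tasks.sieve.mfb1=60
```

The lim/lpb/mfb values here just mirror the yafu-generated job; per the advice above, the lims could go up to 30% larger than an nfs@home-style choice.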
#3
Apr 2020
2×7×73 Posts
Are you aware of the scale of this job? This is a quartic of difficulty 241. I would guess it is roughly comparable to an "ordinary" SNFS (i.e. a sextic) of difficulty in the high 270s, or GNFS around 190 digits. In other words, in the 5-10 CPU-year range. It would be a large job for the NFS@Home 15e queue.
The las invocation looks correct, as long as it is preceded by "las -poly ", of course. The file names are a bit weird given that this is not a c168, and the output file name has nothing to do with the Q range. But I would highly recommend using the cado-nfs.py script as long as your setup allows it.

As for the parameters: test-sieving is essential for a job this size. My first guess would be lpb0=33, lpb1=32, mfb0=96, mfb1=64, lim0=200M, lim1=300M. Your ncurves1=10 may also be too low.
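As a sketch of such a test-sieve, here is the invocation from post #1 with these suggested parameters swapped in. The file names, Q range, and the bumped ncurves1 are placeholders/assumptions, not tested choices:

```
# test-sieve a small Q slice; repeat at a few points across the planned
# Q interval and compare yield and sec/rel
las -poly ./hc241.poly -q0 100000000 -q1 100010000 -I 15 \
    -lim0 200000000 -lim1 300000000 -lpb0 33 -lpb1 32 \
    -mfb0 96 -mfb1 64 -ncurves0 16 -ncurves1 13 -sqside 0 \
    -fb1 ./hc241.roots.gz -out test.rels.gz -t 4 -stats-stderr
```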
#4
Mar 2019
341₁₀ Posts
Quote:
(1) Is there a better rule for the SNFS difficulty in the case of a quartic? The Homogeneous Cunningham page lists this as roughly SNFS-241. Presumably it's calculating that as if a deg-5 or deg-6 poly were used?

(2) Is there a better poly for this composite besides a quartic? YAFU spits that out by default, and I didn't see an option in the docfile to force a poly of a different degree. Specifically it prints:
Code:
nfs: number in job file does not match input
nfs: checking for poly file - no poly file found
nfs: commencing nfs on c303: 100897102794373146986330305183155343412663648077278012567849478322943160344321028382268159833546189882514680027235972756720459933503177145892099767412807649890883988121891682135812853596878468970915546421537329858142181661638574802731650404480828260882567990986429430237330187207954355271779816601181002
nfs: searching for brent special forms...
nfs: searching for homogeneous cunningham special forms...
nfs: input divides 11^290 + 9^290
nfs: guessing snfs difficulty 241 is roughly equal to gnfs difficulty 165
nfs: creating ggnfs job parameters for input of size 165
nfs: guessing snfs difficulty 241 is roughly equal to gnfs difficulty 165
nfs: creating ggnfs job parameters for input of size 165
gen: ========================================================
gen: selected polynomial:
gen: ========================================================
n: 201495490924117819350545278401521200186475080661991546785061151586481536921427801875903114757056584082618659771965549878871442628101978737724895837344401302384130776054773518400806340305333533348590615262877320461
# 11^290+9^290, difficulty: 241.60, anorm: 1.00e+24, rnorm: 2.52e+66
# scaled difficulty: 248.67, suggest sieving rational side
# size = 1.415e-24, alpha = 1.694, combined = 1.208e-14, rroots = 0
type: snfs
size: 241
skew: 1.0000
c4: 1
c3: -1
c2: 1
c1: -1
c0: 1
Y1: -22185312344622607535965183080365494317672538611578408721
Y0: 2516377186292711566730985912068419625116019959228909823321881
m: 101290116629115338085345720418305585355139554774185662713457738677742471190636591982077468281736585643907704579944243463029692128379469347600167114231320549388999388243465493646139160978801797490689940967854821899

Last fiddled with by mathwiz on 2022-04-26 at 23:08 Reason: show yafu output
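The "difficulty: 241.60" figure above can be reproduced from the quartic's rational-side coefficients. A minimal sketch, where deg·log10(max(|Y0|,|Y1|)) is one common difficulty convention (an assumption on my part, but it matches yafu's number):

```python
from math import log10

# Rational side of the quartic: Y1*x + Y0 with Y0 = 11^58, Y1 = -9^58,
# matching the Y0/Y1 lines in the poly file above.
Y0 = 11**58
Y1 = -(9**58)

# Difficulty taken as degree * log10(max(|Y0|, |Y1|)) for a quartic.
difficulty = 4 * log10(max(abs(Y0), abs(Y1)))
print(f"{difficulty:.2f}")  # ~241.60
```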
#5
"Curtis"
Feb 2005
Riverside, CA
5803₁₀ Posts
Quote:
If you browse the nfs@home listing for each queue, you'll see the largest quartic done with 15e is around this size. I didn't see any ~245-difficulty ones in the results, but I did see that 250-difficulty quartics are sent to f-small. All the f-small jobs and many of the 15e jobs are artificially limited in lim choice by memory requirements on BOINC; we would prefer larger lims like those Charybdis suggested.

I'd use 32/34-bit LPs; quartics get really unbalanced norms at this size. If you did so, you'd need 1000-1050M raw relations. On CADO, you might choose to start with A=30 to improve yield on small Q, changing to I=15 after maybe 25% of the job is done.
#6
Apr 2020
2·7·73 Posts
Curtis has answered question 1, so I'll answer question 2:
Quote:
If you really wanted to use a sextic you'd have to factor the full 11^290+9^290 without the algebraic factor divided out. This would have difficulty 303, making the quartic look like a piece of cake in comparison.

There are lots of available Homogeneous Cunninghams that are much easier than 11+9_290. The smallest sextics are difficulty 262 and will take ~2 CPU-years, which is still large for an individual but much more reasonable than a difficulty-241 quartic.

If you're still interested, I'd recommend getting experience with smaller jobs first, as everyone runs into unexpected issues and you'd much rather discover those on tasks that take hours rather than weeks or months. Try a GNFS-120, then a GNFS-140...
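Both halves of this are easy to check; a quick sketch (using the digit count of the full form as a stand-in for the sextic difficulty):

```python
from math import log10

# The algebraic factor 11^58 + 9^58 divides 11^290 + 9^290 because
# 290 / 58 = 5 is odd; dividing it out is what makes the quartic possible.
full = 11**290 + 9**290
alg = 11**58 + 9**58
assert full % alg == 0

# The full form is a ~303-digit number, so a sextic on it would have
# SNFS difficulty in that neighbourhood, versus 241.6 for the quartic.
print(len(str(full)))         # number of digits of the full form
print(f"{log10(full):.1f}")   # its log10, close to the quoted 303
```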
#7
Sep 2009
11×223 Posts
The combined difficulty, 1.208e-14, is the best indicator of how hard a job will be: run time is roughly inversely proportional to it. But the constant of proportionality varies with degree, so you can't directly compare polys of different degrees that way. And there's a bit of noise in run times.
NB: alpha = 0.000 looks wrong. Putting the poly through msieve I got alpha = 1.694, which looks more plausible.

PS: The combined difficulty is also called the e-score or Murphy E score in other places. They all refer to the same thing.

Last fiddled with by chris2be8 on 2022-04-27 at 15:49 Reason: Added PS.
#8
Jun 2012
2×1,993 Posts
Another issue not yet discussed: 11+9,290 has not received enough ECM to meet due diligence. According to the ECMNet site, it has seen only 6441 curves @B1=260e6, and probably needs another 10-15,000. ECM is not required per se, but running a long NFS factorization only to turn up a p51 would be awkward/embarrassing.
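The "enough ECM" judgment can be framed with the usual exponential-survival model. A rough sketch, where the full-t curve count C is an assumed placeholder, not a figure from the thread or from any curve table:

```python
from math import exp

# If completing one "t-level" at this B1 takes about C curves, the chance
# that a factor of the corresponding size survives n curves is ~exp(-n/C).
C = 20000        # ASSUMED curve count for a full t-level at B1=260e6
n_done = 6441    # curves reported by the ECMNet site
miss = exp(-n_done / C)
print(f"{miss:.2f}")  # fraction of such factors likely still missed
```

With the assumed C, roughly 10-15,000 more curves would push the miss probability well below one half, consistent with the suggestion above.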
I'll add a now-old observation from NFS moderator and forum member @debroulx:

Quote:
#9
Apr 2020
2·7·73 Posts
Quote: