- **Factoring**
(*https://www.mersenneforum.org/forumdisplay.php?f=19*)

  - **About GNFS and size of smooth numbers**
(*https://www.mersenneforum.org/showthread.php?t=19806*)

Hello,
I am new to GNFS and I am trying to understand how polynomial selection affects the complexity of the whole algorithm. I read that GNFS is faster than MPQS for huge numbers because GNFS has to find smooth numbers of size about N^(1/d), where d is the degree of the polynomial. I also read that finding a good polynomial is a time-consuming operation, but I don't understand why simply selecting a high value of d is not enough to improve the speed of the algorithm. My intuition says that with a polynomial of very high d we would only have to find small smooth numbers, and that is much faster than finding smooth numbers of order N^(1/3), for example. Where is my error? Why is it not possible to build a fast GNFS with a high d? I appreciate your help in understanding this issue. Thanks so much!

[QUOTE=JuanraG;386778]Hello,
I am new to GNFS and I am trying to understand how polynomial selection affects the complexity of the whole algorithm.[/QUOTE]
Have you actually read any articles about how it works? Polynomial selection affects the o(1) term in the L(N,c) function.

[QUOTE]I read that GNFS is faster than MPQS for huge numbers because GNFS has to find smooth numbers of size about N^(1/d), where d is the degree of the polynomial.[/QUOTE]
You misread. Each relation consists of TWO smooth numbers. One of them is indeed about N^(1/d). The other is about P^d, where P is an 'average' coordinate of a lattice point. While N^(1/d) does decrease with increasing d, the other side increases (even faster) as d increases. One finds a value of d such that the probability of both being smooth SIMULTANEOUSLY is as large as possible. You are also confusing "size" with "magnitude". Try Google.

[QUOTE]I also read that finding a good polynomial is a time-consuming operation, but I don't understand why simply selecting a high value of d is not enough to improve the speed of the algorithm.[/QUOTE]
I explained this just above.

[QUOTE]My intuition says that with a polynomial of very high d we would only have to find small smooth numbers, and that is much faster than finding smooth numbers of order N^(1/3), for example. Where is my error?[/QUOTE]
Error #1: Assuming you know enough to HAVE an intuition. Classic Dunning-Kruger. Error #2: Not understanding the algorithm.

[QUOTE]Why is it not possible to build a fast GNFS with a high d?[/QUOTE]
Who says that it isn't? The optimal d increases with N. One [b]wants[/b] large d for very large N. The difficulty, of course, is that the amount of computation required for such N is beyond computer capabilities for the immediate future.

May I suggest that before you return to this discussion you actually read some articles about how NFS works? There are some excellent surveys available on the net.

Wow!
I am sorry. I am not a mathematician and it is hard for me to understand GNFS, even with all the surveys available on the net. I thought I could get some answers here, but maybe this is too advanced a place for me. In any case, thank you for your explanation. It is clear to me now. Thanks so much.

[QUOTE=JuanraG;386804]Wow!
I am sorry. I am not a mathematician and it is hard for me to understand GNFS, even with all the surveys available on the net. I thought I could get some answers here, but maybe this is too advanced a place for me. In any case, thank you for your explanation. It is clear to me now. Thanks so much.[/QUOTE]
Learn how QS works first.

The larger you choose d, the larger the degree of the algebraic polynomial. Each relation requires choosing (a,b) coordinates to plug into that polynomial, and as d increases the resulting values grow more and more quickly as (a,b) moves away from the origin. So you essentially have a limited region in the (a,b) plane where the average size of the polynomial values is small enough that smooth relations are found often. As the number to be factored grows, that nice region also grows and can support a (very slightly) larger d.

Thus there is an optimal value of d that balances the rapidly increasing size of the polynomial values against the improvement in smoothness probability from the smaller N^(1/d) side. Many online papers work through the mathematics needed to determine that optimal d.
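The asymptotic analysis in those papers yields the well-known rule of thumb d ≈ (3 ln N / ln ln N)^(1/3). This holds only up to lower-order terms; practical parameter choices are tuned empirically and can differ. A quick sketch of what it predicts:

```python
import math

def optimal_degree(digits):
    """Asymptotic NFS rule of thumb: d ~ (3 ln N / ln ln N)^(1/3),
    for an N with the given number of decimal digits."""
    ln_n = digits * math.log(10)   # ln N for a `digits`-digit number
    return (3 * ln_n / math.log(ln_n)) ** (1 / 3)

for digits in (120, 155, 232, 350):
    print(digits, round(optimal_degree(digits), 2))
```

The formula lands between 5 and 6 around 155 digits and does not pass 7 until roughly 350 digits, consistent with degrees 5 and 6 being the practical choices for current jobs and with the degree-7 question raised later in this thread.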

[QUOTE=JuanraG;386804]Wow!
I am sorry. I am not a mathematician and it is hard for me to understand GNFS, even with all the surveys available on the net. I thought I could get some answers here, but maybe this is too advanced a place for me. In any case, thank you for your explanation. It is clear to me now. Thanks so much.[/QUOTE]
You asked a good beginner question in a forum where people can answer very advanced questions. It just so happens that one of the true experts was first to answer your question, and his answers, while accurate and enlightening, usually come with a judgement that you could have found your answer on your own with a mere 50 or so pages of reading academic papers. Do not be offended, please.

The answer you received illuminated something I hadn't considered about poly degree: that the optimal degree depends on input size, not on difficulty. That's why SNFS moves to degree 6 around 200 digits of difficulty, just like GNFS does. If I recall the table from the published literature, does that mean degree 7 might be better than degree 6 for SNFS jobs around 350 digits? Anyway, thanks for asking, 'cause I learned something too.

[QUOTE=VBCurtis;386823]You asked a good beginner question in a forum where people can answer very advanced questions. It just so happens that one of the true experts was first to answer your question, and his answers, while accurate and enlightening, usually come with a judgement that you could have found your answer on your own with a mere 50 or so pages of reading academic papers. Do not be offended, please.[/QUOTE]
And in typical fashion, someone resorts to hyperbole in response to one of my posts. One need not read 50 pages of academic papers. Five pages of a Wikipedia article would have sufficed.

[QUOTE=R.D. Silverman;386845]5 pages of a Wikipedia article would have sufficed.[/QUOTE]
That assumes they know which Wikipedia article to read. Gently pointing him to [url]http://en.wikipedia.org/wiki/Quadratic_sieve[/url], [url]http://en.wikipedia.org/wiki/Special_number_field_sieve[/url], and [url]http://en.wikipedia.org/wiki/General_number_field_sieve[/url] would be better for a beginner.

Chris
