#1
Jun 2021
51₁₀ Posts
p - any composite number (for small p the inner loop usually needs fewer than 250 iterations; 250 is a heuristic bound). Code:
{
p=1237*1234577; c=ceil(sqrt(p));   \\ c ~ sqrt(p)
for(n=c, c+10000,
  b=lift(Mod(n^2,p)); a=lift(Mod(b^2,p));
  for(y=1,250,                     \\ at most 250 "jumps"
    t=ceil(sqrt(b^2-a));
    b=lift(Mod(t^2,p)); a=lift(Mod(b^2,p));
    if(b<c, break());              \\ stop once b < sqrt(p)
  );
  localprec(7); b=b/c/1.;
  if(b<.1, print(b););             \\ print the scaled residual b/sqrt(p) when it is below 0.1
);
}
Roman
#2
Jun 2021
3×17 Posts
Easy to see that t in this code is the coordinate of the lower point of some "tooth of the saw" on the graph of x^2 mod p (t-1 is the upper one).
In the second cycle we just jump to the next such tooth and go on. Curiously, this process has two outcomes: either some closed loop (a ring) and zero, or a sub-sqrt value of t^2 mod p one "jump" before. I don't know whether this heuristic is new or old, but it looks like a sieve for n values, and hypothetically it can point to some dependency in x^2 mod p.
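A minimal illustrative sketch (not from the original post; the choice of p follows post #1): the bottoms of these "teeth" sit at x = ceil(sqrt(k*p)), where x^2 mod p drops to at most about 2*sqrt(k*p), while at x-1 it is still close to p.
Code:
\\ illustrative sketch: bottoms of the first few "teeth" of x^2 mod p
{
p = 1237*1234577;
for(k = 1, 10,
  t = ceil(sqrt(k*p));   \\ smallest x with x^2 >= k*p, i.e. the lower point of tooth k
  print(k, "  x=", t, "  x^2 mod p=", lift(Mod(t^2, p)), "  (x-1)^2 mod p=", lift(Mod((t-1)^2, p)));
);
}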
#3
Jun 2021
3×17 Posts
Most interesting: we can try the same method (saw jumps) for x^a mod p with a>2, and (with hope) find x such that x^a mod p < p^(-a).
Dreams, only dreams of course)) The curse of sqrt is too strong.
Last fiddled with by RomanM on 2021-06-25 at 13:24 Reason: dreams
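For what "the same method" could mean at a = 3, here is one possible reading as a minimal sketch (the function name jump3 and the whole a = 3 construction are illustrative assumptions, not something from the thread; nothing is claimed about how well it converges):
Code:
\\ illustrative sketch only: a cube analogue of the quadratic "jump"
\\ (needs enough working precision for large p, cf. \p300 in the code below)
jump3(u, p) = {
  my(b, r);
  b = lift(Mod(u^3, p));            \\ b = u^3 mod p
  r = lift(Mod(b^3, p));            \\ r = b^3 mod p = u^9 mod p
  return(ceil((b^3 - r)^(1/3)));    \\ from (b-y)^3 == 0 mod p, solved over the reals; p divides b^3 - r
}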
#4
Jun 2021
3·17 Posts
Once again: why does it work? (Personally I don't know yet))
Code:
\p300
{
p=233108530344407544527637656910680524145619812480305449042948611968495918245135782867888369318577116418213919268572658314913060672626911354027609793166341626693946596196427744273886601876896313468704059066746903123910748277606548649151920812699309766587514735456594993207;
c=ceil(sqrt(p));
for(n=1,p,
  u=c+n;                \\ initial values
  b=lift(Mod(u^2,p)); a=lift(Mod(b^2,p));
  for(y=1,250,          \\ The riddle is here, in this cycle.
    t=ceil((b^2-a)^(1/2));
    b=lift(Mod(t^2,p)); a=lift(Mod(b^2,p));
    if(b<c, break()););  \\ break at sub sqrt residual
  localprec(7);
  z=(b/c/1.);
  if(z<1, print(z," ",t));
);
}
#5
Jun 2021
3×17 Posts
No one here? I'm talking to myself)) Let p=p+1260; then for the same t from the code above, the residuals will be *slightly* different, and this p is prime.
How small can the residuals be in this case? In other words, why can't we find small residuals for the prime p, while they stay small for our composite p?
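A minimal sketch of how one could compare the two cases numerically (the helper name resid and the choice of starting points are illustrative, not from the thread): run the same inner loop for the composite p from post #4 and for the prime p+1260, and compare the scaled residuals b/sqrt(p) for the same starting values.
Code:
\\ illustrative sketch: scaled residual b/ceil(sqrt(p)) after the same jump loop
\\ assumes sufficient precision has been set (e.g. \p300, as in the code above)
resid(p, u) = {
  my(c = ceil(sqrt(p)), b, a, t);
  b = lift(Mod(u^2, p)); a = lift(Mod(b^2, p));
  for(y = 1, 250,
    t = ceil(sqrt(b^2 - a));
    b = lift(Mod(t^2, p)); a = lift(Mod(b^2, p));
    if(b < c, break());
  );
  return(b / c * 1.);
}
\\ e.g. compare resid(p, c+n) with resid(p+1260, c+n) for a range of n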
#6
"Viliam Furík"
Jul 2018
Martin, Slovakia
13×61 Posts
Okay, I'm here. I've been watching occasionally since the first post.
Could you explain what it is supposed to do, what it does, and what the question is?
#7
Jun 2021
3×17 Posts
Factorization of numbers.
#8
"Viliam Furík"
Jul 2018
Martin, Slovakia
13·61 Posts
That's too vague.
Please explain the code in terms of the three questions asked:
1. What is it supposed to do? -> Describe the code and algorithm step by step.
2. What does it do? -> You may skip this part if the observed behaviour is the same as the expected behaviour.
3. What is the question? -> Explain the question "Why this code converge?", i.e. what do you mean by that, and what should an answer look like?
#9
Jun 2021
33₁₆ Posts
Ok!
1. The code finds values of t > sqrt(p) (p - any number, prime or composite) such that mod(t^2,p) < sqrt(p), and it does this in a very unusual way, far from the common approach. The algorithm is quite simple. Take some integer u > sqrt(p); let b = mod(u^2,p) and a = mod(b^2,p) = mod(u^4,p). [From (b-y)^2 == 0 mod p we get b^2 - 2*b*y + y^2 == 0 mod p, or a - 2*b*y + y^2 == 0. Solution: y = b - sqrt(b^2-a) (and y = b + sqrt(b^2-a); using the first). Make y an integer and compute t = b - y = ceil(sqrt(b^2-a)).] So t = ceil(sqrt(b^2-a)); t is some integer. Let u = t and go through all this again, in a cycle. After a few steps the value of b becomes less than sqrt(p) (or the cycle goes into some ring).
2. See 3.
3. Why the hell does this even work???
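For reference, the step described in point 1, restated as a stand-alone GP function (a restatement of the code already posted; the name jump is illustrative):
Code:
\\ one "jump" u -> t, as derived above
jump(u, p) = {
  my(b, a);
  b = lift(Mod(u^2, p));          \\ b = u^2 mod p
  a = lift(Mod(b^2, p));          \\ a = b^2 mod p = u^4 mod p
  return(ceil(sqrt(b^2 - a)));    \\ t = b - y with y = b - sqrt(b^2 - a); note p divides b^2 - a
}
\\ iterate u = jump(u, p) until lift(Mod(u^2, p)) < sqrt(p), or until the values enter a ring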
#10
"Serge"
Mar 2008
Phi(4,2^7658614+1)/2
89·113 Posts
Well, how do you know that it actually works? Did you try it on all input values up to some limit, so that you could then declare "I tested it for all values up to 10^9 and it converged every time"?
Maybe if you did that, there could be some discussion. Until you do that, how would you convince people that there is anything worth spending their time on, or even worth considering reading past the 1st post? All you are showing is "I tested this on one number! Look!" Everyone says "meh" and moves on to reading something else.
#11
Apr 2020
39D₁₆ Posts
Write \(\left\lceil \sqrt{b^2-a} \right\rceil\) as \(\sqrt{b^2-a}+\epsilon\), where \(\epsilon\) is between 0 and 1. Let's see what happens when we square this. We get \(b^2-a+2\epsilon\sqrt{b^2-a}+\epsilon^2\).
We know that \(b^2-a\) is a multiple of p, so the value of b on the next iteration is at most (and turns out to be equal to) \(2\epsilon\sqrt{b^2-a}+\epsilon^2\), which is around \(2\epsilon b\) (until b gets close to sqrt(p), when it will tend to be smaller; for b < sqrt(p) it will be 0). In other words, we multiply b by \(2\epsilon\) to get the new value of b.

If \(\epsilon\) behaved like a uniformly random number between 0 and 1, then we would expect values of b to decrease in the long term (exercise: what is the expected rate of decrease?). From the Taylor expansion we see that \(\epsilon\) is roughly equal to the fractional part of \(\frac{a}{2b}\). For small b this does behave essentially like a uniformly random number between 0 and 1; for some larger b it is skewed towards the smaller end, but this still means we expect b to fall.

There may be a neater explanation; this is just the first thing I came up with.

Last fiddled with by charybdis on 2021-07-29 at 02:35
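For the exercise, a quick back-of-the-envelope computation (added here, under the simplifying assumption above that \(\epsilon\) is uniform on (0,1)): since each step multiplies b by roughly \(2\epsilon\),
\[
\mathbb{E}\bigl[\log(2\epsilon)\bigr] = \int_0^1 \log(2\epsilon)\,d\epsilon = \log 2 - 1 \approx -0.307,
\]
so under this model b shrinks by a factor of about \(2/e \approx 0.74\) per iteration in the geometric-mean sense, i.e. it loses roughly 0.13 decimal digits per jump.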