mersenneforum.org #of divisors Series

 2022-08-06, 03:44 #23 Viliam Furik     "Viliam Furík" Jul 2018 Martin, Slovakia 31916 Posts

I've made a program that calculates the next number according to the rule n * d(n) / gcd(n, d(n))^2, and ran it for all numbers up to 1,000,000. For every number it computes the chain of successors until it reaches a number it has already seen (all numbers and their successors are stored), then continues with the next unseen number in the range. Because the chains overshoot the range, it went through another 1,205,464 numbers beyond the limit.

The highest number reached is 13,432,298,880 (which fell back down to 839,518,680). The highest jump ratio (ratio of increase in one step) is 128, achieved by 2,297,295 jumping to 294,053,760. The highest fall ratio (ratio of decrease) is 720, achieved by 103,783,680 falling to 144,144. These jump and fall ratios can be arbitrarily large. For every prime p > 2, the following pattern appears in its chain:

2^(p-1) ----[* p]---> 2^(p-1) * p^1 ----[/ (2*p)]---> 2^(p-2)

(what happens next depends on the factors of p-2, so the pattern ceases to be general there). For example, with p = 7:

2^(7-1) --> 2^(7-1) * 7^1 --> 2^(7-2)

The exponent can also be a product of single powers of odd primes, which makes the jump that much greater, and the fall much greater still, as the additional primes bring more 2's into the factorization of the number of divisors, leading to division not by 2*p but by a higher power of 2 times the product of those primes.

All numbers searched in this range ended up in some cycle; none shot off to infinity. I haven't yet managed to make a working search program for the longest cycle, but when I get it working, I will post the results.

I have also searched up to 10,000,000, with everything the same as for 1,000,000, just the upper limit set higher. The record jump still belongs to 2,297,295, but the record fall now belongs to 11,174,042,880, with a fall ratio of 2304. There were 12,885,164 numbers searched above 10,000,000 (that count contains everything that was already searched above 1,000,000). The highest number reached is 134,396,734,080.

There seems to be an interesting pattern in the highest number reached: it was 13,432.29888 times the 1,000,000 limit and 13,439.673408 times the 10,000,000 limit. I also did a search under 100,000, where the highest number reached is 1,339,578,240, which is 13,395.7824 times the limit. The starting numbers that reached those maxima are 99,671 (under 100,000) and 999,427 (under 1,000,000). These ratios seem to converge: for the limit 4,000,000, the ratio of the highest reached number to the limit is 13,439.50608, just a tad lower than the ratio for the 10,000,000 limit.
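The rule and the chain search above can be sketched in a few lines of Python (a minimal sketch with names of my own choosing; d(n) is computed by plain trial division, so this is only practical for small starting values, unlike the sieved program later in the thread):

```python
from math import gcd

def num_divisors(n):
    """d(n) by trial division."""
    count, p = 1, 2
    while p * p <= n:
        e = 0
        while n % p == 0:
            n //= p
            e += 1
        count *= e + 1
        p += 1
    if n > 1:
        count *= 2  # one prime factor left over
    return count

def S(n):
    """One step of the rule n -> n * d(n) / gcd(n, d(n))^2."""
    t = num_divisors(n)
    return n * t // gcd(n, t) ** 2

def chain(n):
    """Follow the sequence from n until a value repeats; return the chain."""
    seen, out = set(), []
    while n not in seen:
        seen.add(n)
        out.append(n)
        n = S(n)
    return out
```

For example, S(2297295) is 294053760, the record jump mentioned above: 2,297,295 = 3^3 * 5 * 7 * 11 * 13 * 17 has 128 divisors and is coprime to 128.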
 2022-08-08, 16:16 #24 henryzz Just call me Henry     "David" Sep 2007 Liverpool (GMT/BST) 37×163 Posts

Interesting. I have modified bur's PARI/GP code to support the new form and to detect loops (though not their length).

Code:
tau(n) =
{
    t = 1;
    f = factor(n);
    for(i = 1, matsize(f)[1], t *= f[i,2] + 1);
    return(t);
}

series(norig, j) =
{
    i = 0;
    loop = 0;
    n = norig;
    until(i == j,
        t = tau(n);
        \\ For squarefree experiments: check whether any prime factor of tau
        \\ exceeds 2000. Should be commented out most of the time.
        \\ if (vecmax(factor(t)[,1]) > 2000, print("t too large"); break());
        n = n*t/(gcd(n,t)^2);
        \\ print(n, " ", factor(n/norig));
        i++;
        if (n == loop, break());
        \\ Checkpoints for loop detection:
        if (i == 5, loop = n);
        \\ if (i == 10, loop = n);
        if (i == 25, loop = n);
        if (i == 250, loop = n);
        if (i == 2500, loop = n);
    );
}

Primes p >= 11 seem to create longer sequences. p^4 also can appear for p >= 7. I would suggest trying to limit the search area and work out how to loop over the reduced search area. For example, some classes of squarefree numbers could be removed. Assuming n is squarefree with no small factors that interact with the sequence:

n = x*y   -> 2^2*n -> 3*n -> 2^3*3*n -> 2^2*3*n -> 2*n -> 2^2*n
n = x*y*z -> 2^3*n -> 2^2*n -> 2*3*n -> 2^4*3*n -> 3*5*n -> 2^5*3*5*n -> 2*5*n -> 2^4*5*n -> n

I ran the following code, which tries to terminate squarefree sequences with up to 926 factors. To check that no prime factor of tau reached 2000, I enabled the extra check above.

Code:
x = 1;
forprime(p = 2007, 10000,
    x = x*p;
    series(x, 10000);
)

This doesn't eliminate all cases though, as n can have small factors that interact with the extra factors that are varying.
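The fixed checkpoints at i = 5, 25, 250, 2500 in the PARI code are reminiscent of Brent's cycle-finding algorithm, which moves the saved value at exponentially growing intervals and also yields the cycle length. A hedged Python sketch of that idea (function names are mine; S is the step rule from the thread):

```python
from math import gcd

def num_divisors(n):
    """d(n) by trial division."""
    count, p = 1, 2
    while p * p <= n:
        e = 0
        while n % p == 0:
            n //= p
            e += 1
        count *= e + 1
        p += 1
    if n > 1:
        count *= 2
    return count

def S(n):
    t = num_divisors(n)
    return n * t // gcd(n, t) ** 2

def cycle_length(n, max_steps=100000):
    """Brent-style cycle detection: keep a checkpoint value and double
    the checkpoint interval each time it is exhausted. Returns the
    length of the cycle the sequence from n falls into, or None if
    no cycle was found within max_steps."""
    power = lam = 1
    checkpoint, cur = n, S(n)
    while cur != checkpoint:
        if power == lam:        # interval used up: move the checkpoint
            checkpoint = cur
            power *= 2
            lam = 0
        cur = S(cur)
        lam += 1
        max_steps -= 1
        if max_steps == 0:
            return None
    return lam
```

For example, 3 -> 6 -> 6 gives a 1-cycle, and 25 -> 75 -> 50 -> 75 gives a 2-cycle.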
 2022-08-09, 15:59 #25 Viliam Furik     "Viliam Furík" Jul 2018 Martin, Slovakia 13×61 Posts

I have gone through the prime powers from p^2 to p^9, manually on paper (quite easy, and I could have done more, but I decided to make the program for this instead). I found that they all end in a cycle, and also discovered that when the power is one less than a prime q, the sequence goes into a 2-cycle alternating between q^1 * p^(q-1) and 2^1 * p^(q-1). Since the exponents of q and 2 are always 1 and q-1 stays fixed, each step removes one common part (the gcd of n and its number of divisors) and replaces it with the other.

For the program, I've used Python and stored the factorizations in dictionaries with primes as keys and their exponents as values. Unspecified primes can be handled as a side dictionary that basically never changes, so it can act as a constant: the product of (exponent + 1) over the unspecified primes (when this constant is 1, the program works as usual, with all primes specified). I haven't written this code yet, but I have the idea worked out. The program does everything as usual but multiplies the number of divisors by this constant, and then goes on. That creates a pattern that always holds, except when the unspecified primes coincide with primes that appear specified in the chain. Because the side dictionary behaves as a changeable constant, the constant can be used directly, and any pattern in which the unspecified primes contribute this constant to the number of divisors will hold (for all unspecified primes different from the primes that appear specified in the chain). The unspecified primes must of course always be different from each other if denoted so.

I've also found a nice property: if an odd number N becomes X in this sequence, then 2 * N will become X as well. That is because the factor 2 goes directly to 1: it is 2^(2-1), so it divides itself out. It therefore cannot affect the outcome, as long as it appears only to the first power.
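The claimed 2-cycle can be checked numerically. A quick sketch (S as defined in the thread): p^(q-1) first steps to q * p^(q-1), which then alternates with 2 * p^(q-1) forever.

```python
from math import gcd

def num_divisors(n):
    """d(n) by trial division."""
    count, p = 1, 2
    while p * p <= n:
        e = 0
        while n % p == 0:
            n //= p
            e += 1
        count *= e + 1
        p += 1
    if n > 1:
        count *= 2
    return count

def S(n):
    t = num_divisors(n)
    return n * t // gcd(n, t) ** 2

# p^(q-1) enters the 2-cycle  q * p^(q-1)  <->  2 * p^(q-1)
# for odd primes p != q:
for p, q in [(5, 3), (3, 5), (7, 3), (11, 7)]:
    n = p ** (q - 1)
    a = S(n)                 # first step multiplies by d(n) = q
    assert a == q * n
    assert S(a) == 2 * n     # the q is swapped for a 2...
    assert S(S(a)) == a      # ...and back again
```

For instance 25 = 5^2 goes 25 -> 75 -> 50 -> 75 -> 50 -> ...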
I've tried to find some sort of multiplicative property in this series, something like S(a * b) = S(a) * S(b) for a, b such that gcd(a, b) = 1, but no success yet: S(a * b) = S(a) * S(b) sometimes holds, but sometimes the product a * b divides down further. It is a question of when gcd(a * b, d(a * b)) = gcd(a, d(a)) * gcd(b, d(b)). Since d(a * b) = d(a) * d(b) holds for coprime a and b, the numerator is straightforward, but the denominator makes a mess:

S(a) * S(b) = [a * d(a) / gcd(a, d(a))^2] * [b * d(b) / gcd(b, d(b))^2]
            = a * b * d(a) * d(b) / (gcd(a, d(a))^2 * gcd(b, d(b))^2)

S(a * b) = a * b * d(a * b) / gcd(a * b, d(a * b))^2

As d(a * b) = d(a) * d(b), the numerators are equal. The denominators are not necessarily equal.

Also, it seems that numbers of the form 3^2 * p^1 (p can be 3) are not S(n) for any n. That also seems to be true for 5^4 * p^1 (except for p = 5: 3125 = 5^5 is S(225000)). I've searched in the output from the search under 4,000,000, so some numbers which have not been searched might have these as their S(n). If these values truly are never hit, that would mean not all numbers belong to a cycle; they would be only leaf nodes in the total graph produced. But it wouldn't affect the question of whether the graph is connected, or whether there are numbers in entirely disconnected cycles. Whether there are numbers that shoot off to infinity is related to whether all numbers eventually end in some cycle, but that's independent of the graph of S(n) being connected.

64 also seems not to be produced from any number, which is very odd, as it's a power of 2, and these are quite special: every p^1 has 2 divisors, making 2 very common in the factorization of d(n) for n of bounded size (I think it might have non-zero density amongst all the values of d(n)), and S(2) = 1.
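The denominator condition can be probed numerically. A small sketch (S as defined in the thread) with one case of each kind:

```python
from math import gcd

def num_divisors(n):
    """d(n) by trial division."""
    count, p = 1, 2
    while p * p <= n:
        e = 0
        while n % p == 0:
            n //= p
            e += 1
        count *= e + 1
        p += 1
    if n > 1:
        count *= 2
    return count

def S(n):
    t = num_divisors(n)
    return n * t // gcd(n, t) ** 2

# gcd(15, d(15)) = gcd(15, 4) = 1 = gcd(3, d(3)) * gcd(5, d(5)),
# so equality holds here:
assert S(3 * 5) == S(3) * S(5)        # both sides are 60

# gcd(12, d(12)) = gcd(12, 6) = 6, but gcd(4, d(4)) * gcd(3, d(3)) = 1,
# so the product divides down much further:
assert S(4 * 3) == 2                  # 12 * 6 / 6^2
assert S(4) * S(3) == 72              # 12 * 6
```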
 2022-08-14, 03:09 #26 Viliam Furik     "Viliam Furík" Jul 2018 Martin, Slovakia 79310 Posts

In case you (or anyone else) would like to have a go with my code and try something:

Code:
from math import gcd
import gmpy2
import time


def gmpysieve(pm):
    sieve_limit = gmpy2.isqrt(pm) + 1
    pm += 1
    bitmap = gmpy2.xmpz(3)
    bitmap[4:pm:2] = -1
    for p in bitmap.iter_clear(3, sieve_limit):
        bitmap[p*p:pm:p+p] = -1
    return bitmap


def primefactorization(factorprimes, n):
    ncopy = n
    factorization = {}
    for p in range(2100):
        if p * p > ncopy:
            break
        if not factorprimes[p]:
            if ncopy % p == 0:
                factorization[p] = 0
                while ncopy % p == 0:
                    factorization[p] += 1
                    ncopy = ncopy // p
    if ncopy > 1:
        if ncopy > 4000000:
            print(n, ncopy, "probable_factorization_failure")
        factorization[ncopy] = 1
    return factorization


start = time.perf_counter_ns()

primes = gmpysieve(4000001)
factorprimes = gmpysieve(2100)

series = {1: 1, 2: 1, 3: 6, 4: 12, 5: 10}
for a in range(6, 4000001):
    series[a] = 0
for b in range(3, 4000001):
    if not primes[b]:
        series[b] = 2 * b
        series[2 * b] = 2 * b
print("check1")

for c in range(6, 4000001):
    if c == 400000:
        print("check2")
    if series[c] == 0:
        num = c
        while True:
            if series[num] != 0:
                break
            numfactorization = primefactorization(factorprimes, num)
            numfactors = 1
            for d in numfactorization:
                numfactors *= numfactorization[d] + 1
            commonpart = gcd(num, numfactors)
            nextnum = num // commonpart
            nextnum *= numfactors // commonpart
            series[num] = nextnum
            if nextnum not in series:
                series[nextnum] = 0
            num = nextnum
print("check3")

# file0 = open("output_dict_10000000.txt", "a")
file1 = open("nextnumbers_4000000.txt", "a")
# file0.write(str(series))
for e in series:
    file1.write(str(e) + ";" + str(series[e]) + "\n")

end = time.perf_counter_ns()
print(end - start)

The factorization error message is there because of a theoretical possibility of the factorization containing a composite number whose factors are bigger than the primes tested in finding the factorization.
It shouldn't be possible, as the factors are preserved from the numbers and their numbers of divisors, but it's there just in case. The gmpysieve function returns the primes under pm in bitmap form, with the i-th bit representing the number i: 1 means composite, 0 means prime. The factorization itself is a big part of the running time, so I've made it as fast as I could. There might still be room to improve, but probably only by a tiny amount until we hit some other bottleneck.

And also my sorting program:

Code:
file1 = open("nextnumbers_4000000.txt", "r")
file2 = open("nextnumbers_4000000_sorted.txt", "a")

pairs = []
for a in file1:
    b = a.split(";")
    pairs.append([int(b[0]), int(b[1])])
pairs.sort(key=lambda x: x[0])

for c in pairs:
    file2.write(str(c[0]) + ";" + str(c[1]) + "\n")

Just replace the 4,000,001 with whatever bound you want + 1, and the 2100 with the square root of that bound, plus a bit.
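The successor dictionary this program writes out is already everything needed for the longest-cycle search mentioned in post #23. A sketch of my own (hypothetical helper name, not from the thread) that extracts every cycle from such a map by walking each chain once:

```python
def find_cycles(succ):
    """Extract all cycles from a successor map n -> S(n).

    Each number is visited once: walk forward until we hit a number
    outside the map, an already-finished number, or the current path
    itself (which means we just closed a brand-new cycle)."""
    done = set()
    cycles = []
    for start in succ:
        if start in done:
            continue
        path, pos, n = [], {}, start
        while n in succ and n not in done and n not in pos:
            pos[n] = len(path)
            path.append(n)
            n = succ[n]
        if n in pos:                 # walked into our own path: new cycle
            cycles.append(path[pos[n]:])
        done.update(path)
    return cycles

# Toy successor map (the S-values of 1..6 plus 12); 5 -> 10 leaves the map:
succ = {1: 1, 2: 1, 3: 6, 4: 12, 5: 10, 6: 6, 12: 2}
print(find_cycles(succ))             # [[1], [6]]
longest = max(find_cycles(succ), key=len)
```

The dict written to nextnumbers_4000000.txt is exactly such a map, so the same function should work on the real data after reading the "n;S(n)" pairs back in.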
