preda ("Mihai Preda"), 2020-07-30, 02:02, #17
A simple information-theoretic argument, based on entropy, suggests that almost all N-bit numbers have a multiple of about 2N bits with a binary Hamming weight (HW) of about N/10.

(below, log() is in base 2).

It goes like this: if the probability of a bit being 1 is p (and of being 0 is q = 1 - p), then the information content of such a bit is
p*log(1/p) + q*log(1/q).
(Thus the information content of an N-bit number with p = 1/2 is N bits.) The information content of a number with "loaded" bits (i.e. p != 1/2) is lower per bit, so the total number of bits must grow to compensate. It turns out that at p = 1/10 the bit-count needs to roughly double, because 1/10*log(10) + 9/10*log(10/9) = 0.469 ≈ 0.5, i.e. each bit carries only about half a bit of information. So overall, a 5x reduction of the HW is bought with a doubling of the bit-length.
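To make the arithmetic concrete, here is a quick Python sketch (the helper name H is mine, purely for illustration):

[CODE]
from math import log2

def H(p):
    # Binary entropy in bits: p*log(1/p) + q*log(1/q), with q = 1 - p.
    q = 1 - p
    return p * log2(1 / p) + q * log2(1 / q)

print(H(1 / 10))      # ~0.469: information per bit when p = 1/10
print(1 / H(1 / 10))  # ~2.13: the bit-length must roughly double
[/CODE]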

This argument holds for arbitrary N-bit numbers. OTOH, primorials are not arbitrary numbers at all: the information content of an N-bit primorial is not N bits but about log(N) bits, thus much lower. (This can be seen as follows: to transmit a primorial Q#, it's enough to transmit just the value Q; since Q is O(log(Q#)) = O(N) in magnitude, it fits in O(log(N)) bits.)
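A toy illustration of the "transmit just Q" point; the trial-division primorial() helper below is mine, just for the sketch:

[CODE]
def primorial(q):
    # Product of all primes <= q (i.e. Q#), by trial division; fine for small q.
    result = 1
    for n in range(2, q + 1):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            result *= n
    return result

Q = 97
P = primorial(Q)
print(P.bit_length())  # N, the bit-length of 97# (121 bits)
print(Q.bit_length())  # 7 bits, roughly log(N): enough to reconstruct 97# in full
[/CODE]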

Because of this "low information content" of primorials, it is not excluded that they might have multiples of much lower HW than the above information-theoretic argument predicts (but it is not implied either).
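One crude way to poke at this empirically: scan small multipliers k and watch the Hamming weight of k*P. This brute-force loop is my own toy experiment, not a serious method; if much-lower-HW multiples exist, they would presumably need far larger or structured multipliers.

[CODE]
from functools import reduce

# 47#, small enough to scan quickly.
P = reduce(lambda a, b: a * b,
           [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47])

best_hw = bin(P).count("1")
for k in range(2, 1_000_000):
    hw = bin(k * P).count("1")
    if hw < best_hw:
        best_hw = hw
        print(f"k={k}: HW(k*47#)={hw} over {(k * P).bit_length()} bits")
[/CODE]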
