Pay attention when you use entropy. Entropy rests on the idea that the amount of information in a binary string S is Log(S), the number of bits needed to code the string directly. This is wrong: the correct measure of information is K(S), the Kolmogorov complexity of S.
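K(S) is uncomputable, but any off-the-shelf compressor gives a computable upper bound on it, and even that crude bound shows how far the direct coding length can sit from the real information content. A minimal sketch of this idea (my own illustration, using Python's standard zlib module, not any specific paper's method):

```python
import zlib

# A highly structured binary string S of length 1,000,000 bits,
# stored here with one ASCII character per bit.
s = ("01" * 500_000).encode("ascii")

# Direct coding length: one bit per symbol of S (the "Log(S)" accounting).
direct_bits = len(s)

# zlib gives a computable upper bound on K(S), up to an additive constant;
# its output is measured in bytes, so convert to bits.
k_upper_bound_bits = len(zlib.compress(s, 9)) * 8

print(f"direct coding      : {direct_bits} bits")
print(f"upper bound on K(S): {k_upper_bound_bits} bits")
# Prints roughly 1,000,000 vs a few thousand: on structured input,
# K(S) falls orders of magnitude below the direct coding length.
```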
Today the scientific literature does not focus on this difference and very often uses Log(S) instead of K(S) (K being the Kolmogorov complexity function). The substitution is justified on two grounds: K(X) is uncomputable, and in a purely mathematical context K(X) is well approximated by Log(X) for the vast majority of strings.
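The mathematical side of that justification is a standard counting argument; a sketch of it, in the usual notation:

```latex
% Programs shorter than n - c bits are scarce: there are at most
% 2^0 + 2^1 + ... + 2^{n-c-1} = 2^{n-c} - 1 of them, so fewer than
% 2^{n-c} of the 2^n strings of length n can satisfy K(S) < n - c:
\[
  \#\{\, S \in \{0,1\}^n : K(S) < n - c \,\} \;<\; 2^{\,n-c},
\]
\[
  \Pr_{S \sim \{0,1\}^n}\bigl[ K(S) < n - c \bigr] \;<\; 2^{-c}.
\]
% Hence, if every string of length n were equally possible, almost all
% of them would have K(S) close to n, the direct coding length.
```

The combinatorics here is correct; the hidden premise, as argued next, is that every string of length n is treated as actually possible.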
This wrong assumption is made because people think that every object, every binary string, can actually occur. It cannot: only a tiny fraction of the bit strings of length 1000000 can ever occur, because a system that produces 2^1000000 distinct objects or strings cannot exist. (For scale, the observable universe holds roughly 10^80 atoms, about 2^266, unimaginably far below 2^1000000.) What this means is that real objects are always small! The objects that actually occur sit in the regime where K(X) stays well below Log(X).
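To put rough numbers on the claim that a system producing 2^1000000 objects cannot exist, here is a back-of-the-envelope sketch; the constants are standard order-of-magnitude estimates, and the scenario is deliberately over-generous:

```python
from math import log2

# Deliberately over-generous upper bounds (standard order-of-magnitude values):
particles   = 10 ** 80   # atoms in the observable universe
ops_per_sec = 1.8e43     # one event per Planck time (~5.4e-44 s)
seconds     = 4.35e17    # age of the universe (~13.8 billion years)

# Even if every particle emitted one distinct string per Planck time
# for the whole age of the universe:
max_objects = particles * ops_per_sec * seconds

print(f"max distinct objects ~ 10^{log2(max_objects) / log2(10):.0f}")
print(f"index size           ~ {log2(max_objects):.0f} bits")
# ~10^141 objects, about 2^468: any string that ever actually occurs
# can be indexed with fewer than 500 bits, astronomically far from
# the 2^1000000 possibilities the uniform assumption pretends exist.
```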