Cyan,
> I can’t really process this query until you relate the words you’ve used to the math MacKay uses
On Page 1, MacKay posits x as the bit-sequence of an individual. Pick an individual at random. The question at hand is whether the Shannon entropy of x, for that individual, decreases at a rate of O(1) bits per generation.
This would be one way to quantify the information-theoretic adaptive complexity of an individual’s DNA.
In contrast, if for some odd reason you wanted to measure the total information-theoretic adaptive complexity of the entire species, then as the population N → ∞, the total amount of information maxes out in one generation (since anyone with a calculator, a lot of spare time, and access to the entire population could, more or less, deduce the entire environment after one generation).
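One rough way to see the per-individual claim numerically is to simulate a toy model in the spirit of MacKay's setup and track an entropy estimate per generation. Everything below is my own simplification, not MacKay's exact model: fitness is the number of bits matching a fixed all-ones target, selection is truncation (fittest half survives), reproduction is asexual with a per-bit mutation rate of 1/G, and H(x) is upper-bounded by summing per-locus binary entropies as if loci were independent.

```python
import random, math

G = 64      # genome length in bits (chosen for illustration)
N = 200     # population size
GENS = 10   # generations to simulate
random.seed(0)

target = [1] * G  # the "environment": fitness = number of bits matching it

def fitness(x):
    return sum(1 for a, b in zip(x, target) if a == b)

def entropy_bits(pop):
    # Upper-bound estimate of H(x) for a random individual: sum of
    # per-locus binary entropies, treating loci as independent.
    H = 0.0
    for i in range(G):
        p = sum(x[i] for x in pop) / len(pop)
        if 0 < p < 1:
            H += -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return H

pop = [[random.randint(0, 1) for _ in range(G)] for _ in range(N)]
history = []
for g in range(GENS):
    history.append(entropy_bits(pop))
    pop.sort(key=fitness, reverse=True)
    survivors = pop[: N // 2]
    # Each survivor leaves two offspring; each bit flips with prob 1/G.
    pop = [[b ^ (random.random() < 1 / G) for b in x]
           for x in survivors for _ in range(2)]

print([round(h, 1) for h in history])
```

The initial random population sits near the maximum of G bits of entropy, and the estimate falls generation by generation as selection drives allele frequencies toward the target; how fast it falls, and whether that rate is O(1) bits per generation, is exactly the question at issue.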