“Information” is a word loaded with associations (including Shannon information, Kolmogorov information, and various lesser-known variants). I would suggest switching to a different, less-loaded term. You seem to be using “information” to mean something like “tech level” in a game like Civilization.
Regarding “tech level”: using only one dimension for this notion may put blinkers on your thinking. I’ve previously argued that many people are similarly blinkered by using only one dimension for “intelligence”, which futurist rhetoric uses in much the same way as your “tech level”.
I want to echo Johnnicholas; your first task is to nail down your definition of information.
Beyond that I can’t help, but I’m trying something sort of similar: modeling life and evolution at the information-theoretic level, starting from a simple control system. I think of the control system in terms of its negentropy/entropy flows, allow mechanisms by which it can become more complex, and try to include a role for its “understanding of its environment”, which I represent by the smallness of the KL divergence of its (implicit) assumptions about the environment from the environment’s true distribution.
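To make that last measure concrete: for discrete distributions it’s just the ordinary KL divergence. Here’s a minimal sketch in Python, with a toy four-state environment; every number is made up for illustration, and nothing about it is settled formalism.

    import numpy as np

    def kl_divergence(p, q):
        """D_KL(p || q), in bits, for discrete distributions given as probability arrays."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        mask = p > 0  # terms with p = 0 contribute nothing
        return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

    # Toy environment with four states.
    env_true   = [0.70, 0.15, 0.10, 0.05]  # the environment's true distribution
    unit_model = [0.40, 0.30, 0.20, 0.10]  # the unit's implicit assumptions

    # Divergence of the unit's assumptions from the truth; the smaller it is,
    # the better the unit's "understanding of its environment".
    print(kl_divergence(unit_model, env_true))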
So you have a simulator in which you implement this control system? That sounds elaborate. I’d like to hear more about it.
Heh, I don’t “have” anything yet; I’m just at the formalism stage. But the idea is that there are units (the control systems) operating within an environment, which draws its state from a lawful distribution (as nature does); the environment’s state then affects what the units sense, as well as their structure and integrity. Depending on what the units do with the sensory data, they can be effective at controlling certain aspects of themselves, or instead go unstable. The plan is also to allow modification of the structure of the control systems, and their replication, so as to see evolution at work. A toy skeleton of what I mean is sketched below.
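All names and constants here are hypothetical, just to fix ideas, and there’s no replication or structural mutation yet:

    import random

    class Environment:
        """Draws its state from a fixed, lawful distribution (a stand-in for nature)."""
        def step(self):
            return random.gauss(0.0, 1.0)  # the "lawful distribution"

    class Unit:
        """A control system holding an internal variable near a set point of zero;
        persistent deviation erodes its integrity (the analogue of dying)."""
        def __init__(self, gain, sensor_noise=0.1):
            self.gain, self.sensor_noise = gain, sensor_noise
            self.internal, self.integrity = 0.0, 1.0

        def step(self, disturbance):
            perception = self.internal + random.gauss(0.0, self.sensor_noise)  # noisy sensing
            error = 0.0 - perception                   # comparator against the set point
            self.internal += disturbance + self.gain * error
            self.integrity = max(0.0, self.integrity - 0.005 * abs(self.internal))

    env = Environment()
    units = [Unit(gain=0.8), Unit(gain=2.5)]  # the second gain is unstably high
    for _ in range(100):
        d = env.step()
        for u in units:
            u.step(d)
    print([round(u.integrity, 2) for u in units])

The unit whose gain is too high amplifies its own corrections and destroys itself: that’s the “go unstable” outcome.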
As for modeling the control systems, my focus is first on being able to express what’s going on at the information-theoretic level, where it really gets interesting: there’s a comparator, which must generate sufficient mutual information with the parameter it’s trying to control, else it’s “comparing” to a meaningless value. There are the disturbances, which introduce entropy and destroy mutual information with the environment. There’s the controller, which must use up some negentropy source to maintain the system’s order and keep it from equilibrium (as life and other dissipative systems must). And there’s the system’s implicit model of its environment (including the other control systems), whose accuracy is represented by the KL divergence between the distributions.
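As a toy check that the comparator constraint bites, here’s a histogram estimate of the mutual information between a controlled parameter and what the comparator actually receives, at two disturbance levels (the estimator is crude and all numbers are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)

    def mutual_information(x, y, bins=16):
        """Crude histogram estimate of I(X; Y) in bits."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)  # marginal of X
        py = pxy.sum(axis=0, keepdims=True)  # marginal of Y
        mask = pxy > 0
        return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px * py)[mask])))

    param = rng.normal(size=50_000)    # the parameter being controlled
    for noise in (0.1, 3.0):           # weak vs. overwhelming disturbance
        sensed = param + rng.normal(scale=noise, size=param.size)
        print(f"noise={noise}: ~{mutual_information(param, sensed):.2f} bits per reading")

With weak disturbance the comparator gets a few bits per reading; with strong disturbance it gets almost none, and it really is “comparing” to a meaningless value.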
I don’t expect I’ll make something completely new, but at least for me, it would integrate my understanding of thermodynamics, life, information theory, and intelligence, and perhaps shed light on each.
I define “raw information”, as used in other parts of the model, more precisely, in ways that are supposed to map onto Shannon information or Kolmogorov information. I used the phrase “tech level” because my initial expectation is that power is proportional to the log of raw information. Some of my data concerning the rate of progress instead measures something more like “perceived social change” or “useful information”; that is what I called “tech level”, and it seems to be the log of raw information.
It may be that “useful information” really is Shannon information, and “raw information” is uncompressed, redundant information; and that this accounts for observations that “useful information” appears to be the log of “raw information”. For instance, we have an exponential increase in the number of genes sequenced; but probably a much-less-than-linear increase in the number of types of genes known. We have an exponential increase in journal articles published; but the amount of independent, surprising information in each article may be going down.
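One way the log relationship could be made precise, in the maximally redundant limiting case: take “raw information” to be uncompressed size and “useful information” to be algorithmic content. For a corpus that is just one motif s repeated n times, a standard Kolmogorov-complexity bound gives

    \text{raw} = n \cdot |s|, \qquad K(s^n) \le K(s) + K(n) + O(1) \approx K(s) + \log_2 n,

so the useful information grows roughly like the log of the raw size. Real corpora are only partly redundant, so this is an extreme case, not a derivation.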
A (thermal, say) random number generator is easy to build and a good source of both Shannon and algorithmic (Kolmogorov) information. Having lots of information in these senses is not helpful for winning battles.
True. However, I’m considering information that’s not at all random, so I don’t think that’s a problem.
Regarding my earlier “probably a much-less-than-linear increase in the number of types of genes known”: I should clarify. We still have an exponential increase in the number of protein families known, but a less-than-linear increase in the number of protein domains known. Proteins are composed of modules called “domains”; a protein contains from one to dozens of domains. Most “new” genes code for proteins that recombine previously-known domains in different orders.
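The combinatorics make those two growth rates compatible. With D known domains and proteins of up to L domains in order, the number of possible arrangements is

    \sum_{k=1}^{L} D^k = \frac{D^{L+1} - D}{D - 1} \approx D^L,

so the count of distinct proteins can keep growing exponentially even while the inventory of domains, D, barely grows. (A toy upper bound, of course; it ignores which combinations are chemically viable.)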
A digression: Almost all of the information content of an organism resides in the amino-acid sequences of these domains; and a lower bound of about 64% of domains (and 84% of those found in eukaryotes) evolved before eukaryotes (which include all multicellular organisms) split from prokaryotes about 2 billion years ago. (One source: Michael Levitt, “Nature of the protein universe”, PNAS, July 7, 2009.) So it’s accurate to say that most of evolution occurred in the first billion years; the development of more-complex organisms seems to have nearly frozen evolution of the basic components. We would likely be more complex today if those ancient prokaryotes had been able to hold off evolving into eukaryotes for another billion years, so that they could develop more protein domains first. There’s a lesson for aspiring singularitarians in there somewhere.
(Similarly, most evolution within eukaryotes seems to have occurred during a period of about 50 million years, just before the Cambrian explosion, half a billion years ago. Evolution has been slowing down in information-theoretic terms, while speeding up in terms of intelligence produced.)