I want to echo Johnnicholas; your first task is to nail down your definition of information.
Beyond that, I can’t help, but I’m trying something sorta-similar. I’m trying to model life and evolution at the information-theoretic level by starting from a simple control system. I think of the control system in terms of its negentropy/entropy flows, allow mechanisms by which it can become more complex, and try to include a role for its “understanding of its environment,” which I represent by the smallness of the KL divergence between the environment’s true distribution and the system’s (implicit) assumptions about it.
So, you have a simulator in which you implement this control system? That sounds elaborate. I’d like to hear more about it.
Heh, I don’t “have” anything yet; I’m just at the formalism stage. But the idea is that there are units (the control systems) operating within an environment, and the environment draws its state from a lawful distribution (as nature does); that state then affects what the units sense, as well as their structure and integrity. Depending on what the units do with the sensory data, they can be effective at controlling certain aspects of themselves, or instead go unstable. The plan is to also allow for modification of the control systems’ structure and for their replication (to see evolution at work).
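To make the “effective control vs. going unstable” point concrete, here’s a minimal toy sketch of one such unit: a single proportional controller holding an internal variable near a setpoint while the environment disturbs it each step. Everything here (the function name, the Gaussian disturbances, the single scalar state) is an illustrative assumption of mine, not the actual formalism:

```python
import random

def run_unit(setpoint, gain, steps=200, noise=0.5, seed=0):
    """Toy 'unit': a proportional controller trying to hold an internal
    state near a setpoint while the environment disturbs it each step.
    Returns the mean absolute control error (low = stable control)."""
    rng = random.Random(seed)
    state = 0.0
    errors = []
    for _ in range(steps):
        disturbance = rng.gauss(0.0, noise)   # environment's lawful (here Gaussian) draw
        sensed = state + rng.gauss(0.0, 0.1)  # sensing is itself noisy
        action = gain * (setpoint - sensed)   # comparator + controller in one line
        state = state + action + disturbance
        errors.append(abs(setpoint - state))
    return sum(errors) / len(errors)

print(run_unit(setpoint=1.0, gain=0.8))  # moderate gain: error stays bounded
print(run_unit(setpoint=1.0, gain=2.5))  # excessive gain: the unit goes unstable
```

The instability isn’t hand-tuned: with gain g, the state contracts by a factor (1 − g) each step, so any g above 2 makes the error grow without bound, which is the kind of failure mode I mean by a unit “going unstable.”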
As for modeling the control systems, my focus is first on being able to express what’s going on at the information-theoretic level, where it really gets interesting: there’s a comparator, which must generate sufficient mutual information with the parameter it’s trying to control, or else it’s “comparing” against a meaningless value. There are the disturbances, which introduce entropy and destroy mutual information with the environment. There’s the controller, which must draw down some negentropy source to maintain the system’s order and keep it away from equilibrium (as life and other dissipative systems must). And there’s the system’s implicit model of its environment (including the other control systems), whose accuracy is represented by the KL divergence between the model’s distribution and the environment’s true distribution.
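For discrete toy distributions, the two quantities I keep invoking are straightforward to compute directly from their definitions. The example numbers below are made up purely for illustration:

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) in nats, for discrete distributions given as probability lists.
    Measures how inaccurate the implicit model q is about the true distribution p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """I(X;Y) in nats, from a joint distribution given as a 2-D list joint[x][y]."""
    px = [sum(row) for row in joint]               # marginal over x
    py = [sum(col) for col in zip(*joint)]         # marginal over y
    return sum(
        joint[x][y] * math.log(joint[x][y] / (px[x] * py[y]))
        for x in range(len(px)) for y in range(len(py))
        if joint[x][y] > 0
    )

# Hypothetical true environment distribution vs. a unit's implicit model of it:
env_true = [0.7, 0.2, 0.1]
unit_model = [0.5, 0.3, 0.2]
print(kl_divergence(env_true, unit_model))  # small value = good "understanding"

# Hypothetical joint distribution of (controlled parameter, comparator reading):
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(mutual_information(joint))  # how much the comparator's reading says about the parameter
```

A comparator whose reading is independent of the controlled parameter would give a mutual information of zero, which is the precise sense in which it would be “comparing” to a meaningless value.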
I don’t expect I’ll make something completely new, but at least for me, it would integrate my understanding of thermodynamics, life, information theory, and intelligence, and perhaps shed light on each.