If I understand this right, they’re not talking about entropy. They’re talking about putting yourself in a position where you have more choices. I think a better word would be power.
They clearly say they are talking about entropy. As do most of their cites. I think “power” would be the wrong term.
Power maximisation and entropy maximisation are closely related concepts—but these ideas can be teased apart:
Given the opportunity, evolved organisms can be expected to spitefully destroy resources accessible only to other unrelated agents—assuming that doing so is inexpensive. I think that entropy maximisation is more clearly consistent with such behaviour than power maximisation is.
Also, entropy is a global measure, while power is a measure of energy flux through some kind of pipe. Entropy maximisation is thus the simpler idea.
I can’t actually read the paper, but according to the accompanying article jamesf linked to:
Hoping to firm up such notions, Wissner-Gross teamed up with Cameron Freer of the University of Hawaii at Manoa to propose a “causal path entropy.” This entropy is based not on the internal arrangements accessible to a system at any moment, but on the number of arrangements it could pass through on the way to possible future states.
They are talking about “causal path entropy”, a term they defined, not “entropy”, the well known physics term. Confusing them would be a bad idea.
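To make the “causal path entropy” idea concrete, here is a toy sketch. It is entirely illustrative, not the authors’ construction (which works with continuous phase-space path distributions and derives an explicit force): a walker on a bounded 1D lattice scores each position by the log of the number of distinct move sequences available within a fixed horizon, and climbing that score drives it away from the walls, toward the states with the most open futures.

```python
import math

# Toy sketch of "causal path entropy": score each state by the log of
# the number of distinct futures reachable within a fixed horizon.
# Hypothetical toy model: a walker on a bounded 1D lattice, all paths
# weighted equally. The paper's construction is more involved.

def neighbors(x, n):
    """Positions reachable from x in one step on a lattice of size n."""
    return [y for y in (x - 1, x + 1) if 0 <= y < n]

def log_path_count(x, n, horizon):
    """Log of the number of distinct move sequences of length `horizon` from x."""
    counts = {y: 1 for y in range(n)}  # horizon 0: a single (empty) path
    for _ in range(horizon):
        counts = {y: sum(counts[z] for z in neighbors(y, n)) for y in range(n)}
    return math.log(counts[x])

N, HORIZON = 11, 8
print([round(log_path_count(x, N, HORIZON), 2) for x in range(N)])  # peaks mid-lattice

# A greedy agent climbing this score drifts away from the walls,
# i.e. toward the positions with the most accessible future states.
x = 1
for _ in range(4):
    x = max(neighbors(x, N), key=lambda y: log_path_count(y, N, HORIZON))
    print(x)  # 2, 3, 4, 5
```

Note this is exactly the “more choices” behaviour described above: nothing in the toy model mentions thermodynamic entropy, only counts of reachable futures.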
Power maximisation and entropy maximisation are closely related concepts
Power maximization is what an intelligent agent does that values power. What you linked to is a statistical tool for finding priors. They are unrelated. Am I misunderstanding something?
You need some background, by the sound of it. The main link between power maximization and entropy maximization is that power is usually acquired in order to perform work, and doing work eventually leads to generating entropy. So: the two ideas often make similar predictions.
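For a concrete version of that chain, here is a back-of-the-envelope calculation using the standard textbook relations W = P·t and ΔS = Q/T, with made-up numbers (assuming all the work ends up dissipated as heat into surroundings at a fixed temperature):

```python
# Back-of-the-envelope link between power and entropy production:
# drawing power P for time t does work W = P * t, and if that work
# is eventually dissipated as heat into surroundings at temperature
# T, the entropy generated is dS = W / T. Illustrative numbers only.

P = 100.0   # power drawn, watts (joules per second)
t = 3600.0  # duration, seconds (one hour)
T = 300.0   # temperature of the surroundings, kelvin

W = P * t    # work performed, joules
dS = W / T   # entropy eventually generated, joules per kelvin

print(f"work done:        {W:.0f} J")    # 360000 J
print(f"entropy produced: {dS:.0f} J/K") # 1200 J/K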
Here.
As for the link between the Maximum entropy principle of E.T. Jaynes and Maximum entropy thermodynamics, the best resource on that topic that I am aware of is:
Information theory explanation of the fluctuation theorem, maximum entropy production and self-organized criticality in non-equilibrium stationary states by Roderick Dewar, particularly the historical overview at the start of the introduction.
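For anyone who only knows one of those two senses of “maximum entropy”, here is a minimal sketch of the Jaynes side, the classic Brandeis dice problem (the numbers are illustrative; nothing here is taken from Dewar’s paper): among all distributions over die faces with a prescribed mean, the maximum-entropy one has the exponential form p_i ∝ exp(λ·i), and λ can be found by simple bisection.

```python
import math

# Jaynes' maximum-entropy principle as a "tool for finding priors":
# among all distributions over die faces 1..6 with a given mean, pick
# the one with maximum Shannon entropy. The solution has the form
# p_i ∝ exp(lam * i); solve for lam by bisection. Classic Brandeis
# dice example; target mean chosen for illustration only.

FACES = range(1, 7)
TARGET_MEAN = 4.5

def mean_for(lam):
    """Mean of the distribution p_i ∝ exp(lam * i) over the faces."""
    weights = [math.exp(lam * i) for i in FACES]
    z = sum(weights)
    return sum(i * w for i, w in zip(FACES, weights)) / z

# mean_for is monotone increasing in lam, so bisection converges.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_for(mid) < TARGET_MEAN:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2

weights = [math.exp(lam * i) for i in FACES]
z = sum(weights)
p = [w / z for w in weights]
entropy = -sum(q * math.log(q) for q in p)
print([round(q, 3) for q in p], round(entropy, 3))
```

This is the “statistical tool for finding priors” sense of maximum entropy; the thermodynamic sense, and how the two connect, is what Dewar’s introduction surveys.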