The thing that leaps out at me is that the rhetorical equation in that article between the sexiness of a woman being in the mind and the probability of two male children being in the mind is bogus.
I look at a woman and think she is sexy. If I assume the sexiness is in the woman, and that an alien creature would think she is sexy, or my wife would think she is sexy, because they would see the sexiness in her, then the article claims I have been guilty of the mind projection fallacy because the woman’s sexiness is in my mind, not in the woman.
The article then proceeds to enumerate a few situations in which I am given incomplete information about reality, and each different scenario corresponds to a different estimate of the probability that a person has two boy children.
BUT… it seems to me, and I would love to know if Eliezer himself would agree, that even an alien given the same partial information would, if it were rational and intelligent, reach the same conclusions about the probabilities involved! So… probability, even Bayesian probability based on uncertainty, is no more or less in my head than is 1+1=2. 1+1=2 whether I am an Alien mind or a Human mind, unlike “that woman is sexy,” which may only be true in heterosexual male, homosexual female, and bisexual human minds, but not Alien minds.
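To make that concrete, here is a minimal sketch (my own construction in Python; the specific two-child scenarios are the standard ones I am assuming the article uses) showing that the different answers come purely from the different partial information, so any rational agent, human or alien, doing the same enumeration gets the same numbers:

    from fractions import Fraction
    from itertools import product

    # Enumerate all equally likely two-child families: (older, younger), each B or G.
    families = list(product("BG", repeat=2))

    def prob_two_boys(condition):
        """P(both boys | condition), computed by straight enumeration."""
        allowed = [f for f in families if condition(f)]
        both = [f for f in allowed if f == ("B", "B")]
        return Fraction(len(both), len(allowed))

    # Different partial information -> different (but observer-independent) answers.
    print(prob_two_boys(lambda f: True))          # no information: 1/4
    print(prob_two_boys(lambda f: f[0] == "B"))   # "the older child is a boy": 1/2
    print(prob_two_boys(lambda f: "B" in f))      # "at least one is a boy": 1/3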
But be that as it may, your comment still ignores the entire discussion, which is: is Entropy more or less “real” than Energy? The fact is that Aliens who had steam engines, internal combustion engines, gas turbines, and air conditioners would almost certainly have thermodynamics, would understand entropy, and would agree with Humans on the laws of thermodynamics and the trajectories of entropy in the various machines.
If Bayesian probability is in the mind, and Entropy is in the mind, then they are like 1+1=2 being in the mind, things which would be in the mind of anything which we considered rational or intelligent. They would NOT be like “sexiness.”
Probability depends on state of knowledge, which is a fact about your mind. Another agent with the same state of knowledge will assign the same probabilities. Another agent fully aware of your state of knowledge will be able to say what probabilities you should be assigning.
Sexiness depends on sexual preferences, which are a fact about your mind. Another agent with the same sexual preferences will assess sexiness the same way. Another agent fully aware of your sexual preferences will be able to say how sexy you will find someone.
I don’t see that there’s a big difference here. Except maybe for the fact that “states of knowledge”, unlike “sexual preferences”, can (in principle) be ranked: it’s just plain better for your state of knowledge to be more accurate.
Well yes. Of course everything you can say about probability and sexiness you can also say about Energy, Entropy, and Apple. That is, the estimate of the energy or entropy relationships in a particular machine or experimental scenario depends on the equations for energy and entropy and on the measurements you make on the system to find the values that go into those equations. Any mind with the same information will reach the same conclusions about the Energy and Entropy that you would, assuming you are all doing it “right.” Any intelligence desiring to transform heat-producing processes into mechanical or electrical energy will even discover the same relationships for calculating energy and entropy as any other intelligence, and will build similar machines, machines that would not be too hard for technologists from the other civilization to understand.
Even determining if something is an apple. Any set of intelligences that know the definitions of apples common among humans on earth will be able to look at various earth objects and determine which of them are apples, which are not, and which are borderline. (I’m imagining there must be some “crabapples” that are marginally edible that people would argue over whether to call apples or not, as well as a hybrid between an apple and a pear that some would call an apple and some wouldn’t).
So “Apple” “Sexy” “Entropy” “Energy” and “Probability” are all EQUALLY in the mind of the intelligence dealing with them.
If you check, you will see this discussion started by suggesting that Energy was “realer” than Entropy. That Entropy was more like Probability and Sexiness, and thus, not as real, while Energy was somehow actually “out there” and therefore realer.
My contention is that all these terms are equally as much in the mind as in reality, that as you say any intelligence who knows the definitions will come up with the same conclusions about any given real situation, and that there is no distinction in “realness” between Energy and Entropy, no distinction between these and Apple, and indeed no distinction between any of these and “Bayesian Probability.” That pointing out that features of the map are not features of the territory does NOT allow you to privilege some descriptive terms as being “really” part of the territory after all, even though they are words that can and should obviously be written down on the map.
If you are going to explicate further, please state whether you agree or disagree that some of these terms are realer than others, as this is how the thread started and open-ended explication is ambiguous.
So “Apple” “Sexy” “Entropy” “Energy” and “Probability” are all EQUALLY in the mind of the intelligence dealing with them.
Anything at all is “in the mind” in the sense that different people might for whatever reason choose to define the words differently. Because this applies to everything, it’s not terribly interesting and usually we don’t bother to state it. “Apple” and “energy” are “in the mind” in this sense.
But (in principle) someone could give you a definition of “energy” that makes no reference to your opinions or feelings or health or anything else about you, and be confident that you or anyone else could use that definition to evaluate the “energy” of a wide variety of systems and all converge on the same answer as your knowledge and skill grows.
“Entropy” (in the “log of number of possibilities” sense) and “probability” are “in the mind” in another, stronger sense. A good, universally applicable definition of “probability” needs to take into account what the person whose probability it is already knows. Of course one can define “probability, given everything there is to know about mwengler’s background information on such-and-such an occasion” and everyone will (in principle) agree about that, but it’s an interesting figure primarily for mwengler on that occasion and not really for anyone else. (Unlike the situation for “energy”.) And presumably it’s true that for all (reasonable) agents, as their knowledge and skill grow, they will converge on the same probability-relative-to-that-knowledge for any given proposition, but frequently that won’t in any useful sense be “the probability that it’s true”: it’ll be either 0 or 1, depending on whether the proposition turns out to be true or false. For propositions about the future (assuming that we fix when the probability is evaluated) it might end up being something neither 0 nor 1 for quantum-mechanical reasons, but that’s a special case.
Similarly, entropy in the “log of number of possibilities” sense is meaningful only for an agent with given knowledge. (There is probably a reasonably respectable way of saying “relative to what one could find out by macroscopic observation, not examining the system too closely”, and I think that’s often what “entropy” is taken to mean, and that’s fine. But that isn’t quite the meaning that’s being advocated for in this post.)
Sexiness is “in the mind” in an even stronger sense, I suppose. But I think it’s reasonable to say that on the scale from “energy” to “sexiness”, probability is a fair fraction of the way towards “sexiness”.
“Entropy” (in the “log of number of possibilities” sense) and “probability” are “in the mind” in another, stronger sense.
Aha! So it would seem the original sense that “Energy” is “realer” (more like Apple) than Entropy is because Entropy is associated with Probability, and Bayesian Probability, the local favorite, is more in the mind than other things because its accurate estimation requires information about the state of knowledge of the person estimating it.
So it is proposed there is a spectrum from “in the mind” (dependent on other things in the mind as well as things in the real world) to “real” (in the mind only to the extent that it depends on definitions all minds would tend to share).
We have Sexiness, which is in the mind, and thinking it is in reality is a projection fallacy. At the other end of the spectrum, we have things like Energy and Apple, which are barely in the mind, which depend in straightforward ways on straightforward observations of reality, and which would be agreed upon by all minds that agreed on the definitions.
And then we have probability. Frequentist definitions of probability are intended to be like Energy and Apple: relatively straightforward to calculate from easy-to-define observations.
But then we have Bayesian probability, which links our current knowledge of various details with our estimate of probability. So, since different minds can hold different bits of knowledge, different minds can “correctly” estimate different probabilities for the same occurrences, just as different minds can estimate different amounts of sexiness for the same creatures, depending on the species and genders of the different minds.
And then we have Entropy. And somebody defines Entropy as the “log of number of possibilities” and possibilities are like probabilities, and we prefer Bayesian “in the mind” probability to Frequentist “in reality” definitions of probability. And so some people think Entropy might be in the mind like Bayesian probability and sexiness, rather than in reality like Energy and Apple.
Good summary? I know! It is!
So here is the thing. Entropy in physics is defined as

dS = dQ_rev / T
That is, the entropy is very deterministically added to a system by heating the system with an unambiguously determined amount of energy dQ_rev, and dividing that amount of energy by an unambiguously determined temperature of the system. That sure doesn’t look like it has any probabilities in it. So THIS definition of Entropy is as real as Energy and Apple. And this is where I have been coming from. You, me, and an alien from Alpha Centauri can all learn the thermodynamics required to build steam engines, internal combustion engines, and refrigerators, and we will all find the same definitions for Energy and Entropy (however we might name them), and we will all determine the same trajectories in time and space for Energies and Entropies for any given thermodynamic system we analyze. Entropy defined this way is as real as Energy and Apples.
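For illustration, here is a minimal numerical sketch of that definition (my own toy numbers: 1 kg of water with a constant heat capacity), integrating dS = dQ_rev / T along a reversible heating path; there is nothing probabilistic anywhere in it:

    import numpy as np

    # Reversibly heat 1 kg of water from 300 K to 350 K and integrate dS = dQ_rev / T.
    m, c = 1.0, 4186.0                        # mass [kg], specific heat [J/(kg*K)], assumed constant
    T = np.linspace(300.0, 350.0, 100_001)    # temperature along the heating path [K]
    dQ = m * c * np.diff(T)                   # heat added on each small step [J]
    T_mid = 0.5 * (T[:-1] + T[1:])            # temperature at which each bit of heat enters

    dS_numeric = np.sum(dQ / T_mid)           # numerical integral of dQ_rev / T
    dS_exact = m * c * np.log(350.0 / 300.0)  # closed form for constant heat capacity

    print(dS_numeric, dS_exact)               # both ~645 J/K; no probabilities involved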
But what about that “log of number of possibilities” thing? Well, a more pedantic answer would be that the number of possibilities has nothing to do with probabilities. I have a multiparticle state with known physics of interactions. Its state when first specified, the possibility it initially occupies, has a certain amount of energy associated with it. The energy (we consider only closed systems for now) will stay constant, and EVERY possible point in parameter space which has the SAME energy as our initial state shows up on our list of possibilities for the system, and every point in parameter space with a DIFFERENT energy than our initial state is NOT a possible state of this system.
So counting the possibilities does NOT seem to involve any Bayesian probabilities at all. You, me, and an alien from Alpha Centauri who all look at the same system all come up with the same Entropy curves, just as we all come up with the same energy curves.
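A toy illustration of that counting (my own construction, not from the thread): a tiny closed system of two-level units, where every observer who applies the same fixed-energy constraint enumerates exactly the same possibilities and hence the same log-count:

    from itertools import product
    from math import log

    # Toy closed system: 6 two-level units, each contributing energy 0 or 1.
    N, E_total = 6, 3

    # Enumerate parameter space and keep only microstates with the SAME total energy.
    allowed = [s for s in product((0, 1), repeat=N) if sum(s) == E_total]

    omega = len(allowed)   # number of possibilities at this energy
    S = log(omega)         # entropy as log of the number of possibilities

    print(omega, S)        # 20 microstates, S = ln 20 ~ 3.0
    # Any observer (human or Alpha Centaurian) applying the same energy constraint
    # enumerates the same 20 states and gets the same entropy.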
But perhaps I can do better than this. Tie this in to the intuition that entropy has something to do with probabilities. And I can.
The probabilities that entropy has to do with are FREQUENTIST probabilities: enumerations of the physically possible states of the system. We could estimate them mathematically by hypothesizing a map of the system called parameter space, or we could take 10^30 snapshots of the physical system spread out over many millennia and just observe all the states the system gets into. Of course this second approach is impractical, but when has impractical ever stopped a lesswrong discussion?
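Here is a sketch of that frequentist reading on the same toy system as above (again my own construction): estimate the number of equal-energy states by taking “snapshots” of an energy-conserving dynamics, and compare with the exact enumeration:

    import random
    from itertools import product
    from math import log

    # Same toy system as before: 6 two-level units, total energy fixed at 3.
    N, E_total = 6, 3
    allowed = {s for s in product((0, 1), repeat=N) if sum(s) == E_total}

    # "Snapshots": simulate energy-conserving dynamics (swap the occupation of two
    # randomly chosen units) and record every state the system visits.
    state = (1, 1, 1, 0, 0, 0)
    seen = {state}
    for _ in range(10_000):
        i, j = random.sample(range(N), 2)
        s = list(state)
        s[i], s[j] = s[j], s[i]          # swapping occupations conserves total energy
        state = tuple(s)
        seen.add(state)

    print(len(allowed), log(len(allowed)))   # exact count and exact entropy
    print(len(seen), log(len(seen)))         # frequentist estimate from snapshots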
So the real reason Entropy, Energy, and Apple are “real” even though Bayesian Probability, like Sexiness, is “in the mind” is that Entropy is unambiguously defined for physical systems in terms of other unambiguous physical quantities, “Energy” and “Temperature.” (BTW, Temperature is the average kinetic energy of the particles, not some ooky “in the mind” thing. Or for simplicity, define temperature as what the thermometer tells you.)
And to the extent that you love Bayesian probability so much that you want somehow to interpret a list of states in parameter space that all have the same energy as somehow “in the mind,” you just need to realize that a frequentist interpretation of probability is more appropriate for any discussion of entropy than a Bayesian one: we use entropy to calculate what systems we know “enough” about will do, not to estimate how different people in different states of ignorance will bet on what they will do. If we enumerate the states wrong, we get the wrong entropy and our engine doesn’t work the way we said it would; we don’t get to be right in the subjective sense that our estimate was as good as it could be given what we knew.
I hope this is clear enough to be meaningful to anybody following this topic. It sure explains to me what has been going on.
So here’s the thing. Entropy in physics is defined as [...]
That is one definition. It is not the only viable way to define entropy. (As you clearly know.) The recent LW post on entropy that (unless I’m confused) gives the background for this discussion defines it differently, and gives the author’s reasons for preferring that definition.
(Like you, I take it, I am not convinced that the author’s reasons are cogent enough to justify the claim that the probabilistic definition of entropy is the only right one and that the thermodynamic definition is wrong. If I have given a different impression, then I have screwed up and I’m sorry.)
“Log of #possibilities” doesn’t have any probabilities in it, but only because it’s a deliberate simplification, targeting the case where all the probabilities are roughly equal (which turns out not to be a bad approximation, because there are theorems that say most states have roughly equal probability and you don’t go far wrong by pretending those are the only ones and they’re all equiprobable). The actual definition, of course, is the “-sum of p log p” one, which does have probabilities in it.
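A quick sketch of that relationship (my own illustration): for a uniform distribution the -sum of p log p formula collapses to the log of the number of possibilities, and any extra knowledge that skews the probabilities pulls the value below that:

    from math import log

    def gibbs_entropy(probs):
        """-sum p log p over a discrete distribution (natural log, so units of nats)."""
        return -sum(p * log(p) for p in probs if p > 0)

    N = 20
    uniform = [1.0 / N] * N
    print(gibbs_entropy(uniform), log(N))   # identical: with equal probabilities, -sum p log p = log N

    # Extra knowledge that concentrates probability on a few states lowers the entropy.
    skewed = [0.5, 0.3] + [0.2 / (N - 2)] * (N - 2)
    print(gibbs_entropy(skewed))            # strictly less than log N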
So, the central question at issue, I think, is whether it is an error to apply the “-sum of p log p” definition of entropy when the probabilities you’re working with are of the Bayesian rather than the frequentist sort; that is, when rather than naively counting states and treating them all as equiprobable, you adjust according to whatever knowledge you have about the system. Well, of course you can always (in principle) do the calculation; the questions are (1) is the quantity you compute in this way of any physical relevance? and (2) is it appropriate to call it “entropy”?
Now, for sure your state of knowledge of a system doesn’t affect the behaviour of a heat engine constructed without the benefit of that knowledge. If you want to predict its behaviour, then (this is a handwavy way of speaking, but I like it) the background knowledge you need to apply when computing probabilities is what’s “known” by the engine. And of course you end up with ordinary thermodynamic entropy. (I am fairly sure no one who has been talking about entropy on LW recently would disagree.)
But suppose you know enough about the details of a system that the entropy calculated on the basis of your knowledge is appreciably different from the thermodynamic entropy; that is, you have extra information about which of its many similar-looking equal-energy states it’s more likely to be in. Then (in principle, as always) you can construct an engine that extracts more energy from the system than you would expect from the usual thermodynamic calculations.
Does this make this “Bayesian entropy” an interesting quantity and justify calling it entropy? I think so, even though in almost all real situations it’s indistinguishable from the thermodynamic entropy. If you start out with only macroscopic information, then barring miracles you’re not going to improve that situation. But it seems to me that this notion of entropy may make for a simpler treatment of some non-equilibrium situations. Say you have a box with a partition in it, gas on one side and vacuum on the other. Now you remove the partition. You briefly have extra information about the state of what’s in the box beyond what knowing the temperature, volume and pressure gives you, and indeed you can exploit that to extract energy even if once the gas settles down its temperature is the same as that of its environment. I confess I haven’t actually done the calculations to verify that the “Bayesian” approach actually leads to the right answers; if (as I expect) it does, or can be adjusted in a principled way so that it does, then this seems like a nice way of unifying the equilibrium case (where you talk about temperature and entropy) and the non-equilibrium case (where you have to do something more resembling mechanics to figure out what energy you can extract and how). And—though here I may just be displaying my ignorance—I don’t see how you answer questions like “10ms after the partition is removed, once the gas has started flowing into the previously empty space, but isn’t uniformly spread out yet, what’s the entropy of the system?” without something resembling the Bayesian approach, at least to the extent of not assuming all microstates are equally probable.
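For what it’s worth, here is a back-of-envelope sketch of the partition example under the usual ideal-gas assumptions (the particle number and temperature are numbers I picked purely for illustration): knowing the gas is still confined to half the box is worth N k_B ln 2 of entropy relative to the spread-out equilibrium state, and T times that difference bounds the extra work you could extract by exploiting that knowledge:

    from math import log

    k_B = 1.380649e-23   # Boltzmann constant [J/K]
    N = 1e22             # number of ideal-gas molecules (illustrative)
    T = 300.0            # temperature [K]

    # Knowing the gas is still confined to the left half of the box is worth
    # N * k_B * ln 2 of entropy relative to the fully spread-out equilibrium state.
    delta_S = N * k_B * log(2)

    # If you exploit that knowledge (e.g. let the gas push a piston isothermally
    # instead of just removing the partition), the extra extractable work is T * delta_S.
    W_max = T * delta_S

    print(delta_S)   # ~0.096 J/K
    print(W_max)     # ~29 J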
[EDITED to add: I see you’ve already commented on the “extracting energy from a thermodynamically hot thing whose microstate is known” thing, your answer being that the machine you do it with needs to be very cold and that explains how you get energy out. But I haven’t understood why the machine has to be very cold. Isn’t it, in fact, likely to have lots of bits moving very fast to match up somehow with the molecules it’s exploiting? That would make it hot according to the thermodynamic definition of temperature. I suppose you might argue that it’s really cold because its state is tightly controlled—but that would be the exact same argument that you reject when it’s applied to the hot thing the machine is exploiting its knowledge of.]
OK this is in fact interesting. In an important sense you have already won, or I have learned something, whichever description you find less objectionable.
I still think that the real definition of entropy is as you originally said, the log of the number of allowable states, where allowable means “at the same total energy as the starting state has.” To the extent entropy is then used to calculate the dynamics of a system, this unambiguous definition will apply when the system moves smoothly and slowly from one thermal equilibrium to another, as some macroscopic component of the system changes “slowly,” slowly enough that all intermediate steps look like thermal equilibria, also known in the trade as “reversibly.”
But your “10 ms after the partition is removed” statement highlights that the kinds of dynamics you are thinking of are not reversible, not the dynamics of systems in thermal equilibrium. Soon after the partition is removed, you have a region that used to be vacuum that has only fast-moving molecules in it; the slow-moving ones from the distribution haven’t had time to get there yet! Soon after that, when the fast molecules are first reaching the far wall, you have some interesting mixing going on involving fast molecules bouncing off the wall and hitting slower molecules still heading towards the wall. And so on, frame by frame.
Eventually (seconds? Less?) zillions (that’s a technical term) of collisions have occurred, and the distribution of molecular speeds in any small region of the large volume is a thermal distribution, at a lower temperature than the original distribution before the partition was removed (gases cool on expansion). But the details of how the system got to this new equilibrium are lost. The system has thermalized, come to a new thermal equilibrium.
I would still maintain that formally, the log of the number of states is a fine definition, that the entropy thus defined is as unambiguous as “Energy,” and that it is as useful as energy.
If you start modifying the “entropy,” if you start counting some states more than others, there are two reasons that might make sense: 1) you are interested in non-thermal-equilibrium dynamics, and given a particular starting state for the system, you want to count only the parameter states the system could reach in some particular short time frame, or 2) you are equivocating, pretending that your more complete knowledge of the starting point of the system gives your entropy calculation an “in your mind” component, when all it really means is that at least one of the minds making the calculation was making it for a different system than the one in front of them.
In case 1), non-equilibrium dynamics is certainly a reasonable thing to be interested in. However, the utility of the previously and unambiguously defined entropy in calculating the dynamics of systems which reach thermal equilibrium is so high that it really is up to those who would modify it to modify the name describing it as well. So the entropy-like calculation that counts only states reachable after 10 ms might be called the “prompt entropy” or the “evolving entropy.” It really isn’t reasonable to just call it entropy and then claim an “in your mind” component to the property of entropy, because in your mind you are actually doing something different from what everybody else is doing.
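“Prompt entropy” is a coinage in the comment above, so the following is purely my own toy sketch of what counting only the states reachable in a short time might look like: three particles hopping on a small lattice, starting packed to one side (partition just removed), with the log of the reachable-state count growing over time toward the full equal-energy count:

    from itertools import combinations
    from math import log

    # Toy model: 3 indistinguishable particles on an 8-site line, one hop per time step,
    # starting packed into the left half of the line.
    SITES, START = 8, frozenset({0, 1, 2})

    def neighbors(config):
        """Configurations reachable in one step by moving one particle to an adjacent empty site."""
        out = set()
        for p in config:
            for q in (p - 1, p + 1):
                if 0 <= q < SITES and q not in config:
                    out.add(config - {p} | {q})
        return out

    def prompt_entropy(t_steps):
        """Log of the number of configurations reachable within t_steps from START."""
        frontier, reachable = {START}, {START}
        for _ in range(t_steps):
            frontier = {n for c in frontier for n in neighbors(c)} - reachable
            reachable |= frontier
        return log(len(reachable))

    total = len(list(combinations(range(SITES), 3)))   # all 56 same-particle-number configurations
    for t in (0, 1, 3, 10):
        print(t, prompt_entropy(t))                    # grows with t ...
    print(log(total))                                  # ... toward the full log(56) ~ 4.03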
In the case of 2), where you look in detail at the system and see a different set of states the system can get to than someone else who looked at the system saw, then it is not a matter of entropy being in your mind that distinguishes the two calculations; it is a situation where one of you is wrong about what the entropy is. And my calling an orange “an Apple” no more makes Apple ambiguous than my saying 2+2=5 calls into question the objective truth of addition.
As to the machine that extracts extra energy… Consider an air jet blowing a stream of high-pressure air into a chamber with a piston in it. The piston can move and you can extract energy. Someone using thermo to build an engine based on this might just calculate the rise in pressure in the volume as the air jet blows into it, and put the piston in a place where the air jet is not blowing directly onto it, and might then find their machine performs the way you would expect from a thermo calculation. I.e., they might build their machine so the energy from the air jet is “thermalized” with the rest of the air in the volume before it pushes on the piston. Somebody else might look at this and think, “I’m putting the piston lined up with the air jet so the air jet blows right onto the piston.” They might well extract MORE energy from the motion of the piston than the person who did the thermo calculation and placed their piston out of the direct air flow. I think in every sense, the person exploiting the direct air flow from the air jet is building his super-thermo machine by exploiting his detailed knowledge of the state of the air in the chamber. I believe THIS is the picture you should have in your mind as you read all this stuff about Bayesian probability and entropy in the mind.

And my comment on it is this: there are plenty of machines that are non-thermo. Thermo applies to steam engines and internal combustion engines when the working fluids thermalize faster than the mechanical components move. But a bicycle, being pumped by your legs, is not a thermo machine. There is some quite non-thermalized chemistry going on in your muscles that causes motions of the pedals and gears MUCH faster than any local temperature would predict, motions which do interesting things on a MUCH faster time scale than the energy involved can leak out and thermalize the rest of the system.
There is no special “in the mind” component of this non-thermo-equilibrium air-jet machine. Anybody who sees the machine I have built, where the air jet blows directly on the piston, and who analyzes the machine, will calculate the same performance for it if they have the same gas-dynamics simulation code that I have. They will recognize that this machine is not using a thermalized volume of gas to press the piston, that it is using a very not-in-equilibrium stream of fast gas to push the piston harder.
In conclusion: the kinds of special knowledge invoked to make Entropy an “in your mind” quantity really go beyond the traditional objective definition of Entropy while failing to give this new, different quantity a new, different name. This is an equivocation, not a subjective component to entropy, just as someone changing the definition of apple to include oranges is not proving the subjectivity of the concept of Apple; they are simply using words differently than the people they are talking to and forgetting to mention that.
Further, the particular “special knowledge of details” physics discussed is not anything new. It is mechanics. Thermodynamics is a subclass of mechanics useful for analyzing system dynamics where fluids interact internally much faster than they act on the pieces of the machine they are pushing. In those cases thermodynamic calculations work. But when the details of the system are such that it is NOT in thermodynamic equilibrium as it interacts with the moving parts of a machine, this does not make entropy subjective; it makes entropy a more difficult tool to use in the analysis, just as an apple peeler is not so useful to a guy who thinks oranges are a kind of apple.
Finally, there is an intermediate realm of mechanics where fluids are used and they are partially thermalized, but not completely, because the dynamics of the rest of the machine are comparable to thermalization times. There might be interesting extensions of the concept of entropy that could be useful in calculating the dynamics of these systems. But the fact that only one of two minds in a room is thinking these thoughts at a given moment does not make either the original entropy concept or these new extensions any more “in the mind” than is Energy. It just means the two minds each need to understand this new physics for this intermediate case, but when they do, they will be using unambiguous definitions for “prompt entropy” or whatever they call it.
Thank you.