The model is fine, what I’m having problems with is the whole “in the mind” business, which goes straight to philosophy and seems completely unnecessary for a discussion of the properties of classical systems in physics.
Entropy is statistical laws. Thus, like statistics, it’s in the mind. It’s also no more philosophical than statistics is, and not psychological at all.
Entropy is statistical laws. Thus, like statistics, it’s in the mind.
I have a feeling you’re confusing the map and the territory. Just because statistics (defined as a toolbox of methods for dealing with uncertainty) exists in the mind, it does not follow that uncertainty exists only in the mind as well. The half-life of a radioactive element is a statistical “thing” that exists in real life, not in the mind.
In the same way, phase changes of a material exist in the territory. You can usefully define temperature as a particular metric such that water turns into gas at 100 and turns into ice at zero. Granted, this approach has its limits but it does not seem to depend on being “in the mind”.
The half-life of a radioactive element is something that can be found without using probability. It is the time it takes for the measure of the universes in which the atom is still whole to be exactly half of the initial measure. Similarly, phase change can be defined without using probability.
The universe may be indeterministic (though I don’t think it is), but all this means is that the past is not sufficient to determine the future. A mind that already knows the future (perhaps because it exists further in the future) would still know the future.

So, does your probability-less half-life require MWI? That’s not a good start. What happens if you are unwilling to just assume MWI?

Why do you think such a thing is possible?
Even without references to MWI, I’m pretty sure you can just say the following: if at time t=0 you have an atom of carbon-14, at a later time t>0 you will have a superposition of carbon-14 and nitrogen-14 (with some extra stuff). The half-life is the value of t for which the two coefficients will be equal in absolute value.
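For concreteness, here is that claim worked out, assuming the usual exponential decay of the survival amplitude (the explicit form is my assumption, not something stated above):

$$|c_{\mathrm{C14}}(t)|^2 = e^{-\lambda t}, \qquad |c_{\mathrm{C14}}(t)| = |c_{\mathrm{N14}}(t)| \iff e^{-\lambda t} = 1 - e^{-\lambda t} \iff t = \frac{\ln 2}{\lambda} = t_{1/2}$$

So the equal-coefficient condition recovers the textbook half-life without any mention of probability.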
Uncertainty in the mind and uncertainty in the territory are related, but they’re not the same thing, and calling them both “uncertainty” is misleading. If indeterminism is true, there is an upper limit to how certain someone can reliably be about the future, but someone further in the future can know it with perfect certainty and reliability.
If I ask if the billionth digit of pi is even or odd, most people would give even odds to those two things. But it’s something that you’d give even odds to on a bet, even in a deterministic universe.
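Anyone who wants to move their own credence from 50/50 to certainty can just compute the digit. A sketch, assuming the third-party mpmath package (the billionth digit is computationally out of reach for a snippet like this, so the 1,000th stands in):

```python
from mpmath import mp

mp.dps = 1010                   # carry ~1010 significant digits
digits = mp.nstr(mp.pi, 1005)   # "3.14159..." as a string
d = digits[2 + 999]             # skip "3.", then offset 999 is the 1,000th decimal digit
print(d, "even" if int(d) % 2 == 0 else "odd")
```

Before running it you should bet at even odds; after running it you shouldn’t. Nothing about pi changed.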
If I flip a coin and it lands on heads, you’d be a fool to bet otherwise. It doesn’t matter if the universe is nondeterministic and you can prove that, given all the knowledge of the universe before the coin was flipped, it would be exactly equally likely to land on heads or tails. You know it landed on heads. It’s 100% certain.
Yes, the future is uncertain but the past is already fixed and certain. So? We are not talking about probabilities of something happening in the past. The topic of the discussion is how temperature (and/or probabilities) are “in the mind” and what that means.
The past is certain but the future is not. But the only difference between the two is when you are in relation to them. It’s not as if certain time periods are inherently past or future.
An example of temperature being in the mind that’s theoretically possible to set up but you’d never manage in practice is Maxwell’s demon. If you already know where all of the particles of gas are and how they’re bouncing, you could make it so all the fast ones end up in one chamber and all the slow ones end up in the other. Or you can just get all of the molecules into the same chamber. You can do this with an arbitrarily small amount of energy.
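A toy sketch of the demon’s bookkeeping (pure illustration with made-up numbers, not a physics engine):

```python
import random

random.seed(0)
# Stand-ins for molecular kinetic energies (arbitrary units).
energies = [random.gauss(0, 1) ** 2 for _ in range(10_000)]
gate = sorted(energies)[len(energies) // 2]  # median split

chamber_a = [e for e in energies if e > gate]   # "hot" chamber
chamber_b = [e for e in energies if e <= gate]  # "cold" chamber

mean = lambda xs: sum(xs) / len(xs)
# Unequal average energies, achieved only because we knew every energy in advance.
print(mean(chamber_a), mean(chamber_b))
```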
I think his “in the mind” is correct in his context, because in the model of entropy he is discussing, temperature_entropy depends on entropy, which in turn depends on your knowledge of the states of the system.
I’ll repeat what I said earlier in the context of the discussion of different theories of time.
Me, I think the people who identify exists_everydaymode with exists_spacetimemodel are just conceptually confused by their highfalutin ideas. Exists_everydaymode didn’t cease to exist when we got our fancy new spacetime model to play with, and its relevance and functionality didn’t cease to exist either. “I have cancer” really is distinguishable in important ways, to us, from “I had cancer.”
New physics didn’t make old ideas useless. Temperature_kineticenergy is probably more relevant in most situations.
because they don’t know what temperature is
The OP makes his mistake by identifying temperature_entropy with temperature_kineticenergy.
I don’t see the issue in saying [you don’t know what temperature really is] to someone working with the definition [T = average kinetic energy]. One definition of temperature is always true. The other is only true for idealized objects.

Nobody knows what anything really is. We have more or less accurate models.

What do you mean by “true”? They both can be expressed for any object. They are both equal for idealized objects.
Only one of them actually corresponds with temperature for all objects. They are both equal for one subclass of idealized objects, in which case the “average kinetic energy” definition follows from the entropic definition, not the other way around. All I’m saying is that it’s worth emphasizing that one definition is strictly more general than the other.
Average kinetic energy always corresponds to average kinetic energy, and the amount of energy it takes to create a marginal amount of entropy always corresponds to the amount of energy it takes to create a marginal amount of entropy. Each definition corresponds perfectly to itself all of the time, and applies to the other in the case of idealized objects. How is one more general?
Two systems with the same “average kinetic energy” are not necessarily in equilibrium. Sometimes energy flows from a system with lower average kinetic energy to a system with higher average kinetic energy (e.g., real gases with different degrees of freedom). Additionally, “average kinetic energy” is not applicable at all to some systems, e.g., the Ising magnet.
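A sketch of that first claim using equipartition (the numbers and the idealization are mine, purely for illustration):

```python
k = 1.380649e-23  # Boltzmann constant, J/K

def temperature(avg_kinetic_energy, dof):
    """Equipartition: each quadratic degree of freedom carries (1/2) k T."""
    return 2 * avg_kinetic_energy / (dof * k)

# The diatomic gas has MORE average kinetic energy per molecule,
# but it is spread over 5 degrees of freedom instead of 3.
T_mono = temperature(6.0e-21, 3)  # ~290 K
T_di   = temperature(8.0e-21, 5)  # ~232 K
# T_mono > T_di: heat flows from the system with *lower*
# average kinetic energy to the one with *higher*.
print(T_mono, T_di)
```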
I just mean as definitions of temperature. There’s temperature(from kinetic energy) and temperature(from entropy). Temperature(from entropy) is a fundamental definition of temperature. Temperature(from kinetic energy) only tells you the actual temperature in certain circumstances.
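And here is temperature(from entropy) working on a system where “average kinetic energy” has nothing to say, a two-level spin system like the Ising magnet mentioned above (a sketch with k_B = 1 and unit excitation energy, both my assumptions):

```python
from math import lgamma

def entropy(N, n):
    """Log-multiplicity of n excited spins among N (units of k_B = 1)."""
    return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

def temperature(N, n, eps=1.0):
    """Entropic definition 1/T = dS/dE, via a centered finite difference."""
    dS = (entropy(N, n + 1) - entropy(N, n - 1)) / 2  # per added excitation
    return eps / dS                                    # dE per excitation is eps

N = 10**6
print(temperature(N, 100_000))  # positive T: most spins unexcited
print(temperature(N, 900_000))  # negative T: population inversion
```

The definition even hands you negative temperatures for the inverted magnet, which no kinetic-energy average could.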
Why is one definition more fundamental than another? Why is only one definition “actual”?

Because one is true in all circumstances and the other isn’t? What are you actually objecting to? That physical theories can be more fundamental than each other?
I admit that some definitions can be better than others. A whale lives underwater, but that’s about the only thing it has in common with a fish; it has just about everything else in common with other mammals. You could still make a word to mean “animal that lives underwater”. There are cases where where-it-lives is so important that that alone is sufficient to make a word for it. If you met someone who used the word “fish” to mean “animal that lives underwater”, and used it in contexts where it was clear what it meant (like among other people who also used it that way), you might be able to convince them to change their definition, but you’d need a better argument than “my definition is always true, whereas yours is only true in the special case that the fish is not a mammal”.
The distinction here goes deeper than calling a whale a fish (I do agree with the content of the linked essay).
If a layperson asks me what temperature is, I’ll say something like, “It has to do with how energetic something is” or even “something’s tendency to burn you”. But I would never say “It’s the average kinetic energy of the translational degrees of freedom of the system” because they don’t know what most of those words mean. That latter definition is almost always used in the context of, essentially, undergraduate problem sets as a convenient fiction for approximating the real temperature of monatomic ideal gases—which, again, is usually a stepping stone to the thermodynamic definition of temperature as a partial derivative of entropy.
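For what it’s worth, the problem-set version in code (helium mass and the sample speed are just illustrative numbers):

```python
k = 1.380649e-23   # Boltzmann constant, J/K
m_He = 6.646e-27   # mass of a helium-4 atom, kg

def T_from_vrms(v_rms, m):
    # (1/2) m <v^2> = (3/2) k T for the translational degrees of freedom
    return m * v_rms**2 / (3 * k)

print(T_from_vrms(1350.0, m_He))  # roughly room temperature
```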
Alternatively, we could just have temperature(lay person) and temperature(precise). I will always insist on temperature(precise) being the entropic definition. And I have no problem with people choosing whatever definition they want for temperature(lay person) if it helps someone’s intuition along.
So, effectively there are two different things which go by the same name? Temperature_entropy is one measure (coming from the information-theoretic side) and temperature_kineticenergy is another measure (coming from, um, pre-Hamiltonian mechanics?).
That makes some sense, but then I have a question. If you take an ice cube out of the freezer and put it on a kitchen counter, will it melt if there is no one to watch it? In other words, how does the “temperature is in the mind” approach deal with phase transitions?
They look like two different concepts to me.

I don’t know. I suppose that would depend on how much that mind knows about phase transitions.

Temperature_kineticenergy is probably more relevant in most situations.
That’s difficult to say. If you build a heat pump, you deal with entropy. If you radiate waste heat, you deal with kinetic energy. If you want to know how much waste heat you’re going to have, you deal with entropy. If you significantly change the temperature of something with a heat pump, then you have to deal with both for a large variety of temperatures.
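A sketch of the entropy-side bookkeeping for the heat-pump case, assuming the Carnot idealization is the relevant bound here:

```python
def carnot_cop_heating(T_hot, T_cold):
    # Entropy balance Q_hot / T_hot = Q_cold / T_cold gives the ideal
    # coefficient of performance COP = Q_hot / W = T_hot / (T_hot - T_cold).
    return T_hot / (T_hot - T_cold)  # temperatures in kelvin

print(carnot_cop_heating(295.0, 275.0))  # ~14.8, an upper bound real pumps miss badly
```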
Calling them Temperature_kineticenergy and Temperature_entropy is somewhat misleading, since both involve kinetic energy. Temperature_kineticenergy is average kinetic energy, and Temperature_entropy is the change in kinetic energy necessary to cause a marginal increase in entropy.
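In symbols (a sketch in standard notation; the subscripted names are this thread’s coinage, not the literature’s):

$$T_{\text{kineticenergy}} \equiv \frac{2}{3 k_B}\left\langle \tfrac{1}{2} m v^2 \right\rangle, \qquad \frac{1}{T_{\text{entropy}}} \equiv \left(\frac{\partial S}{\partial E}\right)_{V,N}$$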
Also, if you escape your underscores with backslashes, you won’t get the italics.