One thing I’m not very confident about is how working scientists use the concept of “macrostate”. If I had good resources for that, I might change some of how the sequence is written, because I don’t want to create any confusion for people who use this sequence to learn and then go on to work in a related field. (...That said, it’s not like people aren’t already confused. I kind of expect most working scientists to be confused about entropy outside their own domain’s exact usage.)
I think it might be a bit of a mess, tbh.
In probability theory, you have outcomes (individual possibilities), events (sets of possibilities), and distributions (assignments of probabilities to all possible outcomes).
“microstate”: outcome.
“macrostate”: sorta ambiguous between event and distribution.
“entropy of an outcome”: not a thing working scientists or mathematicians say, ever, as far as I know.
“entropy of an event”: not a thing either.
“entropy of a distribution”: that’s a thing!
“entropy of a macrostate”: people do say this, so when they say it, “macrostate” must be standing in for a distribution.
I think you’re within your rights to use “macrostate” in any reasonable way that you like. My beef is entirely about the type signature of “entropy” with regard to distributions and events/outcomes.
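To make the type-signature point concrete, here’s a minimal sketch (the die example and the function name are mine, purely for illustration): the thing the Shannon formula eats is a distribution, not an outcome and not an event.

```python
import math

def entropy(distribution):
    """Shannon entropy (in bits) of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in distribution.values() if p > 0)

# Outcomes ("microstates") of a fair six-sided die.
outcomes = [1, 2, 3, 4, 5, 6]

# An event is a set of outcomes, e.g. "the roll is even".
even = {2, 4, 6}

# A distribution assigns a probability to every outcome.
uniform = {o: 1 / 6 for o in outcomes}

print(entropy(uniform))  # ~2.585 bits: entropy of a *distribution* is well-defined
# entropy(3) or entropy(even) has no standard meaning -- wrong "type" for entropy.
```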