In probability theory, you have outcomes (individual possibilities), events (sets of possibilities), and distributions (assignments of probabilities to all possible outcomes).
“microstate”: outcome.
“macrostate”: somewhat ambiguous between an event and a distribution.
“entropy of an outcome”: not a thing working scientists or mathematicians say, ever, as far as I know.
“entropy of an event”: not a thing either.
“entropy of a distribution”: that’s a thing!
“entropy of a macrostate”: people do say this, so when they do, they must mean a distribution.
I think you’re within your rights to use “macrostate” in any reasonable way that you like. My beef is entirely about the type signature of “entropy” with regard to distributions and events/outcomes.
I think it might be a bit of a mess, tbh.
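To make the type-signature point concrete, here is a minimal Python sketch (the die example and all names here are mine, purely for illustration): entropy is a function of a distribution, and neither an outcome nor an event by itself carries enough information to evaluate it.

```python
import math

# Outcomes: individual possibilities (one roll of a fair six-sided die).
outcomes = [1, 2, 3, 4, 5, 6]

# An event: a set of outcomes.
even = {2, 4, 6}

# A distribution: an assignment of probabilities to all possible outcomes.
fair_die = {o: 1 / 6 for o in outcomes}

def entropy(dist):
    """Shannon entropy in bits. Note the type: the argument is a distribution,
    i.e. a mapping outcome -> probability, not an outcome and not an event."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

print(entropy(fair_die))  # ~2.585 bits, i.e. log2(6)

# entropy(3) or entropy(even) would be conceptually ill-typed: a single outcome
# or a bare set of outcomes carries no probabilities, so there is nothing to sum.
```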