If OP were an entropy, then we’d simply do a weighted sum 1/2(OP(X4)+OP(X7))=1/2(1+3)=2, and then add one extra bit of entropy to represent our (binary) uncertainty as to what state we were in, giving a total OP of 3.
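A minimal sketch of that arithmetic, assuming the standard mixing rule for entropies, S(mix) = p1·S1 + p2·S2 + H(p1, p2); the names `op_values` and `shannon_entropy` are illustrative, not from the original:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# OP values for the two states in the example above (taken as given).
op_values = {"X4": 1, "X7": 3}

# Equal-weight mixture of X4 and X7.
probs = [0.5, 0.5]

# If OP behaved like an entropy, the mixture's OP would be the weighted
# sum of the components' OPs ...
weighted_sum = sum(p * op for p, op in zip(probs, op_values.values()))  # 2.0

# ... plus one extra bit for the (binary, 50/50) uncertainty about
# which state we were in.
total_op = weighted_sum + shannon_entropy(probs)  # 3.0

print(weighted_sum, total_op)
```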
I feel like you’re doing something wrong here. You’re mixing state-distribution entropy with probability-distribution entropy. If you introduce mixed states, shouldn’t each mixed state be accounted for in the phase space that you calculate the entropy over?
If you go down the “entropy is ignorance about the exact microstate” route, this makes perfect sense. And various people have made convincing-sounding arguments that this is the right way to see entropy, though I’m no expert myself.
I’m not an expert either. However, the OP function has nothing to do with ignorance or probabilities until you introduce them in the mixed states. It seems to me that this standard combining rule is not valid unless you’re combining probabilities.
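To spell out the reasoning step: the “standard combining rule” used above is Shannon entropy’s grouping property, whose derivation goes through the mixing probabilities. Stated here for reference for a binary mixture, under the assumption that the two components have disjoint support:

```latex
% Grouping property of Shannon entropy for a binary mixture
% (components assumed to have disjoint support):
H(\text{mixture}) = p\,H_1 + (1 - p)\,H_2 + H(p,\, 1 - p)
% For p = 1/2 this gives (H_1 + H_2)/2 + 1 bit, as in the example above.
```

Applying this rule to OP presupposes that OP is an entropy over a probability distribution, which is exactly what is in question.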
Hence OP is not an entropy.