…haven’t got it backwards and a string of all ‘1’s has nearly zero entropy while a perfectly random string is ‘maximum entropy’.
Ugh. I realize you probably know what you are talking about, but a category error like this is not going to help you explain it: Shannon entropy is a property of a probability distribution, not of any particular string...
Edit: Actually, that sort of thing is not really a problem if your audience is used to the convention where “a random X” means “a probability distribution over Xs”, but if you’re having to introduce information entropy in the first place, that’s probably not the case. The real problem is that the string of all 1s is a distractor: it will make people think the fact that it’s all 1s is relevant, rather than just the fact that it’s a fixed string (see the sketch below).
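A minimal sketch of the distribution-vs-string distinction (Python; the function name and the two toy sources are mine, purely illustrative):

```python
import math

def shannon_entropy(dist):
    """Shannon entropy, in bits, of a discrete distribution given as probabilities."""
    return sum(p * math.log2(1.0 / p) for p in dist if p > 0)

fair_coin = [0.5, 0.5]   # each bit is uniformly random
stuck_coin = [1.0, 0.0]  # this source always emits '1'

print(shannon_entropy(fair_coin))   # 1.0 bit per symbol
print(shannon_entropy(stuck_coin))  # 0.0 bits per symbol

# Under the fair coin, the all-1s string is exactly as probable as any other
# string of the same length; the entropy belongs to the source, not the string.
```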
Edit once again: Oh, did you mean Kolmogorov complexity? Then never mind. “Entropy” without qualification usually means Shannon entropy.
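And a sketch of the Kolmogorov-flavored reading, under which the all-1s example does work: exact Kolmogorov complexity is uncomputable, but compressed size is a rough upper-bound proxy (Python standard library; the byte counts in the comments are approximate):

```python
import random
import zlib

all_ones = b"1" * 1000
random_bytes = bytes(random.getrandbits(8) for _ in range(1000))

# A string with a short description compresses to almost nothing...
print(len(zlib.compress(all_ones)))      # roughly a dozen bytes
# ...while a typical random string is essentially incompressible.
print(len(zlib.compress(random_bytes)))  # close to (or above) 1000 bytes
```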