add absurdities—where the absurdity is the log probability, so you can add it—rather than averaging them.
This is a very nice measure (which I’ve seen before) and term for it (which I have not seen).
Eliezer, did you develop this yourself? Should I say to other people ‘Artificial-intelligence researcher Eliezer Yudkowsky defines the absurdity of a proposition to be the opposite of the logarithm of its probability, A = –log P.’ as an introduction before I start to use it? (I threw in a minus sign so that higher probability would be lower absurdity; maybe you were taking the logarithm base 1⁄2 so you didn’t have to do that.)
Thirteen years later I come to point out that this would make the entropy of a distribution its expected absurdity, which actually feels deep somehow.
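That identity is easy to check numerically: with absurdity defined as A = −log₂ P, the Shannon entropy of a distribution is exactly the probability-weighted average of the absurdities of its outcomes. A minimal sketch in Python (the three-outcome distribution is an arbitrary example, not from the original discussion):

```python
import math

def absurdity(p):
    """Absurdity of a proposition with probability p: A = -log2(p)."""
    return -math.log2(p)

# A hypothetical example distribution over outcomes.
dist = {"heads": 0.5, "tails": 0.25, "edge": 0.25}

# Entropy as expected absurdity: sum over outcomes of p * A(p).
entropy = sum(p * absurdity(p) for p in dist.values())

# Compare with the standard Shannon entropy formula, -sum p * log2(p).
shannon = -sum(p * math.log2(p) for p in dist.values())
assert math.isclose(entropy, shannon)

print(entropy)  # 1.5 bits for this distribution
```

The two sums are term-by-term identical, so the agreement is exact rather than approximate; the point of the framing is just that "entropy" becomes "how absurd you expect the outcome to be, on average."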