Now having read the rest of the essay… I guess “maximum entropy” is just straight-up confusing if you don’t insert the “...given assumptions XYZ”. Otherwise it sounds like there’s such a thing as “the maximum-entropy distribution”, which doesn’t exist: you have to cut up the possible worlds somehow, and different ways of cutting them up produce different uniform distributions. (Or in the continuous case, you have to choose a measure in order to do integration, and that measure contains just as much information as a probability distribution; the uniform measure says that all years are the same, but you could instead say all orders of magnitude of time since the Big Bang are the same, or something else.) So how you cut up possible worlds changes the uniform distribution, i.e. the maximum-entropy distribution, and the assumptions that go into how you cut up the worlds determine your maximum-entropy distribution.
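(A minimal numerical sketch of that point, mine rather than the essay’s, assuming NumPy and using ~13.8 billion years as an illustrative age of the universe: a distribution that’s uniform over years is wildly non-uniform over orders of magnitude of time, so the two parameterizations disagree about which distribution counts as “the” uniform one.)

```python
import numpy as np

rng = np.random.default_rng(0)

# "Uniform over years": every year since the Big Bang equally likely.
# (13.8e9 years is an illustrative figure for the age of the universe.)
years = rng.uniform(1.0, 13.8e9, size=1_000_000)

# The same samples viewed on a log scale, i.e. as orders of magnitude.
log_years = np.log10(years)

# Under the years-uniform distribution, ~90% of the mass sits in the last
# order of magnitude, so it is anything but uniform over log-time.
print((years > 1.38e9).mean())  # ~0.9
counts, _ = np.histogram(log_years, bins=10, range=(0.0, 10.14))
print(counts / 1e6)  # almost all mass lands in the final bin
```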
Hold on, I guess this actually means that, under a natural interpretation of “entropy”, the claim “generic uncertainty about maybe being wrong, without other extra premises, should increase the entropy of one’s probability distribution over AGI” is false. If by “entropy” we mean “entropy according to the uniform measure”, it’s false; what we should really mean is entropy relative to one’s maximum-entropy distribution (taken as the background measure), in which case the statement is true.
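(To spell out what that means, in my notation rather than the essay’s: the coordinate-free notion here is entropy relative to a background measure \(m\), i.e. negative KL divergence,

$$H_m(p) \;=\; -\int p(x)\,\log\frac{p(x)}{m(x)}\,dx \;=\; -D_{\mathrm{KL}}(p \,\|\, m),$$

which, for normalized \(m\), satisfies \(H_m(p) \le 0\) with equality exactly when \(p = m\). So “increase the entropy” is only well-defined once \(m\) is fixed: taking \(m\) to be the uniform-over-years measure gives the false reading of the claim, while taking \(m\) to be one’s own maximum-entropy distribution gives the reading on which it’s true.)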