The post “I don’t know” is about refusing to assign probability distributions at all. That’s entirely different from refusing to assign an overly focused probability distribution when your epistemic state doesn’t actually give you enough information to justify that focus; the latter is the technical way to say “I don’t know” when you really don’t know. In this case I do recall Eliezer saying at some point (something like) that he spends about 50% of his planning effort on scenarios where the singularity happens before 2040(?) and about 50% on scenarios where it happens after, so he clearly does have a probability distribution he’s working with; it’s just that the probability mass is spread pretty broadly.
I agree that “spread out probability mass” is a good technical replacement for “I don’t know.” Note that the more spread out it is, the less of it is concentrated in the near future. That is, the less confidently you can bet on this particular random variable (time until human extinction), the safer you should feel from human extinction.
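To make “spread out” concrete, here’s a rough sketch with made-up numbers, using Shannon entropy as one measure of spread. Both forecasts below are hypothetical, just to illustrate the difference between concentrating mass on one decade and spreading it over many:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log(0) is treated as 0
    return -np.sum(p * np.log2(p))

# Two hypothetical forecasts over eight decades (2010s..2080s), made-up numbers:
# one piles most of its mass on a single decade, the other spreads it out.
concentrated = [0.05, 0.80, 0.10, 0.05, 0.00, 0.00, 0.00, 0.00]
spread_out   = [0.10, 0.15, 0.15, 0.10, 0.15, 0.10, 0.15, 0.10]

print(f"concentrated forecast: {entropy_bits(concentrated):.2f} bits")  # ~1.0 bit
print(f"spread-out forecast:   {entropy_bits(spread_out):.2f} bits")    # ~3.0 bits (max for 8 bins is 3)
```

The spread-out forecast is close to the 3-bit maximum for eight bins, which is the quantitative sense in which it says “I don’t know.”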
“50% before 2040” doesn’t sound like such a high-entropy RV to me, though...
We seem to have caught Yudkowsky in a moment of hypocrisy: he doesn’t know when an intelligence explosion will occur.