John did ask about timescales and my answer was that I had no logical way of knowing the answer to that question and was reluctant to just make one up.
[...]
As for guessing the timescales, that actually seems to me much harder than guessing the qualitative answer to the question “Will an intelligence explosion occur?”
Eliezer Yudkowsky says:
[...]
We seem to have caught Yudkowsky in a moment of hypocrisy: he doesn’t know when an intelligence explosion will occur.
The post “I don’t know” is about refusing to assign probability distributions at all. That is entirely different from refusing to assign an overly focused probability distribution when your epistemic state doesn’t give you enough information to justify one; the latter is the technical way to say “I don’t know” when you really don’t know. In this case, I do recall Eliezer saying at some point (something like) that he spends about 50% of his planning effort on scenarios where the singularity happens before 2040(?) and about 50% on scenarios where it happens after 2040. So he clearly does have a probability distribution he’s working with; it’s just that the probability mass is spread pretty broadly.
I agree that “spread-out probability mass” is a good technical replacement for “I don’t know.” Note that the more spread out the distribution is, the less of it is concentrated in the near future. That is, the less confidently you can bet on this particular random variable (time until human extinction), the safer you should feel from human extinction.
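The link between “spread-out” and “high-entropy” can be made concrete with a small numerical sketch. The decade buckets and probabilities below are made up purely for illustration (they are not anyone’s actual estimates); the point is just that a distribution spread evenly over many buckets has higher Shannon entropy and puts less mass on the nearest bucket than a concentrated one:

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical discretizations of "time until the event" into decade buckets.
concentrated = {"2030s": 0.9, "2040s": 0.1}
spread_out = {"2030s": 0.2, "2040s": 0.2, "2050s": 0.2,
              "2060s": 0.2, "2070s+": 0.2}

print(entropy(concentrated))  # ≈ 0.47 bits
print(entropy(spread_out))    # ≈ 2.32 bits (log2 of 5, the maximum for 5 buckets)
# The spread-out distribution has higher entropy *and* assigns less
# probability (0.2 vs. 0.9) to the nearest decade.
```

The uniform five-bucket case attains the maximum possible entropy for five outcomes, which is the formal sense in which maximal spread corresponds to maximal “I don’t know.”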
“50% before 2040” doesn’t sound like such a high-entropy RV to me, though...