Is Humbali right that generic uncertainty about maybe being wrong, without other extra premises, should increase the entropy of one’s probability distribution over AGI, thereby moving out its median further away in time?
I’ll add a quick answer: my gut says technically true, but mostly I should just look at the arguments, since they carry more weight than the prior. Strong evidence is common. It seems plausible to me that a prior over ‘number of years away’ should make me predict it’s more like 10 trillion years away or something, but that getting to observe humans and the industrial revolution has already moved me to “likely in the next one thousand years,” such that remembering this prior is no longer very informative.
My answer: technically true but practically irrelevant.
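To make the intuition concrete: here is a toy sketch of how strong evidence can swamp a very diffuse prior. The numbers are entirely made up for illustration — a log-uniform prior over 1 to 10¹³ years, and an invented likelihood (standing in for “observing humans and the industrial revolution”) that favors short horizons. The point is only that the posterior median ends up set by the evidence, not by the prior’s enormous spread.

```python
import numpy as np

# Work in log-space: x = log10(years until AGI).
x = np.linspace(0, 13, 1_000_000)          # spans 1 year to 10^13 years
years = 10.0 ** x

# Diffuse prior: log-uniform over years (uniform in x).
prior = np.full_like(x, 1.0 / len(x))

# Made-up likelihood representing strong evidence for short horizons.
# (Purely illustrative; not a real model of the evidence.)
likelihood = np.exp(-years / 500.0)

posterior = prior * likelihood
posterior /= posterior.sum()

def median(weights, values):
    """Value at which cumulative probability first reaches 0.5."""
    return values[np.searchsorted(np.cumsum(weights), 0.5)]

print(f"prior median:     {median(prior, years):.3g} years")
print(f"posterior median: {median(posterior, years):.3g} years")
```

Under these toy assumptions the prior median sits in the millions of years, while the posterior median lands well under a thousand — so adding entropy back to the prior barely moves the posterior once the likelihood dominates.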
Hmm, alas, stopped reading too soon.