Eliezer’s reasoning in the blue tentacle situation is wrong. (This has long been obvious to me, but didn’t deserve its own post.) An explanation with high posterior probability conditioned on a highly improbable event doesn’t need to have high prior probability. So your ability to find the best available explanation for the blue tentacle after the fact doesn’t imply that you should’ve been noticeably afraid of it happening beforehand.
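The point about posteriors and priors can be made concrete with a minimal Bayes calculation. The numbers below are purely illustrative (not from the discussion): an explanation H with a tiny prior can still end up with posterior near 1, conditional on an event E that was itself wildly improbable beforehand.

```python
# Illustrative sketch: a low-prior hypothesis H that fully explains
# a very improbable event E. All numbers are made up for illustration.
p_h = 1e-9               # prior of the explanation H: very low
p_e_given_h = 1.0        # H predicts E with certainty
p_e_given_not_h = 1e-12  # E is nearly impossible without H

# Total probability of the event E (law of total probability):
p_e = p_h * p_e_given_h + (1 - p_h) * p_e_given_not_h

# Bayes' rule: posterior of H given that E actually happened.
posterior = p_h * p_e_given_h / p_e
# posterior is close to 1, even though both p_h and p_e are tiny:
# finding a good explanation after the fact does not imply E was
# something you should have expected beforehand.
```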
Also, if you accept the blue tentacle reasoning, why didn’t you apply it to all those puzzles with Omega?

You are right. I read it too long ago to remember enough detail to revise my cached thought about the section’s content.
It’s wrong both formally and for humans: a hypothesis can have enough probability mass to pay rent while still being “fractal” enough to pick out nontrivial subsets of tiny improbable events.
Suppose a random number generator selects a random 100-digit number, but is known to select odd numbers 100 times as often as even ones. When you see a specific odd number, the appearance of that specific number is an incredibly improbable event, and yet you have a perfectly good explanation for why it is odd.
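The generator example can be sketched in a few lines. This is a hypothetical implementation of the setup as stated (odd 100-digit numbers drawn 100 times as often as even ones); the simulation shows that the coarse property "odd" is highly predictable even though every specific draw is astronomically improbable.

```python
import random

def biased_sample(rng):
    """Draw a 100-digit number, odd with probability 100/101
    (the assumed bias from the example)."""
    lo, hi = 10**99, 10**100 - 1
    n = rng.randrange(lo, hi + 1)
    if rng.random() < 100 / 101:
        return n | 1        # force the last bit on: odd
    return n & ~1           # force the last bit off: even

rng = random.Random(0)
draws = [biased_sample(rng) for _ in range(10_000)]
odd_frac = sum(d % 2 for d in draws) / len(draws)
# odd_frac comes out near 100/101 ≈ 0.990, even though each
# specific 100-digit draw had prior probability on the order
# of 10**-100: "it's odd" is explainable, "it's this exact
# number" was never something to expect in advance.
```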
The only valid message in that section was that hindsight bias can distort one’s ability to explain unlikely events.