Realistically, our understanding of the situation is vague enough that there’s little reason to differentiate between the planning fallacy and an evil god. We should notice that we’re failing more often than we expected to and correct for it. We’re not intelligent enough to work out precisely why.
If you had a truly rational agent performing Solomonoff induction, it would eventually notice the hidden evil god. It doesn’t need the sunk cost fallacy to do so.
If someone keeps claiming they have emergencies, you should eventually notice that it isn’t just bad luck. They might be trying to mug you. They might be genuinely accident-prone. The correct response is the same either way.
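To make the “eventually notice” step concrete, here is a minimal Bayesian sketch (two hypotheses, nothing like full Solomonoff induction; the hypothesis names, priors, and likelihoods are illustrative assumptions, not anything from the original argument). The point is only that repeated overruns push the posterior toward “something is systematically against my plans,” and the corrective action is the same whether that something is a bias or an adversary.

```python
# A minimal sketch, assuming made-up numbers: a Bayesian agent comparing two
# hypotheses about why its projects keep running over. Not Solomonoff induction
# proper -- just enough to show that repeated failures get noticed.

priors = {
    "honest_noise": 0.95,      # overruns are ordinary, unbiased estimation error
    "hidden_adversary": 0.05,  # something systematically inflates costs
}

# Assumed probability of observing "this project ran over" under each hypothesis.
likelihood_of_overrun = {
    "honest_noise": 0.5,       # unbiased estimates should overrun about half the time
    "hidden_adversary": 0.9,   # a bias or adversary makes overruns the norm
}

def update(posterior, observed_overrun):
    """One Bayesian update on a single project outcome."""
    new = {}
    for h, p in posterior.items():
        like = likelihood_of_overrun[h] if observed_overrun else 1 - likelihood_of_overrun[h]
        new[h] = p * like
    total = sum(new.values())
    return {h: p / total for h, p in new.items()}

posterior = dict(priors)
for project in range(10):  # ten projects in a row run over
    posterior = update(posterior, observed_overrun=True)
    print(project + 1, {h: round(p, 3) for h, p in posterior.items()})

# After enough consecutive overruns, "hidden_adversary" (or, equivalently,
# "my estimates are systematically biased") dominates. Either way the correction
# is the same: pad future estimates. No sunk cost fallacy required.
```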