Are you feeling like you’re caught in a rationality penny auction yet? That the premise is setting you up to lose no matter what you do except walk away from the problem?
You’re right that you’re not falling prey to the Planning Fallacy. You’re wrong that it’s a coincidence.
The coin is fair. The universe containing the coin is not.
That is, I’m deliberately setting the scenario up so that it’s unwinnable. There is a hidden evil god who is running an unwinnable hypothetical, shifting the goalposts as soon as you near them. What’s the difference between a hypothetical in which there is an evil god and one in which there isn’t, given that you seem to be perpetually unlucky? Nothing tangible to you.
You and a partner are working on the same thing, and both expect the same reward (imagine it's putting up a fence between your partner-neighbor's yard and your own), which is worth about 25% more than the effort you put into it. You've gotten to the three-quarters mark, but now you're doing all their work, owing to a claimed emergency on their part (say they hurt their back). What's the difference between the world in which you've fallen prey to the Planning Fallacy, and the world in which they're exploiting your unwillingness to drop the project now? Nothing tangible to you. Either way, you're breaking even on a project you expected to make you better off, and they're nearly doubling the return on the effort they invested.
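To make the break-even claim concrete, here's a back-of-the-envelope sketch in Python, with numbers assumed for illustration (the scenario doesn't pin them down exactly): total effort of 2 units split evenly, each party's reward worth 25% more than their planned effort, and the partner dropping out at the three-quarters mark.

```python
# Illustrative numbers only; the scenario doesn't specify these exactly.
expected_effort_each = 1.0                  # what each party planned to contribute
reward_each = 1.25 * expected_effort_each   # "about 25% above the effort you put into it"

# The partner bows out at the three-quarters mark of the whole project,
# so each side has contributed 0.75 units and 0.5 units of work remain.
effort_done_each = 0.75
remaining = 2 * expected_effort_each - 2 * effort_done_each   # 0.5 units

# You finish the rest alone.
your_total_effort = effort_done_each + remaining   # 1.25
partner_total_effort = effort_done_each            # 0.75

print(f"your net:    {reward_each - your_total_effort:+.2f}")     # +0.00, breaking even
print(f"partner net: {reward_each - partner_total_effort:+.2f}")  # +0.50, clearly ahead
```

Under those assumptions you put in 1.25 units of effort for a 1.25-unit reward, while the partner puts in 0.75 units for the same reward.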
The Planning Fallacy is a red herring here. The idea of predicting how much longer it will take, in general, is a red herring. It’s all distracting you from a more central problem: avoidance of the Sunk Cost Fallacy makes you vulnerable to a mugging.
Realistically, our understanding of the world is vague enough that there's little reason to differentiate between the Planning Fallacy and an evil god. We should notice that we're failing more often than we expected to and correct for it; we're not intelligent enough to work out precisely why.
A truly rational agent performing Solomonoff induction would eventually notice the hidden evil god. It doesn't need the Sunk Cost Fallacy.
If someone keeps claiming they have emergencies, you should eventually notice that it isn't just luck. They might be trying to mug you. They might be genuinely accident-prone. The correct response is the same either way.
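Here's a minimal sketch of that "eventually notice" point as a Bayesian update; this is my framing, not the original's, and the 10%/90% rates are assumed purely for illustration. The two hypotheses are "this partner will keep having emergencies" (whether from mugging or accident-proneness) versus "this was a one-off," and the update doesn't need to distinguish the two causes to justify the same corrective response.

```python
# Assumed, illustrative rates: an ordinary partner has a genuine emergency on
# ~10% of projects; a partner who will reliably leave you holding the bag
# (by exploitation or chronic bad luck) does so ~90% of the time.
p_pattern = 0.1             # prior: this partner will keep doing this
p_claim_if_pattern = 0.9    # per-project chance of an "emergency" if so
p_claim_if_oneoff = 0.1     # per-project chance of a genuine one-off emergency

for n in range(1, 6):
    like_pattern = p_pattern * p_claim_if_pattern ** n
    like_oneoff = (1 - p_pattern) * p_claim_if_oneoff ** n
    posterior = like_pattern / (like_pattern + like_oneoff)
    print(f"after {n} claimed emergencies: P(this will keep happening) = {posterior:.2f}")
```

With these assumed rates, one emergency leaves the posterior at 0.50; by the second or third, continuing to shoulder the extra work stops looking like a run of bad luck and starts looking like a predictable loss.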