But at each step, you discover another step you didn’t originally anticipate, and had no priors for anticipating.
If this is the case, your issue is unrelated to sunk cost; it is the Planning Fallacy. You've failed to perform the proper risk analysis and mitigation. The excuse "had no priors for anticipating" is only valid for black-swan events, not for the run-of-the-mill problems every project has.
So, when faced with the situation you describe, one should stop barking up the wrong tree and do a pre-mortem.
Assume you have performed the proper risk analysis and mitigation.
Assume you’ve done this and it has failed to prevent the issues described. What now?
Sorry, your assumptions are untenable for the reasons I described. So, not an interesting hypothetical.

If you are a rational agent and encounter this, then the proper action is to keep going, since the fact that you haven't gotten any closer to your goal is just an unlikely coincidence. In real life it's much more likely that you're just committing the planning fallacy, which is why someone reading this will assume that you'll keep noticing steps you missed instead of actually being right this time.
You keep going. The situation keeps getting worse. You've now spent five times the original estimate, three times as much as the project is worth to you, and your estimate for the last set of tasks has ballooned to 1000x the original, which is four times what your now absurdly updated probability distribution says they'll take. It's still worth doing. What now?
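To make "still worth doing" concrete, here is the sunk-cost-free stopping rule a rational agent is applying at each step (a minimal sketch; the numbers are assumed for illustration):

```python
# Sunk-cost-free stopping rule: continue only if the *remaining*
# expected cost is less than the project's value. The amount already
# spent never appears in the comparison: it is sunk.

def should_continue(value, expected_remaining_cost):
    return expected_remaining_cost < value

value = 100      # what finishing is worth to you (assumed)
spent = 300      # three times the project's worth; deliberately unused
remaining = 80   # the current, already-inflated estimate to finish

print(should_continue(value, remaining))  # True: keep going...
# ...and next week `remaining` has grown again, and the rule still says True.
```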
It’s sort of like asking: a coin lands on heads twenty consecutive times. Do you keep betting tails at even odds? By the way, the coin is fair.
You're giving me massive amounts of evidence that I'm falling prey to the planning fallacy, giving the impression that I'm falling prey to the planning fallacy, and trying to get me to take the appropriate action given that I'm falling prey to the planning fallacy, but you're telling me that I'm not falling prey to the planning fallacy. So which is it? Because if I'm really not falling prey to it, and this really is a coincidence, then I shouldn't give up, because this time I'll be right. I know it sounds unlikely, but that's your fault for picking an unlikely premise. This is to statistics what the trolley problem is to morality.
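To see how strained the "fair coin" premise is, here is the Bayesian arithmetic (a sketch; the 1% prior on a trick coin is an assumption for illustration):

```python
# Posterior that the coin is fair after twenty consecutive heads,
# against a deliberately tiny 1% prior on a two-headed trick coin.

prior_fair, prior_trick = 0.99, 0.01
likelihood_fair = 0.5 ** 20   # ~9.5e-7: twenty heads from a fair coin
likelihood_trick = 1.0        # a two-headed coin always shows heads

posterior_fair = (prior_fair * likelihood_fair) / (
    prior_fair * likelihood_fair + prior_trick * likelihood_trick
)
print(posterior_fair)  # ~9.4e-5: the fairness premise barely survives
```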
Are you feeling like you’re caught in a rationality penny auction yet? That the premise is setting you up to lose no matter what you do except walk away from the problem?
You’re right that you’re not falling prey to the Planning Fallacy. You’re wrong that it’s a coincidence.
The coin is fair. The universe containing the coin is not.
That is, I’m deliberately setting the scenario up so that it’s unwinnable. There is a hidden evil god who is running an unwinnable hypothetical, shifting the goalposts as soon as you near them. What’s the difference between a hypothetical in which there is an evil god and one in which there isn’t, given that you seem to be perpetually unlucky? Nothing tangible to you.
You and a partner are working on the same thing and both expect the same reward (imagine putting up a fence between your yard and your partner-neighbor's), with an expected payoff about 25% above the effort each of you puts in. You've gotten to the three-quarters mark, but now you're doing all of their work, owing to some claimed emergency on their part (they hurt their back, say). What's the difference between the world in which you've fallen prey to the Planning Fallacy and the world in which they're exploiting your unwillingness to drop the project now? Nothing tangible to you. Either way, you're breaking even on a project you expected to make you better off, and they're walking away with double the profit they expected.
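The arithmetic behind "breaking even" checks out (a quick sketch, normalizing the whole job to 1.0):

```python
# Fence scenario, total work normalized to 1.0. Each partner plans to
# do half the work and collect a reward worth 25% more than that effort.

total_work = 1.0
reward_each = 1.25 * (total_work / 2)           # 0.625 per person
planned_profit = reward_each - total_work / 2   # 0.125 per person

# The partner bails at the three-quarters mark; you finish the rest alone.
your_effort = 0.75 / 2 + 0.25   # 0.625: exactly your reward
partner_effort = 0.75 / 2       # 0.375

print(reward_each - your_effort)     # 0.0: your expected profit is gone
print(reward_each - partner_effort)  # 0.25: double the 0.125 they planned on
```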
The Planning Fallacy is a red herring here. The idea of predicting how much longer it will take, in general, is a red herring. It’s all distracting you from a more central problem: avoidance of the Sunk Cost Fallacy makes you vulnerable to a mugging.
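Here is the mugging in miniature (a sketch with assumed numbers): an adversary who controls the remaining-work estimate only has to keep each marginal decision barely worth it.

```python
# A sunk-cost-immune agent can be bled indefinitely by an adversary who
# controls the remaining-work estimate: keep each marginal deal good.

value = 100.0        # what finishing is worth (assumed)
total_spent = 0.0

for step in range(10):
    remaining = 80.0            # the adversary always quotes below `value`
    assert remaining < value    # so the marginal decision says "continue"
    total_spent += remaining    # ...and on "completion" more work appears

print(total_spent)  # 800.0 spent, one locally sound decision at a time,
                    # on a project worth 100
```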
Realistically, our understanding of what's going on is vague enough that there's little reason to differentiate between the planning fallacy and an evil god. We should notice that we're failing more than we expected to and correct for it. We're not intelligent enough to work out precisely why.
If you had a truly rational agent performing Solomonoff induction, it would eventually notice the hidden evil god. It doesn't need the sunk cost fallacy.
If someone keeps claiming they have emergencies, you should eventually notice that it isn't just luck. They might be trying to mug you. They might be really accident-prone. The correct response is the same either way.
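That "eventually notice" can be made concrete with an ordinary Bayesian update standing in for the full induction (a sketch; the priors and emergency rates are assumed for illustration):

```python
# Two hypotheses about the partner: "ordinary luck" (an emergency 10% of
# the time) versus "high emergency rate" (90%), whether from mugging or
# from being accident-prone. Priors are assumed for illustration.

p_ordinary, p_high = 0.95, 0.05
rate = {"ordinary": 0.1, "high": 0.9}

for week in range(1, 6):  # five consecutive claimed emergencies
    joint_o = p_ordinary * rate["ordinary"]
    joint_h = p_high * rate["high"]
    p_ordinary = joint_o / (joint_o + joint_h)
    p_high = joint_h / (joint_o + joint_h)
    print(week, round(p_high, 3))

# By week 3 the high-rate hypothesis dominates (~0.97), with no appeal
# to sunk costs: just updating on the pattern of failures.
```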