If you commit to taking Left, then the predictor, if malevolent, can “mistakenly” “predict” that you’ll take Right, making you burn to death. Just like in the given scenario: “Whoops, a mistaken prediction! How unfortunate and improbable! Guess you have no choice but to kill yourself now, how sad…”
There absolutely is a better strategy: don’t knowingly choose to burn to death.
For the record, I read Nate’s comments again, and I now think of it like this:
To the extent that the predictor was accurate in her line of reasoning, your left-boxing does NOT result in you slowly burning to death. It results in, well, the problem statement being inconsistent, because the following can’t all be true:
The predictor is accurate
The predictor predicts that you right-box, and places the bomb in Left
You left-box
And yes, apparently the predictor can be wrong, but I’d say, who even cares? The probability of the predictor being wrong is supposed to be virtually zero anyway (although as Nate notes, the problem description isn’t complete in that regard).
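The inconsistency claimed above can be checked mechanically. The sketch below (illustrative, not from the original post) encodes the three claims as propositions, with the assumed convention that the bomb goes in Left exactly when the predictor predicts right-boxing, and that an accurate predictor's prediction matches your actual choice; it then enumerates every assignment to show no combination satisfies all three at once.

```python
from itertools import product

def all_claims_hold(predictor_accurate, prediction, choice):
    """Return True iff the three claims from the list above hold together."""
    bomb_in_left = (prediction == "right")  # assumed placement rule
    claims = [
        predictor_accurate,                       # 1. the predictor is accurate
        prediction == "right" and bomb_in_left,   # 2. predicts right-box, bomb in Left
        choice == "left",                         # 3. you left-box
    ]
    # "Accurate" is taken to mean the prediction matches the actual choice.
    accuracy_holds = (not predictor_accurate) or (prediction == choice)
    return all(claims) and accuracy_holds

# Exhaustive check: no assignment makes all three claims true simultaneously.
assert not any(
    all_claims_hold(acc, pred, choice)
    for acc, pred, choice in product(
        [True, False], ["left", "right"], ["left", "right"]
    )
)
```

The only candidate assignment (accurate predictor, right-box prediction, left-box choice) fails precisely because an accurate prediction must match the actual choice.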
We know the error rate of the predictor, so this point is moot.
I still have to see a strategy incorporating this that doesn’t overall lose by losing utility in other scenarios.
How do we know it? If the predictor is malevolent, then it can “err” as much as it wants.
We know it because it is given in the problem description, which is violated if the predictor can “err” as much as it wants.
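Taking the stated error rate as given, the dispute reduces to a back-of-the-envelope expected-cost comparison. The numbers below are assumed stand-ins for illustration only (the original problem's exact stakes and error rate are not reproduced here): a "virtually zero" error probability, a small fee for right-boxing, and a large but finite disutility for burning.

```python
# All three numbers are assumptions for illustration, not from the problem text.
error_rate = 1e-24           # "virtually zero" chance the predictor errs
cost_of_right_boxing = 100   # assumed fee paid whenever you take Right
cost_of_burning = 1e9        # assumed finite disutility of burning to death

# Policy "always left-box": you only burn in the rare case where the
# predictor erred and placed the bomb in Left.
ev_left = error_rate * cost_of_burning

# Policy "always right-box": you pay the fee every time.
ev_right = cost_of_right_boxing

assert ev_left < ev_right  # left-boxing wins in expectation at this error rate
```

Under these assumed numbers the left-boxing policy dominates by many orders of magnitude, which is why the stated error rate, if taken seriously, does most of the argumentative work.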