Isn’t the probability of ending up in a real-world situation where the entire world is in terrible danger and only you can save it vastly smaller than the probability of falsely perceiving such a situation? Despite that, I’m glad Petrov made the decision he did. Expected costs and benefits have to be weighed, not just probabilities, but then you are back in ordinary decision theory, or at least in an ordinary but not-yet-invented “decision theory for biased finite agents”.
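To make the point concrete, here is a minimal expected-cost sketch in Python. Every probability and cost in it is an assumed, made-up number chosen only to illustrate that the comparison turns on the magnitudes of the costs, not on which perception (real danger vs. false alarm) is more likely.

```python
# A minimal expected-cost sketch of a Petrov-style decision.
# All numbers are assumptions for illustration, not estimates of anything real.

def expected_cost(p_real: float, cost_if_real: float, cost_if_false: float) -> float:
    """Expected cost of an action, given the chance the warning is real."""
    return p_real * cost_if_real + (1.0 - p_real) * cost_if_false

P_REAL = 0.001  # assumed: the warning is far more likely to be a false alarm

# Costs (arbitrary disutility units) of each action under each state of the world.
report   = expected_cost(P_REAL, cost_if_real=1e6, cost_if_false=1e9)  # escalate: war either way
withhold = expected_cost(P_REAL, cost_if_real=1e8, cost_if_false=0.0)  # absorb a real strike, or nothing happens

print(f"Expected cost of reporting:   {report:,.0f}")
print(f"Expected cost of withholding: {withhold:,.0f}")
# Even though P_REAL is tiny, the comparison is driven by the cost terms,
# which is why "the false-alarm probability is larger" does not by itself settle the question.
```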