I’m happy to report that I have made the decision to wear seat belts without evaluating anything using probability. If the justification is really:
but even a small chance that it might prevent harm is generally thought to be worth the effort to put it on
Then you’re not explicitly assigning probabilities. Change ‘small chance’ to ‘5%’ and I’d wonder how you got that number, and what would happen if the chance were 4.99%.
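One way to see why the exact number (5% vs. 4.99%) rarely matters: in a simple expected-utility sketch, a precaution is worth taking whenever p × (cost of harm) exceeds the cost of the precaution, and for seat belts that threshold is tiny. The numbers below are invented purely for illustration; nothing in the discussion supplies real figures.

```python
# Illustrative expected-utility sketch. All numbers are made up;
# the point is the shape of the decision, not the figures.
# A precaution is worth taking when p * H > c, i.e. p > c / H.

def worth_taking(p_harm, cost_of_harm, cost_of_precaution):
    """Return True if the expected harm avoided exceeds the precaution's cost."""
    return p_harm * cost_of_harm > cost_of_precaution

cost_of_precaution = 1.0     # effort of buckling up, arbitrary units
cost_of_harm = 1_000_000.0   # severity of an unbelted crash, same units

# The break-even probability is c / H = 1e-06. Both 5% and 4.99% are
# orders of magnitude above it, so the decision is completely
# insensitive to which of the two numbers you picked.
print(worth_taking(0.05, cost_of_harm, cost_of_precaution))    # True
print(worth_taking(0.0499, cost_of_harm, cost_of_precaution))  # True
```

This is why the "what if it were 4.99%?" worry dissolves: the decision only flips near the threshold, and for cheap precautions against large harms the threshold is nowhere near either number.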
How did you make the decision to wear seat belts, then? If it is because you were taught to at a young age, or because it is the law, then can you think of any safety precaution you take (or don’t take) because it prevents or mitigates a problem that you believe has less than a 50% chance of occurring on any particular occasion when you don’t take the precaution?
Then you’re not explicitly assigning probabilities.
Often we make decisions based on our vague feelings of uncertainty, which are difficult to describe as a probability that could be communicated to others or explicitly analyzed mathematically. This difficulty is a failure of introspection, but the uncertainty we feel does somewhat approximate Bayesian probability theory. Many biases represent the limits of this approximation.
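To make "the uncertainty we feel does somewhat approximate Bayesian probability theory" concrete, here is a minimal Bayes-rule sketch. The hypothesis, evidence, prior, and likelihoods are all invented for illustration; the claim is only that this is the computation a vague feeling roughly approximates.

```python
# A minimal Bayes'-rule sketch of what a "vague feeling of uncertainty"
# approximates. The prior and likelihoods are invented for illustration.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior P(H | E) via Bayes' rule."""
    joint_true = prior * likelihood_if_true
    joint_false = (1 - prior) * likelihood_if_false
    return joint_true / (joint_true + joint_false)

# Hypothesis: "it will rain today". Evidence: dark clouds this morning.
prior = 0.2                # assumed base rate of rain
p_clouds_given_rain = 0.9  # assumed: clouds usually precede rain
p_clouds_given_dry = 0.3   # assumed: clouds also happen on dry days

posterior = bayes_update(prior, p_clouds_given_rain, p_clouds_given_dry)
print(round(posterior, 3))  # 0.429
```

Nobody runs this arithmetic consciously, which is exactly the point: the felt shift from "probably not" toward "maybe" tracks the same direction as the update, and biases show up where the felt shift and the arithmetic diverge.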
On things I care about, I find the best position I can and act as though I’m 100% certain of it. When another position is shown to be superior, I reject the original view entirely.
with the implicit assumption that “best positions” are about states of the world, and not synonymous with “best decisions”.
I guess we need to go back to Z. M. Davis’s last paragraph, reproduced here for your convenience:
I agree that it would be incredibly silly to try to explicitly calculate all your beliefs using probability theory. But a qualitative or implicit notion of probability does seem natural. You don’t have to think in terms of likelihood ratios to say things like “It’s probably going to rain today” or “I think I locked the door, but I’m not entirely sure.” Is this the sort of thing that you mean by the word judgment? In any case, even if bringing probability into the process isn’t helpful, bringing in this dichotomy between absolutely-certain-until-proven-otherwise and complete ignorance seems downright harmful. I mean, what do you do when you think you’ve locked the door, but you’re not entirely sure? Or does that just not happen to you?
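The door-locking case can be sketched the same way: a graded belief plus a cost comparison tells you when to go back and check, which is exactly what the certain-until-proven-otherwise / complete-ignorance dichotomy cannot express. All numbers below are invented for illustration.

```python
# "I think I locked the door, but I'm not entirely sure":
# a graded belief plus a cost comparison, rather than a binary
# certain/ignorant dichotomy. Numbers are invented for illustration.

def should_go_back(p_locked, cost_of_checking, cost_if_unlocked):
    """Go back and check iff the expected loss from not checking is larger."""
    expected_loss = (1 - p_locked) * cost_if_unlocked
    return expected_loss > cost_of_checking

cost_of_checking = 2.0    # minutes lost walking back, arbitrary units
cost_if_unlocked = 200.0  # assumed expected cost of leaving it unlocked

# "Pretty sure I locked it" -> keep walking; "really not sure" -> go check.
print(should_go_back(0.995, cost_of_checking, cost_if_unlocked))  # False
print(should_go_back(0.5, cost_of_checking, cost_if_unlocked))    # True
```

On the dichotomy view both cases collapse to the same answer; the qualitative "not entirely sure" only does useful work once belief comes in degrees.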