One trick that might help here is to consider not the beliefs themselves but the actions taken on those beliefs. Just because you have a 0.0001% credence that the Moon is made of green cheese doesn’t mean you can build 0.0001% of a spaceship and hop over for a meal—you have to either build the ship or not build it, and the expected return from building is going to be decidedly negative. Likewise, just because there’s some chance that humans and chimpanzees share 98% of their DNA entirely by accident does not mean it’s rational to act on that chance, even if you go to the mental effort of considering the possibility.
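To make the build-or-not asymmetry concrete, here is a minimal sketch; the credence, the ship cost, and the meal value are all illustrative assumptions, not figures from anywhere:

```python
# Expected-value sketch of the all-or-nothing decision.
# Every number below is made up for illustration.
P_CHEESE = 0.000001         # a 0.0001% credence that the Moon is green cheese
SHIP_COST = 1_000_000_000   # hypothetical cost of building the spaceship
MEAL_VALUE = 1_000          # hypothetical value of the meal, if the belief is true

# The action is binary: there is no such thing as 0.0001% of a spaceship.
ev_build = P_CHEESE * MEAL_VALUE - SHIP_COST
ev_dont_build = 0.0

print(f"EV(build)       = {ev_build:,.3f}")       # -999,999,999.999
print(f"EV(don't build) = {ev_dont_build:,.3f}")  # 0.000
# The tiny credence can't express itself as a fractional action; the whole
# cost of the ship lands either way, so the rational act ignores the sliver.
```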
Granted, this approach is likely to just confuse people, perhaps making them think they are “allowed” to hold unlikely beliefs as long as they don’t act on said beliefs… but maybe it’s worth a try in the right situation?
This is deep wisdom. It also has a lot of resonance with the issue of risk, and what sorts of risks it is rational to take.
(And don’t tell me “expected utility”, because either the utility is what you’d straightforwardly expect—10,000 people = 1 person × 10,000—and you run into all sorts of weird conclusions, or else you do what von Neumann and Morgenstern did and redefine “utility” to mean “whatever it is you use to choose”. Great; now what do I use to choose?)
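To see both horns at once, here is a small sketch; the two rescue gambles and the particular concave curve are my own illustrative assumptions. Taking 10,000 people = 1 person × 10,000 at face value, a one-in-a-million shot at a huge rescue beats a certain small one; bend the utility curve to avoid that, and the bend itself is doing the choosing.

```python
import math

# Two rescue gambles (illustrative numbers only):
#   A: save 9 people for certain
#   B: a one-in-a-million chance of saving 10,000,000 people, else nobody
n_a = 9
p_b, n_b = 1e-6, 10_000_000

# Horn one, linear utility: 10,000 people = 1 person * 10,000, at face value.
ev_a = n_a        # 9.0
ev_b = p_b * n_b  # 10.0 -> the long shot "wins", which many find weird
print(f"linear:  A={ev_a}, B={ev_b}")

# Horn two, redefined utility: pick some concave curve instead. This log is
# an arbitrary choice, which is exactly the problem: the curve IS the choice.
def u(n: float) -> float:
    return math.log1p(n)

eu_a = u(n_a)        # ~2.3026
eu_b = p_b * u(n_b)  # ~0.0000161 -> now the sure thing wins
print(f"concave: A={eu_a:.4f}, B={eu_b:.7f}")
```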