(As likelihood ratios get weaker, i.e. closer to 1, your priors need to be better and your updates more accurate.)
It seems to me that rationality is more about updating the correct amount, which is primarily a matter of calculating the likelihood ratio correctly. Most of the examples of philosophical errors you’ve discussed come from miscalculating that ratio, not from starting out with a bizarre prior.
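To spell out the bookkeeping (this is just standard Bayes’ theorem in odds form, nothing beyond it): the prior and the evidence enter as separate factors, and the likelihood ratio is the entire update, so “updating the correct amount” is exactly “getting the last factor right”:

\[
\underbrace{\frac{P(H \mid E)}{P(\neg H \mid E)}}_{\text{posterior odds}}
\;=\;
\underbrace{\frac{P(H)}{P(\neg H)}}_{\text{prior odds}}
\times
\underbrace{\frac{P(E \mid H)}{P(E \mid \neg H)}}_{\text{likelihood ratio}}
\]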
For example, consider Yvain and the Case of the Visual Imagination:
Upon hearing this, my response was “How the stars was this actually a real debate? Of course we have mental imagery. Anyone who doesn’t think we have mental imagery is either such a fanatical Behaviorist that she doubts the evidence of her own senses, or simply insane.”
This looks like having the same prior as many other people; the rationality was in actually running the experiment and calculating the likelihood ratio, which was able to overcome the extreme prior. You could say that Galton only considered this because he had a non-extreme prior, and that if people trusted their intuitions less and had more curious agnosticism, their beliefs would converge faster. But it seems to me that the curiosity (i.e., looking for evidence that favors one hypothesis over another) is more important than the agnosticism: the goal is not “I could be wrong” but “I could be wrong if X.”
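To see how a strong likelihood ratio can swamp an extreme prior, here is a minimal sketch in Python. The numbers are invented for illustration only (the 999:1 prior and the 10,000:1 evidence are hypothetical, not Galton’s actual figures):

```python
def update_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    return prior_odds * likelihood_ratio

def odds_to_probability(odds: float) -> float:
    """Convert odds O(H) to probability P(H) = O / (1 + O)."""
    return odds / (1.0 + odds)

# Invented numbers for illustration only: suppose Galton started at 999:1
# odds that everyone has mental imagery (P = 0.999), and his survey results
# were 10,000 times likelier under "some people lack imagery" than under
# "everyone has it" (a likelihood ratio of 1/10,000 for the original hypothesis).
posterior_odds = update_odds(prior_odds=999.0, likelihood_ratio=1.0 / 10_000.0)
print(odds_to_probability(posterior_odds))  # ~0.091: the extreme prior is overcome
```

The point of the sketch is that the prior only sets the starting line; with the likelihood ratio calculated honestly, even 999:1 confidence gets dragged below 10% by sufficiently one-sided evidence.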