The answer is that human minds are not Bayesian, nor is it possible for them to become such. For just about any interesting question you may ask, the algorithm that your brain uses to find the answer is not transparent to your consciousness—and its output doesn’t include a numerical probability estimate, merely a vague and coarsely graded feeling of certainty. The only exceptions are situations where a phenomenon can be modeled mathematically in a way that allows you to work through the probability calculations explicitly, but even then, your confidence that the model captures reality ultimately comes down to a common-sense judgment produced by your non-transparent brain circuits.
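To make the exception concrete, here is a minimal sketch of the kind of case where an explicit calculation is possible: a fully specified diagnostic-test scenario where Bayes' theorem applies directly. The numbers (base rate, sensitivity, false-positive rate) are invented for illustration.

```python
# A rare case where a probability can be computed explicitly, because the
# situation is fully modeled. All parameter values are illustrative.
def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' theorem: P(hypothesis | positive test)."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# A condition with a 1% base rate, a test with 90% sensitivity
# and a 5% false-positive rate:
print(posterior(0.01, 0.90, 0.05))  # ~0.154
```

Even here, note that the confidence you place in the model's assumptions (is the base rate really 1%?) is itself a common-sense judgment, not an output of the calculation.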
In your concrete example, if you’re knowledgeable about politics, you can have a good hunch for how likely a certain future election outcome is. But this insight is produced by a mostly opaque process in your brain, which doesn’t give you any numerical probabilities. This is not a problem you can attack with an explicit mathematical calculation, and even if you devised a way to do so, the output of this calculation would be altogether different from the conclusion you’ll make using common sense, and it makes no sense to assign the probability calculated by the former to the latter.
Therefore, insisting on attaching a numerical probability to your common-sense conclusions makes no sense, except insofar as such numbers are sometimes used as vague figures of speech.
But attaching those estimates is clearly useful.

Consider calibration training: predictionbook.com
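The point of such training is that once you attach numbers, you can score them against outcomes and improve over time. A common scoring rule is the Brier score; the sketch below uses invented forecasts and outcomes purely for illustration.

```python
# Why numerical estimates are trainable: stated probabilities can be scored.
# The Brier score is the mean squared error between forecasts and 0/1 outcomes;
# lower is better, and always guessing 0.5 scores 0.25.
def brier_score(forecasts, outcomes):
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

forecasts = [0.9, 0.7, 0.2, 0.6]   # stated probabilities (illustrative)
outcomes  = [1,   1,   0,   0]     # what actually happened (illustrative)
print(brier_score(forecasts, outcomes))  # 0.125
```

A vague "feeling of certainty" cannot be scored this way; a number, however rough its origin, can be.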