As a human mind, I have a built-in default system of beliefs. That system is a crude “sounds plausible” intuition. It mostly works pretty well, but it isn’t perfect.
This crude system heard about probability theory and attached a “seems true” marker to it. The background system, as it operated before I learned probability theory, roughly approximates part of probability theory. But it isn’t a system that produces explicit numbers.
So I can’t assign a probability to Bayesianism being true, because the part of my mind that decided it was true isn’t using explicit probabilities, just feelings.