Well that depends on whether your aim is to make people have correct beliefs, or whether you want to make people have correct beliefs by following the ritual of rational argument… and I think that EY would claim to be aiming for the former.
What use is it to have correct beliefs if you don’t know they’re correct?
If the belief cannot be conveniently tested empirically, or it would be useless to do so, the only way we can know that our belief is correct is by being confident of the methodology through which we reached it.
When I’m fleeing through an ancient temple with my trusty whip at my side, and I come to a fork in the road, I’ll take the path I believe leads to safety. This will turn out to be a wise choice, because the other one would lead me to a pit full of snakes, falling boulders and, almost certainly, walls that slowly but surely close in. That’s the sound of inevitability.
I naturally prefer having enough evidence to be confident in my beliefs. Given time I would definitely look up the trusty map I was given of the doom-riddled temple. I’d also get someone else to go through ahead of me, just to make sure. However, my beliefs will inevitably determine what decisions I make.
To be honest, I am a little confused about what that question means. It makes no sense to me, though I can see how someone might wrangle their mind into that incoherent state. If they hold a belief but apparently don’t know that they hold it, I assume all their decisions are made in accordance with that belief, but that they will describe it as though they were not confident in it.
“I naturally prefer to have a high level of confidence in my beliefs.”
Doesn’t that depend on how reliable those beliefs are?
If you’re fleeing through the temple pursued by a boulder, you don’t want to dither at an intersection, so your judgment about which direction to take shouldn’t flip from moment to moment. But your confidence needn’t be high to avoid dithering; it need merely be stable.
“I’ll take the path I believe leads to safety. This will turn out to be a wise choice”
If, and only if, your belief is correct. If your belief is wrong, your choice is a disastrous one. Rationality isn’t about being right or choosing the best course; it’s about knowing that you’re right and knowing which course is best.
Then I think I agree with you, mostly. If time or a similarly limited resource makes rigorous justification too expensive, we shouldn’t require it. But whatever we do accept should be minimally justified, even if the justification is just “I have no idea where to go, so I’ll pick at random”.
I wouldn’t look at the map if I were running from the boulder. But I would have looked at it before entering the temple, and you can bet I’d be trying very hard to retrace my steps on the way out, unless I thought I could identify a shortcut. Even then I might not take the gamble.
Thanks Annoyance, I replaced ‘have a high level of confidence’ with ‘having enough evidence to be confident’. That makes my intended meaning clearer.