The problem with this is that those occupying the alternative epistemic position likewise think that your position is not as well-informed and well-supported as their own. Are they justified as well?
Yes, the key issue is not so much whether on a first analysis you came to think those other folks are not as well informed as you, but whether you would have thought that had you been taught by them. The issue is how to overcome the numerous easy habits of assuming that what you were taught must have been better. Once you see that on a simple first analysis you would each think the other less informed, you must recognize that the problem is harder than you had realized, and re-evaluate your reasons for so easily thinking they are wrong and you are right. Until you can find a style of analysis that would have convinced you, had you grown up among them, to convert to this side, it is hard to believe you’ve overcome this bias.
Robin Hanson just ended a post with the phrase “overcome bias.” This feels momentous, like theme music should be playing.
May I suggest the following?
http://www.youtube.com/watch?v=cSZ55X3X4pk
http://tvtropes.org/pmwiki/pmwiki.php/Main/TitleDrop
“The issue is how to overcome the numerous easy habits of assuming that what you were taught must have been better.”

Well, that’s one issue. But I was addressing a different, more theoretical issue: namely, whether acknowledging the contingency of one’s beliefs (i.e. that one would have believed differently if raised differently) necessarily undermines epistemic justification.
(Recall the distinction between third-personal ‘accounts’ of rational justification and first-personal ‘instruction manuals’.)
“Necessarily” is an extremely strong claim, making it overwhelmingly likely that such a claim is false. So why ever would that be an interesting issue? And to me, first-person instruction manuals seem obviously more important than third-person “accounts”.
I get the impression that many (even most) of the commenters here think that acknowledged contingency thereby undermines a belief. But if you agree with me that this is much too quick, then we face the interesting problem of specifying exactly when acknowledged contingency undermines justification.
I don’t know what you mean by “important”. I would agree that the instruction manual question is obviously of greater practical importance, e.g. for those whose interest in the theory of rationality is merely instrumental. But to come up with an account of epistemic justification seems of equal or greater theoretical importance, to philosophers and others who have an intrinsic interest in the topic.
It’s also worth noting that the theoretical task could help inform the practical one. For example, the post on ‘skepticism and default trust’ (linked in my original comment) argues that some self-acknowledged ‘epistemic luck’ is necessary to avoid radical skepticism. This suggests a practical conclusion: if you hope to acquire any knowledge at all, your instruction manual will need to avoid being too averse to this outcome.
The vast majority of claims people make in ordinary language are best interpreted as on-average-tendency or all-else-equal claims; it almost never makes sense to interpret them as logical necessities. Why should this particular case be any different?
Well, they might be just as internally consistent (in some weak, subjective sense). But if this kind of internal consistency or ratification suffices for justification, then there’s no “epistemic luck” involved after all. Both believers might know full well that their own views are self-endorsing.
I was instead thinking that self-ratifying principles were necessary for full justification. On top of that, it may just be a brute epistemic fact which of (say) Occamism and anti-Occamism is really justified. Then two people might have formally similar beliefs, and each sticks to their guns in light of the other’s disagreement (which they view as a product of the other’s epistemic stunting), and yet only one of the two is actually right (justified) to do so. But that’s because only one of the two views was really justifiable in the first place: the actual disagreement may play no (or little) essential role, on this way of looking at things.
For further background, see my discussion of Personal Bias and Peer Disagreement.