Suppose I think, after doing my accounts, that I have a large balance at the bank. And suppose you want to find out whether this belief of mine is “wishful thinking.” You can never come to any conclusion by examining my psychological condition. Your only chance of finding out is to sit down and work through the sum yourself. When you have checked my figures, then, and then only, will you know whether I have that balance or not. If you find my arithmetic correct, then no amount of vapouring about my psychological condition can be anything but a waste of time. If you find my arithmetic wrong, then it may be relevant to explain psychologically how I came to be so bad at my arithmetic, and the doctrine of the concealed wish will become relevant — but only after you have yourself done the sum and discovered me to be wrong on purely arithmetical grounds. It is the same with all thinking and all systems of thought. If you try to find out which are tainted by speculating about the wishes of the thinkers, you are merely making a fool of yourself. You must first find out on purely logical grounds which of them do, in fact, break down as arguments. Afterwards, if you like, go on and discover the psychological causes of the error.
C.S. Lewis, “Bulverism”

(Thanks to ciphergoth for the pointer to this quote.)
(It’s not exactly correct: evidence of bias is some evidence against a belief, but not always as strong evidence as it’s assumed to be.)
I’ve actually always found C.S. Lewis to be one of the most fascinating and compelling Christian writers. Obviously I think he makes some very fundamental mistakes, but his approach to Christianity is about as rationalist as you can get. He really emphasizes that if you’re going to believe in something, it had better really be true, not just “worth believing in” or “virtuous” or “helpful”—he himself could have written Belief in Belief. Furthermore, he seems committed to a conception of “faith” that doesn’t involve any conflict with rationality—he thinks that the logical arguments for the existence of God do a lot of work, and he’s fairly sophisticated scientifically (he seems reasonably knowledgeable about evolution, quantum mechanics, etc.). I would actually highly recommend The Screwtape Letters to any rationalists who find religious arguments interesting (if not compelling).
He really emphasizes that if you’re going to believe in something, it had better really be true, not just “worth believing in” or “virtuous” or “helpful”—he himself could have written Belief in Belief.
One gets that impression if one reads Mere Christianity and The Screwtape Letters. But if one reads his works aimed at children, one gets the impression that he wants children to believe despite the evidence. See, for example, the scene in The Silver Chair where the protagonists are trapped underground and the Lady of the Green Kirtle tries to enchant them into thinking that Narnia, Aslan, and the Sun are all things they made up as part of a game. They are almost taken in, until they declare that they will believe in Aslan even if there is no Aslan, because the world they’ve imagined, if it has been imagined, is a better world than the one they live in.

He’s proof that you can develop a quite rational account of human psychology, and then use it to shoot yourself in the foot.
You can never come to any conclusion by examining my psychological condition
On the contrary, in the absence of the time, resources, or inclination to completely retrace a person’s reasoning, psychological factors (such as whether the result is desirable to the person in question) are indeed relevant to the probability that the person made a mistake, since in general
P(made mistake | result is appealing) != P(made mistake | result not appealing)
Why should evidence of bias be some evidence against a belief? This would be like magic: using someone’s failure of rationality to learn something about the world, which is absurd. (Example: Federer’s wife is very confident that he will win, because she is biased in his favor. Does this give me any reason to bet against Federer? Obviously not.)
If you find out that someone believes A then that’s evidence for A, so your beliefs change away from the priors. If you subsequently find that the person is likely biased then your beliefs return some way toward your priors. So finding out about the bias was in some sense evidence about A.
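To make that concrete, here is a toy Bayesian sketch of the two-step update in Python (purely illustrative: the numbers are invented, A stands for the proposition, and B for “the person believes A”):

    def posterior(prior_a, p_b_given_a, p_b_given_not_a):
        # P(A|B) by Bayes' theorem, where B = "the person believes A".
        p_b = p_b_given_a * prior_a + p_b_given_not_a * (1 - prior_a)
        return p_b_given_a * prior_a / p_b

    prior = 0.5  # invented prior probability that A is true

    # Step 1: learn that the person believes A, modelling them as unbiased,
    # i.e. they rarely come to believe A when it is false.
    print(posterior(prior, 0.9, 0.2))  # ~0.82: beliefs move away from the prior

    # Step 2: discover they are biased towards A, so P(B|not-A) was really much
    # higher; redoing the update pulls the posterior back towards the prior.
    print(posterior(prior, 0.9, 0.7))  # ~0.56: most of the first update is undone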
To be precise, knowing that someone is biased towards holding a belief decreases the amount you should update your own beliefs in response to theirs — because it decreases the likelihood ratio of the test.
(That is, having a bias towards a belief means people are more likely to believe it when it isn’t true (more false positives), so a bias-influenced belief is less likely to be true and is therefore weaker evidence. In Bayesian terms, with A the proposition and B the event that the person believes it, bias increases P(B) without increasing P(B|A), so by Bayes’ theorem P(A|B) = P(B|A)P(A)/P(B) decreases.)
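The same point in likelihood-ratio terms, reusing the invented numbers from the sketch above (again purely illustrative):

    def likelihood_ratio(p_b_given_a, p_b_given_not_a):
        # How much more likely the person is to believe A when it is true than
        # when it is false; a ratio near 1 makes the belief nearly worthless
        # as evidence.
        return p_b_given_a / p_b_given_not_a

    print(likelihood_ratio(0.9, 0.2))  # 4.5 : an unbiased believer is strong evidence
    print(likelihood_ratio(0.9, 0.7))  # ~1.3: a biased believer is only weak evidence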
So CarmendeMacedo’s right that you can’t get evidence about the world from knowledge of a person’s biases alone, but you should decrease your confidence if you discover a bias, because it means you used the wrong likelihoods when you updated the first time.