I agree with your point about the absurdity heuristic.
However, I’m not sure that the following three points are relevant to the case of theism:
If a large number of intelligent people believe something, it deserves your attention. After you’ve studied it on its own terms, then you have a right to reject it. You could still be wrong, though.
Even if you can think of a good reason why people might be biased towards the silly idea, thus explaining it away, your good reason may still be false.
If someone cannot explain why something is not stupid to you over twenty minutes at a cafe, that doesn’t mean it’s stupid. It just means it’s complicated, or they’re not very good at explaining things.
What if we have strong evidence that the people who hold the seemingly absurd belief all have similar biases? Moreover, what if a very large fraction of these people admit that they’re biased, and are even proud of it?
That’s exactly what the situation is with regard to theism, of course. Most theists admit that their religious beliefs are based only, or mostly, on faith. Some state it outright, others hide it behind circumlocutions and nebulous metaphors, and yet others need to be pushed a bit before they’ll admit it, but the result is the same.
Does it still matter, then, that many of these people are intelligent, or that some of these religious beliefs may be very complex, or that I haven’t studied some of them with great attention?
In general, I think taking Yvain’s advice is probably better than not.
If the group of (otherwise) intelligent people is a group of anosognosiacs who tell you they aren’t disabled, then their common bias is probably sufficient to dismiss them. In most cases, however, we don’t have the line-by-line code that produces the bad answers.
If you can’t look under the hood and say “The car doesn’t work because part x is doing y instead of z”, but can only say “I’m not sure exactly how this thing is supposed to work, but something is leaking”, then you can’t be sure it isn’t going to do something right, even if its performance is suboptimal. There could be a hint of rationality buried in the muck, and in that case, they might manage to get it right.
One example of this is the “magical collapse” deal in QM. It may sound absurd (because it is), but the physicists talking about it aren’t stupid, and will still manage to give correct predictions. If you dismissed it as “entirely irrational” before looking at the evidence, you’d be stuck with classical physics.
If you then consider that people tend to exaggerate other people’s biases (it’s always the other guy who is biased, right?), the case for at least putting some thought into it gets stronger.
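To make the tradeoff concrete, here is a minimal Bayesian sketch (in Python, with made-up numbers purely for illustration): if the believers’ admitted bias explains their belief about as well as its truth would, the testimony of many intelligent believers carries almost no weight; but if you’re less than certain the bias is the whole story, some weight remains.

```python
# Toy Bayesian update: how much evidence is "many intelligent people believe X"?
# All numbers are invented for illustration only.

def posterior(p_given_true, p_given_false, prior=0.5):
    """Bayes' rule for a binary hypothesis X, given evidence E."""
    num = p_given_true * prior
    return num / (num + p_given_false * (1 - prior))

p_e_given_true = 0.9             # P(many smart believers | X true)
p_e_given_false_unbiased = 0.05  # P(many smart believers | X false, no shared bias)
p_e_given_false_biased = 0.85    # P(many smart believers | X false, shared bias e.g. faith)

# No known bias: the belief of many intelligent people is strong evidence.
print(posterior(p_e_given_true, p_e_given_false_unbiased))   # ~0.95

# A bias that fully explains the belief screens off nearly all of that evidence.
print(posterior(p_e_given_true, p_e_given_false_biased))     # ~0.51

# But if you're only, say, 70% sure the bias is the real explanation
# (people exaggerate other people's biases), some weight remains.
p_bias = 0.7
p_e_given_false_mixed = (p_bias * p_e_given_false_biased
                         + (1 - p_bias) * p_e_given_false_unbiased)
print(posterior(p_e_given_true, p_e_given_false_mixed))      # ~0.60
```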
One example of this is the “magical collapse” deal in QM. It may sound absurd (because it is)
Which alternative seems more absurd to you can depend a lot on what else you know or think you know. Many worlds seemed far more absurd to me than collapse until someone properly explained it to me.
Another point is that you won’t actually encounter all that many obviously false beliefs widely held by intelligent people. Taking the time to check out the ones you do encounter shouldn’t be too onerous.