The problem is that, in general, there’s no good way for a layman to tell the difference between Carl Sagan and Immanuel Velikovsky, except by comparing them to other people who claim to be experts in a field. A book of internally consistent lies, such as Chariots of the Gods?, will seem as plausible as any book written about real history to someone who doesn’t already know that it’s a book of lies.
...there’s no good way for a layman to tell the difference between Carl Sagan and Immanuel Velikovsky, except by comparing them to other people who claim to be experts in a field.
That sounds like a promising strategy to me. At least it is far better than what people currently do, which is adopt what their friends think, or ideas they find appealing for other reasons. No doubt it would be better if more people were capable of evaluating scientific theory and evidence themselves, but imagine how much better things would be if people simply asked themselves: “Which is the relevant community of experts? How are opinions on this issue distributed among those experts? How reliable have similar experts been in the past?” For example, chemists are generally less wrong about chemistry than psychologists are about psychology. This would be a step in the right direction.
That’s not quite true. There are ways of evaluating an expert—but people don’t like them, don’t implement them, and don’t try to find out what they are.
Many, many people who have the social status and authority of experts simply don’t know what they’re talking about. They can be detected by an earnest and diligent inquiry, combined with a healthy and balanced skepticism.
Doctors are a prime example.
Unfortunately, many of those ways are equivalent to “become an expert yourself”. :(
But how do you know when you’ve become an expert?
Turtles all the way down!