If people listened to intelligent and careful thinkers, they wouldn’t need to understand the issues themselves. Whether this is an easier or harder route is unclear to me.
The problem is that, in general, there’s no good way for a layman to tell the difference between Carl Sagan and Immanuel Velikovsky, except by comparing them to other people who claim to be experts in the field. A book of internally consistent lies, such as Chariots of the Gods?, will seem as plausible as any book written about real history to someone who doesn’t already know that it’s a book of lies.
That sounds like a promising strategy to me. At least it is far better than what people currently do, which is to adopt what their friends think, or whatever ideas they find appealing for other reasons. No doubt it would be better if more people were capable of evaluating scientific theory and evidence themselves, but imagine how much better things would be if people simply asked themselves: “Which is the relevant community of experts? How are opinions on this issue distributed amongst those experts? How reliable have similar experts been in the past?” (Chemists, for example, are generally less wrong about chemistry than psychologists are about psychology.) That would be a step in the right direction.
That’s not quite true. There are ways of evaluating an expert—but people don’t like them, don’t implement them, and don’t try to find out what they are.
Many, many people who have the social status and authority of experts simply don’t know what they’re talking about. They can be detected by an earnest and diligent inquiry, combined with a healthy and balanced skepticism.
Doctors are a prime example.
Unfortunately, many of those ways are equivalent to “become an expert yourself”. :(
But how do you know when you’ve become an expert?
Turtles all the way down!
Indeed. Now that I think about it, perhaps the real problem here is that the marginal social status payoff from an increase in IQ is too low (perhaps even negative in some cases); in other words, IQ doesn’t buy one enough status. So the question is whether it is easier to fix this than just to raise the IQ baseline.
How does increasing “the marginal social status payoff from an increase in IQ” help? I’m not saying it would hurt, but it seems less direct and less important than increasing the marginal social status payoff from having and acting on unbiased beliefs about the world, since that is something people can change fairly easily.
The implication may be that persons with high IQ are often prevented from putting it to meaningful use by the way societies are structured, which is a statement I agree with.
Do you mean that organizations aren’t very good at selecting the best person for each job? I agree with that statement, but it’s about much, much more than IQ. It is a tough nut to crack, but I have given some thought to how we could improve honest signaling of people’s skills.
Actually, no. What I mean is that human society isn’t very good at realizing that it would be in its best interest to assign as many high-IQ persons as possible the job of “being themselves” full-time and freely developing their ideas—without having to justify their short-term benefit.
Hell, forget “as many as possible”, we don’t even have a Bell Labs any more.
This, I think, is a special case of what I meant. A simple, crude way to put the general point is that people don’t defer enough to those who are smarter. If they did, smart folks would be held in higher esteem by society and would consequently have greater autonomy.
How should society implement this? I repeat my claim that other personal characteristics are as important as IQ.
I do not know of a working society-wide solution. Establishing research institutes in the tradition of Bell Labs would be a good start, though.
That may well be right. I’m willing to accept that the distinction between “I.Q.” and other measures of “smartness” is orthogonal to the point I was making.