Why should we stand by our intuitions and disregard the opinions of more intelligent people?
Because no matter how intelligent the people are, the amount of computation that went into their opinions will be orders of magnitude smaller than the amount of computation that went into our intuitions, as a result of evolutionary processes operating over centuries, millennia, and longer. So if there is a conflict, it’s far more probable that the intelligent people have made some mistake that we haven’t yet spotted.
I am reminded of a saying in programming (usually attributed to Brian Kernighan) that goes something like this: It takes twice as much intelligence to debug a given program as to write it. Therefore, if you write the most complex program you are capable of writing, you are, by definition, not smart enough to debug it.
Because no matter how intelligent the people are, the amount of computation that went into their opinions will be orders of magnitude smaller than the amount of computation that went into our intuitions, as a result of evolutionary processes operating over centuries, millennia, and longer.
This doesn’t make sense to me. The intelligent people are still humans, and can default to their intuition just like we can if they think that using unfiltered intuition would be the most accurate. And, by virtue of being more intelligent, they presumably have better/faster System 2 (deliberate) thinking, so if the particular problem being worked on does end up favoring careful thinking, they would be more accurate. Hence, the intelligent person would be at least as good as you.
Moreover, if the claim “the amount of computation that went into their opinions will be orders of magnitude smaller than the amount of computation that went into our intuitions” actually implied that intuitions were orders of magnitude better, people would never use anything but their intuitions, because their intuitions would always be more accurate. This obviously is not how things work in practice.
I am reminded of a saying in programming (usually attributed to Brian Kernighan) that goes something like this: It takes twice as much intelligence to debug a given program as to write it. Therefore, if you write the most complex program you are capable of writing, you are, by definition, not smart enough to debug it.
Not a good analogy, since the intelligent person would be able to write a program that is at least as good as yours, even if they aren’t able to debug yours. It doesn’t matter if the intelligent person can’t debug your program if they can write a buggy program that works better than your buggy program.
Hence, the intelligent person would be at least as good as you.
Yes, this reminds me of someone I talked to some years back, who insisted that she trusted people’s intuitions about weather more than the forecasts of the weatherman.
It was unhelpful to point out that the weatherman also has intuitions, and would report using those if they really had better results.
In this particular case, I agree with you that the weatherman is far more likely to be right than the person’s intuitions.
However, suppose the weatherman had said that, since it’s going to be sunny tomorrow, it would be a good day to go out and murder people, and had given a logical argument to support that position. Should the woman still go with what the weatherman says, if she can’t find a flaw in his argument?
Well, I wouldn’t expect a weatherman to be an expert on murder, but he is an expert on weather, and due to the interdisciplinary nature of murder-weather-forecasting, I would not expect there to be many people in a better position to predict which days are good for murder.
If the woman is an expert on murder, or if she has conflicting reports from murder experts (e.g. “Only murder on dark and stormy nights”) she might have reason to doubt the weatherman’s claim about sunny days.
The intelligent people are still humans, and can default to their intuition just like we can if they think that using unfiltered intuition would be the most accurate.
But by hypothesis, we are talking about a scenario where the intelligent person is proposing something that violently clashes with an intuition that is supposed to be common to everyone. So we’re not talking about whether the intelligent person has an advantage in all situations, on average; we’re talking about whether the intelligent person has an advantage, on average, in that particular class of situations.
In other words, we’re talking about a situation where something has obviously gone wrong; the question is which is more likely to have gone wrong, the intuitions or the intelligent person. It doesn’t seem to me that your argument addresses that question.
if the claim “the amount of computation that went into their opinions will be orders of magnitude smaller than the amount of computation that went into our intuitions” actually implied that intuitions were orders of magnitude better
That’s not what it implies; or at least, that’s not what I’m arguing it implies. I’m only arguing that if we already know that something has gone wrong, that is, if we have an obvious conflict between the intelligent person and the intuitions built up over the course of human evolution, then it’s more likely that the intelligent person’s arguments contain some mistake.
Also, there seems to be a bit of confusion about how the word “intuition” is being used. I’m not using it, and I don’t think the OP was using it, just to refer to “unexamined beliefs” or something like that. I’m using it to refer specifically to beliefs like “mass murder is wrong”, which have obvious reasonable grounds.
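The conditional claim here, that given an observed conflict the error is more likely on the clever-argument side, can be made concrete with a toy Bayesian sketch. The base rates below are purely illustrative assumptions of mine, not numbers claimed anywhere in the thread:

```python
# Toy sketch of the "conflict" argument in probabilistic terms.
# Both base rates are illustrative assumptions, not claims from the thread.
p_intuition_wrong = 0.01  # an evolved intuition like "mass murder is wrong" fails
p_argument_wrong = 0.30   # a long, clever chain of reasoning hides a flaw

# Simplifying assumptions: the two errors are independent, and a conflict
# is observed exactly when at least one side is wrong.
p_conflict = (p_intuition_wrong + p_argument_wrong
              - p_intuition_wrong * p_argument_wrong)

# Condition on having observed a conflict.
p_intuition_at_fault = p_intuition_wrong / p_conflict
p_argument_at_fault = p_argument_wrong / p_conflict

print(f"P(intuition wrong | conflict) = {p_intuition_at_fault:.2f}")  # ~0.03
print(f"P(argument wrong  | conflict) = {p_argument_at_fault:.2f}")   # ~0.98
```

Even with far less lopsided base rates the same flip happens: conditioning on the conflict changes the question from “who is smarter on average?” to “whose failure mode is more common in this class of situations?”, which is the distinction being drawn above.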
Not a good analogy, since the intelligent person would be able to write a program that is at least as good as yours, even if they aren’t able to debug yours. It doesn’t matter if the intelligent person can’t debug your program if they can write a buggy program that works better than your buggy program.
We’re not talking about the intelligent person being able to debug “your” program; we’re talking about the intelligent person not being able to debug his own program. And if he’s smarter than you, then obviously you can’t either. Also, we’re talking about a case where there is good reason to doubt whether the intelligent person’s program “works better”—it is in conflict with some obvious intuitive principle like “mass murder is wrong”.
Yes, but OTOH the “evolutionary processes operating over centuries, millennia, and longer” took place in environments different from where we live nowadays.
I think more to the point is the question of what functions the evolutionary processes were computing. Those instincts did not evolve to provide insight into truth; they evolved to maximize reproductive fitness. Certainly these aren’t mutually exclusive goals, but to a certain extent, that difference in function is why we have cognitive biases in the first place.
Obviously that’s an oversimplification, but my point is that if we know something has gone wrong, and there’s a conflict between an intelligent person’s conclusions and the intuitions we’ve evolved, the high probability that the flaw is in the intelligent person’s argument depends on whether that instinct in some way produced more babies than its competitors.
This may or may not significantly shift the error probabilities assigned earlier, but I think it’s worth considering.
Well, I wouldn’t expect a weatherman to be an expert on murder, but he is an expert on weather, and due to the interdisciplinary nature of murder-weather-forecasting, I would not expect there to be many people in a better position to predict which days are good for murder.
If the woman is an expert on murder, or if she has conflicting reports from murder experts (e.g. “Only murder on dark and stormy nights”) she might have reason to doubt the weatherman’s claim about sunny days.
You don’t get it. Murder is NOT an abstract variable in the previous comment. It’s a constant.
I thought I understood what I was saying, but I don’t understand what you’re saying. What?