I’ve also observed that people who come to believe that there are significant differences between the sexes/races/whatevers on average begin to discriminate against all individuals of the disadvantaged sex/race/whatever, even when they were only persuaded by scientific results they believed to be accurate and were reluctant to accept that conclusion. I have watched this happen to smart people more than once. Furthermore, I have never met (or read the writings of) any person who believed in fundamental differences between the whatevers and who was not also to some degree a bigot.
One specific and relatively common version of this is people who believe that women have a lower standard deviation on measures of IQ than men. This belief is not incompatible with believing that any particular woman might be astonishingly intelligent, but these people all seem to have a great deal of trouble applying the latter to any particular woman. There may be exceptions, but I haven’t met them.
The rest of the post was good, but these claims seem far too anecdotal and availability heuristicky to justify blocking yourself out of an entire area of inquiry.
When well-meaning, intelligent people like yourself refuse to examine certain areas of controversy, you consign those discourses to people with less-enlightened social attitudes. When certain beliefs are outlawed, only outlaws will hold those beliefs.
SarahC has raised some alternative ideas about how people may respond to dangerous knowledge.
As for:
Furthermore, I have never met (or read the writings of) any person who believed in fundamental differences between the whatevers and who was not also to some degree a bigot.
Why are you so comfortable with such a hasty generalization? I’m not extremely widely read on the subject of group differences, but I’ve run into some writing on the subject by people who don’t seem to be bigots. See Gender, Nature, and Nurture by Richard Lippa, for instance.
Why would you make a hasty generalization and then shut yourself off to evidence that could disconfirm it?
A consequence of this is that your brain is not a trusted system, which itself has consequences that go much, much deeper than a bunch of misapplied heuristics. (And those are bad enough on their own!)
Your post itself demonstrates this. You are accepting certain empirical and moral beliefs that have not been justified, such as the notion of cognitive equality between groups. Regardless of whether this hypothesis is true or not, it seems to get inordinately privileged for ideological reasons. (In my view, suspended judgment on group differences is a more rational initial attitude.)
Privileging certain hypotheses for mainly ideological reasons is not rationality, even when your ideology is really warm and fuzzy.
If you are comfortable freezing your belief system in certain areas, that’s a strong symptom that your mind got hacked somewhere, and the virus is so bad that it has disabled your own epistemic immune system.
Personally, like simplicio, I’m not comfortable pulling an ostrich maneuver and basing my values on empirical notions that could turn out to be lies. What a great way to destroy my own conviction in my values! I would prefer to investigate these subjects, even at risk of shaking up my values. So far, like SarahC, I haven’t found my values to be shaken up all that much (though maybe I’m biased in that perception).
I think it may be helpful to clearly distinguish between epistemic and instrumental rationality. The idea proposed in this post is actively detrimental to the pursuit of epistemic rationality; I should have acknowledged that more clearly up front.
But if one is more concerned with instrumental rationality (“winning”), then perhaps there is more value here. If you’ve designated a particular goal state as a winning one and then, after playing for a while, unconsciously decided to change which goal state counts as a win, then from the perspective of the you that began the game, you’ve lost.
I do agree that my last example was massively under-justified, especially considering the breadth of the claim.