Though I do tend to be contrarian, I’ve always thought that acting independently from others is the correct stance. Does everyone agree that being contrarian and being conformist are both forms of bias to be avoided? I think that at best they can be seen as very weak/indirect reasons to believe or do something, and only relative to your context. (You need to pick your battles as a contrarian, and you need to break from conforming with the wrong people as a conformist.)
This is an interesting question. I definitely agree that being a contrarian and being a conformist can both be forms of bias. However, I would add one example which suggests that conformity can in some cases be a positive instinct.
I have never studied general relativity in depth. My belief that “general relativity is right” is based on the heuristics “most scientists believe in general relativity” and “things that most scientists believe are usually right.” In part I think it’s also based on knowing that evidence and arguments are available which everyone claims are very strong.
To show that most of my belief in general relativity comes from popularity-based heuristics, consider the following scenario. Somebody proposes a unified field theory (UFT-1). They claim that evidence and arguments are available which would convince me that the theory is right. Furthermore, they are the only person who believes in UFT-1. To eliminate further confounding variables, let us suppose that UFT-1 has existed for 35 years and has been examined in detail by 200 qualified physicists.
The main difference between general relativity and UFT-1, from my perspective, having never examined the arguments for either, is that most scientists believe in general relativity, and most scientists do not believe in UFT-1. Yet, I believe that general relativity is almost definitely right, I believe that UFT-1 is almost definitely wrong, and I believe that these are rational judgments.
Furthermore, these rational judgments are based almost entirely on a popularity-based heuristic: that is, the heuristic that popular beliefs are more likely to be true. To review, from the information I have, the main difference between general relativity and UFT-1 is that a lot of people believe in general relativity, and few people believe in UFT-1. Otherwise they are quite similar: both of them have been around for a while, both of them have received significant exposure, and both of them claim to have sound arguments in their favor. (The differences between these arguments cannot enter into my evaluation of the two theories, because I have not examined the arguments for either.)
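To make this concrete, here is a toy Bayesian sketch of the comparison. All the numbers are invented for illustration: assume each qualified reviewer independently endorses a true theory with probability 0.9 and a false one with probability 0.1. Then the endorsement count alone carries an enormous likelihood ratio:

```python
# Toy model (all numbers invented): each qualified reviewer independently
# endorses a true theory with probability 0.9 and a false one with
# probability 0.1. The endorsement count then acts as Bayesian evidence.
from math import log

def posterior_log_odds(prior_log_odds, endorsements, reviewers,
                       p_if_true=0.9, p_if_false=0.1):
    """Bayesian update on how many reviewers endorsed the theory."""
    llr = (endorsements * log(p_if_true / p_if_false)
           + (reviewers - endorsements) * log((1 - p_if_true) / (1 - p_if_false)))
    return prior_log_odds + llr

# UFT-1: examined by 200 physicists, endorsed by essentially one person.
print(posterior_log_odds(0.0, 1, 200))    # about -435: overwhelmingly against
# General relativity: near-universal endorsement among those who examined it.
print(posterior_log_odds(0.0, 199, 200))  # about +435: overwhelmingly for
```

The point isn’t the particular numbers, only that “how many informed people endorse it” is legitimate Bayesian evidence even for someone who never examines the object-level arguments. (The independence assumption overstates the strength of the evidence, since experts’ opinions are heavily correlated, but the direction of the update survives.)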
This example suggests that popularity-based heuristics, telling us that popular beliefs are more likely to be true, rightly have a place in rational people’s judgments.
This makes sense. The amount of thinking that the human race as a whole has done vastly exceeds the amount of thinking that I will ever do. It would make sense for me to rely on this vast repository of intelligence in choosing my own beliefs. This is related to the idea of “the wisdom of crowds.”
On the other hand, popularity-based heuristics often lead us to the wrong answer. Religion is an obvious example. So we have to be careful in applying them. I’m not sure what general principles would result in our popularity-based heuristics excluding religious beliefs, but including popular scientific theories which we have not evaluated for ourselves. What do you guys think?
The strength of others’ beliefs as evidence depends on what you know about how they arrived at those beliefs. If you know that scientists have a general process for establishing accepted truths which involves repeated testing with attempts to falsify their hypotheses and find alternative explanations, then you can take established consensus as evidence proportional to your trust in that process. Likewise, if you know that people tend to come to religious consensuses due to early indoctrination and community reinforcement, you should take religious consensuses as evidence proportional to your confidence that those processes will tend to produce true beliefs.
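One way to make “evidence proportional to your trust in that process” precise: the weight of a consensus is the likelihood ratio P(consensus | claim true) / P(consensus | claim false), and that ratio is a property of the consensus-generating process. A minimal sketch, with invented probabilities:

```python
# Sketch with invented numbers: how much a consensus should move you
# depends on how often the generating process converges on false claims.
from math import log10

def evidence_db(p_consensus_if_true, p_consensus_if_false):
    """Weight of a consensus in decibels of evidence
    (10 * log10 of the likelihood ratio)."""
    return 10 * log10(p_consensus_if_true / p_consensus_if_false)

# Truth-tracking process (repeated testing, attempted falsification):
# rarely settles on false claims.
print(evidence_db(0.8, 0.01))  # ~19 dB: consensus is strong evidence
# Indoctrination + community reinforcement: converges regardless of truth.
print(evidence_db(0.8, 0.75))  # ~0.3 dB: consensus is nearly no evidence
```

On this picture, religious and scientific consensuses aren’t treated by different rules; the same likelihood-ratio rule simply assigns them very different weights.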
That’s how scientific beliefs become consensus too. It comes down to how the doctrine was originally chosen and to the criteria by which the culture rewards one-upmanship attempts. I.e., you can put more trust in science-based indoctrination because you believe that if the powers that be were indoctrinating you with beliefs that could be contradicted via scientific rituals, another power would have an excuse to ridicule them.
The beliefs of other people are evidence of a sort. In some cases (e.g. scientific consensus), a belief being widely held is a very strong signal of correctness. In other cases (e.g. religion), less so.
Of course, our social instinct to conform does not take into account the reliability of the beliefs of the group one is part of, although it does take into account whether you identify yourself as part of that group, which gives one some control (only identify with groups that have a good track record of correctness).
I’d be hesitant to classify being either contrarian or conformist as an example of bias per se. For something to be a bias, it must influence one’s beliefs in a way that is not rationally justified. Being contrarian regarding, e.g., your parents’ religious beliefs (and the beliefs stemming from them) is probably rational; conforming to the beliefs of people with more experience than you in a field that strongly rewards success and punishes failure (e.g. stock trading) is, again, probably rational.
Of course, being conformist can bring great gains in instrumental rationality. A large proportion of the beliefs people hold do not change how they lead their lives in any significant way, but they do carry a large signalling value: that one is part of a group, and not some insane, socially inept geek who believes in crazy things such as the singularity. Fortunately, it is possible to get almost all of the benefits of actual conformity by simply pretending to conform; normally one does not even need to lie, as just holding one’s tongue is often enough. The only advantage I can see to actually conforming is that it may make it easier to empathize with and predict the behaviour of others in that group, but I don’t think that this is normally much of an advantage.