Unpopular ideas attract poor advocates: Be charitable
Unfamiliar or unpopular ideas will tend to reach you via proponents who:
...hold extreme interpretations of these ideas.
...have unpleasant social characteristics.
...generally come across as cranks.
The basic idea: It’s unpleasant to promote ideas that result in social sanction, and frustrating when your ideas are met with indifference. Both situations are more likely when talking to an ideological out-group. Given a range of positions on an in-group belief, who will decide to promote the belief to outsiders? On average, it will be those who believe the benefits of the idea are large relative to in-group opinion (extremists), those who view the social costs as small (disagreeable people), and those who are dispositionally drawn to promoting weird ideas (cranks).
I don’t want to push this pattern too far. This isn’t a refutation of any particular idea. There are reasonable people in the world, and some of them even express their opinions in public (in spite of being reasonable). And sometimes the truth will be unavoidably unfamiliar and unpopular, etc. But there are also...
Some benefits that stem from recognizing these selection effects:
It’s easier to be charitable to controversial ideas when you recognize that you’re interacting with people who are terribly suited to persuade you. I’m not sure “steelmanning” (trying to present the best argument for an opponent’s position) is the best approach here. Given the extremity effect, another technique is to construct a much-diluted version of the belief, and then try to steelman that diluted version.
If your group holds fringe or unpopular ideas, you can avoid these patterns when you want to influence outsiders.
If you want to learn about an issue afflicted by these effects, you might ignore its public representatives and speak to non-evangelical believers instead (you’ll probably have to start the conversation).
You can resist certain polarizing situations, in which the most visible camps hold extreme and opposing views. This situation worsens when those with non-extreme views judge the risk of participation as excessive, and leave the debate to the extremists (who are willing to take substantial risks for their beliefs). This leads to the perception that the current camps represent the only valid positions, which creates a polarizing loop. Because this is a sort of coordination failure among non-extremists, knowing to covertly look for other non-vocal moderates is a first step toward a solution. (Note: Sometimes there really aren’t any moderates.)
Related to the previous point: You can avoid exaggerating the ideological unity of a group based on the group’s leadership, or believing that the entire group has some obnoxious trait present in the leadership. (Note: In things like elections and war, the views of the leadership are what you care about. But you still don’t want to be confused about other group members.)
I think the first benefit listed is the most useful.
To sum up: An unpopular idea will tend to get poor representation for social reasons, which makes it seem like a worse idea than it really is, even granting that many unpopular ideas are unpopular for good reason. So when you encounter an idea that seems unpopular, you’re probably hearing about it from a sub-optimal source, and you should try to be charitable towards the idea before dismissing it.