Unpopular ideas attract poor advocates: Be charitable
Unfamiliar or unpopular ideas will tend to reach you via proponents who:
...hold extreme interpretations of these ideas.
...have unpleasant social characteristics.
...generally come across as cranks.
The basic idea: It’s unpleasant to promote ideas that result in social sanction, and frustrating when your ideas are met with indifference. Both situations are more likely when talking to an ideological out-group. Given a range of positions on an in-group belief, who will decide to promote the belief to outsiders? On average, it will be those who believe the benefits of the idea are large relative to in-group opinion (extremists), those who view the social costs as small (disagreeable people), and those who are dispositionally drawn to promoting weird ideas (cranks).
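The selection logic above can be sketched as a toy simulation (all distributions, parameters, and the `simulate` function here are made-up assumptions for illustration, not anything from the post): each in-group member has an "extremity" (perceived benefit of the idea) and a "cost sensitivity" (how much social sanction deters them), and only those whose perceived benefit exceeds their perceived cost speak up.

```python
import random

random.seed(0)

def simulate(n=100_000, sanction_cost=1.0):
    """Toy model of the selection effect: agents advocate publicly only
    when their perceived benefit outweighs their sensitivity to sanction."""
    population, advocates = [], []
    for _ in range(n):
        extremity = random.gauss(0, 1)          # perceived benefit of the idea
        cost_sensitivity = random.gauss(1, 0.5) # aversion to social sanction
        population.append(extremity)
        if extremity > cost_sensitivity * sanction_cost:
            advocates.append(extremity)
    mean = lambda xs: sum(xs) / len(xs)
    return mean(population), mean(advocates)

pop_mean, adv_mean = simulate()
print(pop_mean, adv_mean)
```

Under these assumptions the self-selected advocates are far more extreme, on average, than the group they come from, even though nothing about the idea itself changed.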
I don’t want to push this pattern too far. This isn’t a refutation of any particular idea. There are reasonable people in the world, and some of them even express their opinions in public (in spite of being reasonable). And sometimes the truth will be unavoidably unfamiliar and unpopular, etc. But there are also...
Some benefits that stem from recognizing these selection effects:
It’s easier to be charitable to controversial ideas when you recognize that you’re interacting with people who are terribly suited to persuade you. I’m not sure “steelmanning” (trying to present the best argument for an opponent’s position) is the best technique here. Based on the extremity effect, another approach is to construct a much-diluted version of the belief, and then try to steelman the diluted belief.
If your group holds fringe or unpopular ideas, you can avoid these patterns when you want to influence outsiders.
If you want to learn about an afflicted issue, you might ignore the public representatives and speak to non-evangelical believers instead (you’ll probably have to start the conversation).
You can resist certain polarizing situations, in which the most visible camps hold extreme and opposing views. This situation worsens when those with non-extreme views judge the risk of participation as excessive, and leave the debate to the extremists (who are willing to take substantial risks for their beliefs). This leads to the perception that the current camps represent the only valid positions, which creates a polarizing loop. Because this is a sort of coordination failure among non-extremists, knowing to covertly look for other non-vocal moderates is a first step toward a solution. (Note: Sometimes there really aren’t any moderates.)
Related to the previous point: You can avoid exaggerating the ideological unity of a group based on the group’s leadership, or believing that the entire group has some obnoxious trait present in the leadership. (Note: In things like elections and war, the views of the leadership are what you care about. But you still don’t want to be confused about other group members.)
I think the first benefit listed is the most useful.
To sum up: An unpopular idea will tend to get poor representation for social reasons, which makes it seem like a worse idea than it really is, even granting that many unpopular ideas are unpopular for good reason. So when you encounter an idea that seems unpopular, you’re probably hearing about it from a sub-optimal source, and you should try to be charitable towards the idea before dismissing it.
Your analysis has implications not only for individuals exposed to unpopular ideas, but also for movements promoting such ideas. These movements (e.g., effective altruism) should be particularly worried about their ideas being represented inadequately by their most radical, disagreeable, or crazy members, and should spend their resources accordingly (e.g., by prioritizing outreach activities, favoring more mainstream leaders, handling media requests strategically, etc.).
Reminds me of my youth, when I was a big fan of Esperanto. There was one mentally not-completely-balanced man in our country who was totally obsessed about this great idea, and kept sending letters to all media, over and over again. Of course he achieved nothing; his letters didn’t even make sense. The only real effect was that when we tried to promote something we did in the media, most people after hearing the word immediately remembered this guy and refused to even talk with us.
So yeah, a stupid ally is sometimes worse than an enemy.
As an example, MIRI seems to have effectively taken this route.
The converse claim, that “popular ideas attract disproportionately good advocates,” also seems worth attention. People accept sloppy thinking a lot more readily if they agree with the conclusion. This can be used as a dark art: you present a sloppy argument for an obvious truth or an uplifting conclusion, and then proceed to use the same technique to support the payload. The target is less likely to successfully deploy resistance.
Also, quite often a result that was produced in a rigorous way is rederived in a sloppy way by those who are merely told about the result.
That explains theology.
In reality there is both sophisticated theology and a scary weirdo on the corner with the “REPENT THE END IS NEAR” sign.
My experience is the opposite. The worst advocates tend toward the popular ideas.
After all, they became the worst advocates by a complete inability to think straight. So they tend to pick their ideas to champion by popularity.
Also, the majority is the audience for everyone, just because it is big. People defending the mainstream grandstand to the mainstream, while people with rare views try to recruit from the mainstream. People with rare views know what the counterarguments are going to be; they pass the ideological Turing test.
Please give a few concrete examples.
I am not the original poster, but most people advocating for the idea that races have genetic differences in IQ are racists, because non-racists don’t dare say so in most contexts.
This may also apply in Europe to people opposing immigration—only racists would dare say so because they’re so marginalized that they don’t take additional hits from it.
Taboo “racist”.
Do you believe it’s unclear from context what I meant?
No, actually I don’t. In fact I still don’t understand what you meant.
Well, actually most people advocating for the idea that races have genetic differences in IQ are racists because that idea falls under the standard definition of racism.
It falls under a definition of racism, but another definition is “hatred or intolerance of another race or other races.”
Let’s ask the hive mind :-) Google, what is the definition of racism? Google says:
Genetic IQ differences clearly qualify.
Note that this is a subject so fraught with subjectivity that Wiktionary had to include half a page of usage notes. I don’t think arguing semantics is going to get anyone very far.
Agreed, I would argue that at this point the word “racism” has no coherent meaning, whether it ever had a coherent meaning is open to debate.
As with many other words — such as “liberal” and “set” — it has rather a lot of meanings and if you are either ① unsure of which one someone means, or ② think you know which one someone means but that meaning makes their sentence ridiculously false, then you are better off asking for clarification than guessing.
The problem is not that “racism” has no coherent meaning. No word carries inherent meaning; and many words quite safely carry multiple or ambiguous meanings without causing problems, because hearers don’t panic and throw elementary principles of decent communication out the window when they hear them.
When someone says “set” and a hearer isn’t sure whether they mean “set” in the Zermelo-Fraenkel sense or the game sense, the hearer typically asks.
But when someone says “racism”, many hearers are likely to react incredibly poorly, even exhibiting the physiological responses of a person who is threatened or becoming enraged.
We might better ask, “Why do they respond so badly to this particular word?” I suspect the answer has a lot to do with fear of being accused of something vile. And I suggest that the poor rationality practice is at least as much on the part of hearers who let this reaction run away with them instead of finding out what is meant, as on the part of speakers who use the word without further explanation.
I thought the definition that someone got from Google elsewhere in the thread was fine. The only thing that definition leaves out is what people believe about the claim that “racism” labels. Some believe that it is true and some believe that it is false, the strength of their belief either way varying in proportion to their desire to exclude from discussion the question, “is this true or false?”
Generally, they are being accused of a belief that their accuser thinks is vile, so vile that the very question of whether it is true is also vile, so vile that it must never be discussed, and it is quite clear without further explanation that that is what is meant.
I think “mindkill” is a better term here.
Genetic IQ differences clearly qualify as something that ALL members of EACH race possess that is SPECIFIC to that race?
That definition is really quite strong. Not even a belief that all black people suffer from some degree of mental retardation would satisfy it. The belief that there are genes correlated with lower IQs that are more prevalent among black people certainly would not.
Anyone want to explain what they found wrong with my comment?
Yeah, this is definitely false. Reality is ‘racist’; I refuse to fall under such a negative category. Most people in the sensible camp of individuals who can respect individual genetic differences would also seek to abolish those differences and give everyone a fair chance, if we can somehow engineer a superior outcome.
This in-group/out-group stuff gets very tiresome when it is used in a sloppy fashion. It does not correct for the percentage of people who “don’t give a fuck”. There’s a difference between LessWrong language use in a decent form and mere abuse.
What do you mean by “abolish it”? Do you mean replace all people with identical clones so that no one is smarter (or stronger or has more willpower) than anyone else? Or are individual differences only a problem as long as they correlate with race?
What are you talking about? I meant that once you accept it, we can do something about it. There’s no reason to be destructive just because we can recognize reality. Please stop linking to articles; everyone has developed this poor habit. I already accept most of the conclusions you believe.
I was trying to get across that you can be sanguine while acknowledging the reality that exists and looking for ways around it. Such as countering dysgenics, etc.
Calling people who recognize racial correlations with intelligence racist is an incorrect appropriation of the term, and stretches ‘racist’ to mean something it shouldn’t; it’s a weird, trivial sort of technical correctness that is mostly irrelevant. I also think Jiro makes a wildly incorrect claim that most people “wouldn’t dare say so in some contexts”. Everyone is abusing this in-group/out-group idea; it’s a defective tool in this example. There’s no reason to have a huge discussion about “Unpopular ideas attract poor advocates”. The original post stands on largely nothing, and there’s no reason for everyone to accept it on a whim and be applying it everywhere.
It’s like there’s some sophisticated markov generator that makes you speak less-wrongesque that aims to maximize insular language while being devoid of content.
I think a lot of people will disagree.
So, try declaring in a mainstream public forum that races have significantly different gene-based IQ (I recommend a disposable nym for that). Listen to the names you will be called, see how many commenters will be inclined to exhibit the “weird trivial sort of technical correctness”...
I have done this. People are unskilled at execution. It’s not simple and it takes a bit of care: you have to display empathy, show that you are uncomfortable with the conclusions, that it isn’t something you are happy or want to believe, and that if anyone is ever going to provide a solution that gives everyone a better chance, then we will not get there by making it a crime to think this and organize around it. They just want assurance that you’re not the person they read about in the history books.
Seems to me they would want much more, starting with your head on a stick.
The problem is that many of the people “they read about in the history books” did indeed have accurate views on race. Which means the only way to reassure them that you’re not that person is to either lie to them about your beliefs or have inaccurate beliefs.
Feminism has Jezebel, xoJane, much of Tumblr, and that one girl on your Facebook feed.
I was mostly reflecting on a pattern in the people I’ve met, so most of my examples won’t be persuasive.
Musing on some less personal examples:
Religious missionaries are selected for atypical faith and a resistance to “leave me alone” social cues. For many people, talking to a more moderate believer would lead to a greater shift in opinion. (Not that the goal of missionary activity is to convince the average person.)
People who explicitly advocate for utilitarianism tend to care enough about the system to “bite the bullet” on certain issues, scaring away newcomers. (Peter Singer on bestiality and infanticide, Eliezer on dust specks and torture). People who are vaguely utilitarian but not too concerned about consistency can almost certainly do a better job convincing a non-utilitarian that they should be a bit more utilitarian.
Kant had some actually useful ethical insights, but said some downright stupid things in the application of his ideas (like: You shouldn’t lie to a murderer who comes knocking at your door looking for a victim, but you should “be silent” or something).
If you’re a progressive with a progressive social circle and want to learn about critiques of progressivism or about conservative thought in general, neo-reaction is about the worst starting place ever. It’s like a conservative in the deep south trying to learn about the political left by reading Marx and browsing Tumblr.
The “highest quality” non-commercial strength training materials (meaning they looked really shiny and had the most investment sunk into them) are often extreme and “purity” minded, and were produced by people who trigger the “crank” and “how did THIS become a central feature of your identity?” red flags.
Health innovations in general (some of which I’ve adopted, albeit in a less extreme form) tend to spread fastest via apocalyptic messengers (“fructose is literally poison”), new-age people, or the self-experiment crowd (who don’t upset me, but would strike many people as cranks).
You seem to be conflating two things: people who give logically bad arguments for their positions, and people who say things that trigger listeners’ absurdity heuristic.
The more radical positions tend to be more logically coherent, hence easier to logically defend. On the other hand they’re also more likely to trigger people’s absurdity heuristics.
More moderate positions are harder to defend since you wind up jumping through hoops to explain why your logic doesn’t apply to certain cases. This means that in practice the more moderate position functions as a Trojan Horse for the more radical position.
Part of the issue is that what people perceive as “crank” is heavily influenced by what’s popular.
When I was first learning Kant I also thought this was stupid. But now, after thinking about it a lot more, I can see how it makes sense from a game-theoretic point of view. If you model the murderer as a rational agent with a different utility function, lying isn’t a Nash equilibrium, but being silent is.
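The comment’s claim can be checked in a toy two-player game (the payoff numbers below are illustrative assumptions, not anything derived from Kant): the agent chooses to lie or stay silent, the murderer chooses to trust or distrust what he hears, and a profile is a pure Nash equilibrium if neither player gains by unilaterally deviating.

```python
from itertools import product

# (agent_payoff, murderer_payoff) per pure-strategy profile; the numbers
# are made-up assumptions chosen to reflect the story.
PAYOFFS = {
    ("lie", "trust"):       (1, -1),  # lie believed: the victim escapes
    ("lie", "distrust"):    (-1, 1),  # lie ignored: the murderer searches anyway
    ("silent", "trust"):    (0, 0),   # silence conveys nothing
    ("silent", "distrust"): (0, 0),
}

AGENT_MOVES = ("lie", "silent")
MURDERER_MOVES = ("trust", "distrust")

def is_nash(a, m):
    """True if neither player can gain by unilaterally switching."""
    a_pay, m_pay = PAYOFFS[(a, m)]
    if any(PAYOFFS[(a2, m)][0] > a_pay for a2 in AGENT_MOVES):
        return False
    if any(PAYOFFS[(a, m2)][1] > m_pay for m2 in MURDERER_MOVES):
        return False
    return True

equilibria = [p for p in product(AGENT_MOVES, MURDERER_MOVES) if is_nash(*p)]
print(equilibria)  # only ("silent", "distrust") survives
```

Under these assumed payoffs, every profile involving lying unravels (either the murderer stops trusting, or the agent prefers silence), and the only stable outcome is silence met with distrust, which matches the comment’s reading of Kant.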
Thanks! Those are good examples, actually. Probably worth adding to the OP.
Are you one of those awful bestialityphobes or something? :)
NRx got Mike Anissimov, Nick Land, and so on. Even Moldbug wasn’t a terribly polished writer.
I know Moldbug is a notorious windbag, but from the little I’ve read of Anissimov’s writings he seems clear and engaging.
The category was “poor advocates of NRx” not “poor writers of NRx.”
I imagine Anissimov is engaging when you already subscribe to his memeplex, but he’s terrible at engaging with others—by design! In his view, NRx is weakened as its popular support increases.
Side note: I did read the linked piece, and thought it was quite good, even though I think that neoreaction is fundamentally misguided and potentially disastrous if it ever becomes dominant.
Point taken about the Writing Abilities vs. PR Abilities distinction, which I was steamrolling over.
However, while I don’t have much information about Anissimov’s public relations abilities (actually, I’m just seeing him pop up on twitter—whatever he’s doing with those screenshots of 4chan and jokes(?) about using Ebola as a biological weapon [edit: my comprehension fail, see below], it’s probably not a brilliant PR campaign. But I digress), the two links you provided definitely seem to be about projecting a unified public front for the movement and disassociating it from people who confirm generally inaccurate negative stereotypes about it. Suggesting that he’s opposed to more people supporting neoreaction in general on that basis seems disingenuous. Nobody wants an eternal September or to look like morons in the mainstream press, but that doesn’t mean that they’re actively sabotaging their public image or trying not to recruit.
Examples? Of Anissimov recruiting?
Sorry, just to be clear, I wasn’t necessarily disputing your original point, as I don’t really know that much about Anissimov. I was just pointing out that the links you provided weren’t supporting your extraordinary claim that he actively decries the practice of people joining his movement.
What you’ve written here is not what I claimed three comments ago.
I normally wouldn’t care if random person X on the internet thinks I’m wrong about Anissimov, but I’m really tired of people gaslighting me on this. So here is your “extraordinary” evidence that Anissimov believes his movement is weakened by popular support.
To recap, I relayed two separate essays of his in which he holds this value. Emphasis added everywhere by me.
1) “Boundaries”:
We can quibble about what these “minimum standards” are, but evidently an upper bound on the amount of disagreement possible is given by the whole Justine Tunney incident.
2) “The Kind of People Who Should Be Nowhere Near Neoreaction”
Remember my claim earlier was:
You’ve claimed that he’s only concerned about NRx’s public reputation. To the contrary, he says quite clearly:
3) “Social Conservatism and Drawing a Line in the Sand”
Here we have a more specific version of “NRx’ers must believe at least this much, or else they cannot be called NRx’ers”:
I feel this should satisfy any reasonable evidential standards to conclude the claim I actually made. Feel free to disagree with me substantially after actually reading Anissimov for yourself.
Just because I’m setting boundary conditions does not mean I am generally discouraging people from involvement, that doesn’t follow. However, it’s true that there’s an optimal recruitment rate which is substantially less than the maximum possible recruitment rate. Recruitment rates can be deliberately throttled back by introducing noise, posting more rarely, and through other methods.
NRx would be maximally strengthened if it could recruit as many people as possible while maintaining quality, but realistically we can only add a given number of people per week without compromising quality.
I’m fantastic at engaging others by design, I openly offer to publicly debate people, only Noah Smith has taken me up on it so far.
Re: ebola, I’ve never joked about using it as a biological weapon, I’m just responding to the funny meming that’s going on on 4chan about ebola.
I’m tapping out now, sorry.
You should have heard of religious or other extremists.
People with unpleasant characteristics are probably the most common, but then you mostly encounter them holding less extreme positions.
Cranks can be found via http://www.crank.net/science.html
Trotskyism is represented in the US by the Spartacist League. The members do not seem well adjusted socially.
Interesting post!
I have a feeling like there is a deep connection between this and the evaporative cooling effect (more moderate members of a group are more likely to leave when a group’s opinion gets too extreme, thereby increasing the ratio of extremists and making the group even more extreme). Like there ought to be a social theory that explains both effects. I can’t quite put my finger on it, though. Any ideas?
To address the “meta” of this...
Is this position, of being charitable to unpopular ideas, itself an unpopular idea? I’m not even sure how you would measure the “popularity” of an idea objectively in a world where OCR scripts and bots can hack SEO and push social media memes without a single human interaction...
And using Western psychological models regarding the analysis of such things is certainly bound to be rife with the feedback of cultural bias...
And is my response an unpopular idea?
Because of these issues, I find the reasoning demonstrated here to be circular. This premise requires a more rigorous definition of the term “popularity” which, as far as I can tell, cannot be done objectively since the concept is extremely context sensitive.
I think what the idea in the post does is that it gets at the curvature of the space, so to speak.
On one hand, unpopular ideas are disproportionately likely to be advocated by disagreeable people. On the other hand, those who hold unpopular positions often have to defend their views and be familiar with the opinions of their opponents, while the proponents of popular views may not be familiar with the arguments for unpopular views. For example, outspoken atheists are likely to be more disagreeable, but they’re also more likely to be familiar with religious people’s arguments than the typical religious person is familiar with arguments for atheism.
Atheism is your example of an unpopular idea..?
Not at all unpopular on LW. Very unpopular in some other contexts, e.g., US society at large.
Interesting post!
Another reason to be charitable: these “poor advocates”, by virtue of being marginalized/unpopular/cranks may have fewer disincentives to say “the emperor has no clothes”, because their standing is already low. Once they put an idea out there, it may gain traction with a greater chunk of the population. Unfortunately, this dynamic leads to “autism is caused by vaccines” movements too.
If you’re interested in the topic I highly recommend this BloggingHeads episode: http://bloggingheads.tv/videos/30467 specifically the “emperor has no clothes” and “tokenism” sections (there are links to those segments under the video).
Great. I added this to my list of life-lessons.
Also: This is related to Dangers of steelmanning / principle of charity
Interesting and useful post!
But on your last bullet, you seem to be conflating ‘leadership’ with ‘people presenting the idea’. I’m not sure they are always the same thing: the ‘leaders’ of any group are quite often going to be there because they’re good at forging consensus and/or because they have general social/personal skills that stop them appearing like cranks.
Take a fringe political party: I would guess that people promoting that party down the pub or in online comments on newspaper websites or whatever are more likely to be the sort of advocate you describe. But in all but the smallest fringe parties, you’d expect the actual leadership to have rather more political skill.
I think this is a super post!
That depends very much on the company you keep. If you are friends with a bunch of people who like to argue contrarian positions, then it’s not true.
I think this post ought to be in Main, actually.
What is up with main?
The recent posts to main right now are: panhandling, awkward paeans to rationality, and impressive technical posts that I don’t understand. (And a meet-up post).
It’s been that way for a while. The standards are so high that there’s rarely any content there that wasn’t grandfathered in.