Belief as Attire
I have so far distinguished between belief as anticipation-controller, belief in belief, professing and cheering. Of these, we might call anticipation-controlling beliefs “proper beliefs” and the other forms “improper belief”. Proper belief can be wrong or irrational, as when someone genuinely anticipates that prayer will cure their sick baby. But the other forms are arguably “not belief at all.”
Yet another form of improper belief is belief as group identification—as a way of belonging. Robin Hanson uses the excellent metaphor of wearing unusual clothing, a group uniform like a priest’s vestments or a Jewish skullcap, and so I will call this “belief as attire.”
In terms of humanly realistic psychology, the Muslims who flew planes into the World Trade Center undoubtedly saw themselves as heroes defending truth, justice, and the Islamic Way from hideous alien monsters a la the movie Independence Day. Only a very inexperienced nerd, the sort of nerd who has no idea how non-nerds see the world, would say this out loud in an Alabama bar. It is not an American thing to say. The American thing to say is that the terrorists “hate our freedom” and that flying a plane into a building is a “cowardly act.” You cannot say the phrases “heroic self-sacrifice” and “suicide bomber” in the same sentence, even for the sake of accurately describing how the Enemy sees the world. The very concept of the courage and altruism of a suicide bomber is Enemy attire—you can tell, because the Enemy talks about it. The cowardice and sociopathy of a suicide bomber is American attire. There are no quote marks you can use to talk about how the Enemy sees the world; it would be like dressing up as a Nazi for Halloween.
Belief-as-attire may help explain how people can be passionate about improper beliefs. Mere belief in belief, or religious professing, would have some trouble creating genuine, deep, powerful emotional effects. Or so I suspect; I confess I’m not an expert here. But my impression is this: People who’ve stopped anticipating-as-if their religion is true, will go to great lengths to convince themselves they are passionate, and this desperation can be mistaken for passion. But it’s not the same fire they had as a child.
On the other hand, it is very easy for a human being to genuinely, passionately, gut-level belong to a group, to cheer for their favorite sports team.1 Identifying with a tribe is a very strong emotional force. People will die for it. And once you get people to identify with a tribe, the beliefs which are the attire of that tribe will be spoken with the full passion of belonging to that tribe.
1 This is the foundation on which rests the swindle of “Republicans vs. Democrats” and analogous false dilemmas in other countries, but that’s a topic for another time.
Sounds like someone needs to examine his bias re Alabama bar patrons.
Paul, I looked up a list of the most religious states in the US. But if you actually go into an Alabama bar and say it, I’ll change the post (not recommended).
I’m not about to put my money where my mouth is on that one.
Sounds like someone’s beliefs aren’t paying rent.
Or maybe it’s a matter of existential risk? If there’s a 1⁄10 chance of him being horribly wrong, then I don’t particularly blame him for not testing it. I might believe quite thoroughly, but not want to test it when the explosive is directly in front of me.
I’d happily test it from behind a blast wall, though.
I thought this site would be the last place I’d see criticism of the “suicide bomber as cowardly” notion. Under some definitions, sure, doing something you expect to result in your death, in pursuit of a higher goal, necessarily counts as courage. However, it would be justifiable to say they are intellectually cowardly. That is, rather than advance their ideas through persuasion, and suffer the risk that they may be proven wrong and have to update their worldview; rather than face a world where their worldview is losing, they “abandoned” the world and killed a lot of their intellectual adversaries.
It is an escape. There is, after all, no “refutation” for “I’m right because I’m blowing up myself and you”.
It’s for the same reason one might apply the “coward” label to a divorced, jealous husband who tries to “get back” at his ex-wife by killing her or their child. He, too, exposes himself to immense risk (incarceration, or harm if they defend themselves). He, too, is pursuing a broader goal. Yet in that case, my calling him a coward is not an artifact of my disagreement with his claim that he has legitimate grievances—in fact, I might very well be on his side (i.e., that the courts did not properly adjudicate his claim).
So yes, it might be the “American” thing to say terrorists are cowardly—but that doesn’t make the claimant biased or wrong.
Is that what extremist Americans mean when they say cowardly?
No, that’s probably just belief as attire. My point is just that reasonable interpretations of “Suicide bombers are cowardly” allow the statement to be true, even if people don’t mean the true version, or if they came to that conclusion for the wrong reason.
Welcome to Less Wrong! Feel free to introduce yourself on that thread. Don’t hesitate to browse the recommendations from the About page or start in on the Sequences. Kaj_Sotala also posted a first and a second list of favorite posts, which are also quite good.
Your point is a good one—I don’t know if you read The Bottom Line (or Rationalization, the followup), but they make a similar point in a well-phrased way you might enjoy.
Except that based on videos and letters left behind, the hijackers considered Americans to be not just intellectual adversaries, but wartime ones. I believe the majority of the hijackers cited American military presence in the Middle East and military and economic support of Israel to that effect.
So what were the specific arguments they used when persuading acolytes of the great satan that their position has more merit? Or was it confined to “BOOM!”?
My point is that using violence to silence intellectual adversaries is very different from using violence against a perceived wartime enemy.
Their ideology might be intellectually cowardly. But sacrificing your own life in battle against a perceived enemy is not a cowardly act. When people call the attacks cowardly, they’re talking about the attacks themselves, not the worldview of the attackers.
I think most non-LWers who refer to the attacks as cowardly mean that they were conducted against unresisting, nonmilitary targets. The people killed couldn’t fight back (or at least weren’t expected to fight back), and attacking someone who isn’t expected to fight back is widely seen as cowardly.
In this case, of course, other aspects of the operation were hazardous to the terrorists even if they didn’t expect anyone to fight back, but I believe most people who consider the attack as cowardly are treating these aspects separately.
I think actually you’re a bit confused about the difference between instrumental virtues, like courage, and inherent virtues, like benevolence. (Which list “rationality” goes on is actually a tricky one for me. In a certain sense, Stalin seems terrifyingly rational.)
I guess we could talk about “intellectual courage” versus “physical courage” or something like that, and your argument is that these men were not intellectually courageous. But usually when people say “courage” simpliciter, they mean a willingness to act in spite of a high risk of pain and death. And this the hijackers definitely had!
Indeed, there’s something truly terrifying about the Al Qaeda hijackers: They were mostly right about their moral values. They were altruistic, courageous, devoted to duty. It’s only this very small deviation—”maximize deference to Islam” instead of “maximize human happiness”—that made them do such terrible things.
This also meshes with what we know about the Milgram and Zimbardo experiments; quite ordinary people, if convinced that they are acting toward a higher moral purpose, will often do horrific things. The average Nazi was not a psychopath, not a madman; he believed that what he was doing was right. And this should be the most chilling fact of all.
I suspect that the Muslim hijackers, in a strange way, thought they were maximizing human happiness by removing Americans from the world.
I think it more likely they thought they were doing the will of Allah. Happiness? Happiness is for pigs.
Well that explains the no bacon and pork rule.
This is not about whether they are cowardly or brave, nor about the level at which they are cowardly or brave. It is not even about whether they see themselves as cowardly or brave.
This is about not being able to talk about how they see themselves for fearing the scorn of the tribe.
I used to assume (possibly through overapplied principle of charity) that the accusations of cowardice had to do with their “escaping” the consequences of their actions by dying, especially if they anticipated heaven.
Specifically, I wonder how comparatively scared they’d have been at the prospect of:
- Surviving through being captured and extrajudicially detained
- Surviving through being captured and subjected to a nationally televised trial
- Destroying the towers, but somehow surviving long enough to be trapped in the wreckage with a dying Muslim girl who has no idea what’s happening
- Being given a teleporter they could use to escape just before the impact, knowing that each of their compatriots had refused the same offer
- Being ordered to destroy the towers by firing a super rocket launcher in broad daylight in full view of bystanders
- Being ordered to destroy the towers with remote explosives, then return to their normal lives with only themselves to know they’d helped kill thousands of people.
Are you breaking your advice to not use contemporary politics in examples?
I doubt it’s going to be very controversial that the 9/11 attacks were morally bad. (Though it might be interesting if someone is bold enough and contrarian enough to argue that they were justified?)
Good posts. This series is the first thing in a while to make me really glad to participate here.
I think that the stereotype of Alabama bars is pretty reliable. OTOH, the stereotype of suicide bombers is much much less so. If you read the rhetoric of radical Islam, or for that matter if you read ancient mythology such as Homer or the Egyptian Book of the Dead, you will see people who are occupying a VERY VERY different moral universe from us Platonized Christianized (that includes the secular children of “modern orthodox” Jews) post-Enlightenment Westerners.
In terms of realistic psychology fitting neither the SSSM nor the Evolutionary Psychology brand (which you really should spend more time reading non-leftist criticisms of), the Muslims who flew planes into the World Trade Center undoubtedly saw themselves as heroes, but in some sense that we would have a VERY hard time empathizing with or relating to. They are NOT a mirror reflection of ourselves, but genuinely something that has to be understood with empiricism, not empathy and wishful thinking.
I like your information, but I disagree with your conclusion. I don’t think it is beyond the reach of empathy to understand them as thinking of themselves as heroes. Steven_Bukal and TuviaDulin make very persuasive arguments, above. Years later, I admit, but I think I remember detecting some empathy for the bombers at the time. Because I was looking for it.
Good point, Tarleton—although I’m still hard-pressed to think of a better example that isn’t directly a religious belief. If you only use the obvious religious examples, people will fall into the standard trap of thinking they’ve achieved perfection as rationalists because they’re not religious—I wanted to use something that would actually strike a sympathetic chord and let people see how the belief-as-attire effect extends beyond religion. Got a better suggestion?
Vassar, also a good point, although I’m skeptical that I would have difficulty empathizing—these are humans we’re talking about, not aliens, and the WTC hijackers were mostly educated Saudi Arabians, not Yanomamo. They saw themselves as heroes in the support of causes, such as sexual decency = women’s de-emancipation, which are not American causes; they believed and maybe even anticipated 72 virgins; they fought in guardianship of ancient perfection; they carried out the will of God revealed in perfect scripture. None of this strikes me as a significant barrier to understanding. Can you say specifically what you think presents the barrier to empathy?
You want some belief-as-attire that LW people wear? How about some of the things people say “as Singularitarians”, i.e. not because they really have thought the matter through themselves, but because it is the standard position of Singularitarians.
(It’s not always easy to distinguish, granted. You could believe in cryonics because you really have evidence to suggest that this is the best use of resources… or you could believe in cryonics because that’s “what Singularitarians believe”.)
Ignorance. I may think I understand their minds, but that does not prove that I do understand their minds.
All you know is that you have a mental model of their minds which seems credible to you. Have you tested this model, and if so, how?
All I am reasonably sure of is that they did not see their act as evil and cowardly. Doubtless the same was true of Jack the Ripper and the Boston Strangler, but that tells me nothing about the differences between them and everyone else. After all, I only think that is true of them because it seems to be true of most people.
It is really just an assumption.
That’s really an interesting question. What about the ones who really ARE insane psychopaths? Do they think they are doing the right thing, or do they really just not care?
I’m inclined toward the latter, actually. I’ve read some of Stalin’s journals, where he says things like “What is efficient is good. What is inefficient is bad.” That sounds like he literally doesn’t understand what morality MEANS to ordinary, non-psychopathic people.
Sounds like he still needs to classify things as good and bad, though. Maybe a previous moral framework has collapsed and he’s looking for an alternative? Any idea whether he had goals beyond ‘survive’?
This might be an especially easy category of bias to identify. Just ask yourself if you feel proud that this belief associates you with some group with which you want to be associated. If so, weaken your confidence in this belief.
Up to and including my belief that the scientific method is the best approach to understanding the world?
Absolutely, yes. No question that part of the reason we believe that is in order to identify with our tribe. I personally am prepared to reduce it by three orders of magnitude, from 1-epsilon to 1-1000*epsilon.
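To make that arithmetic concrete, here is a minimal worked instance (the value of ε is chosen purely for illustration):

$$p_{\text{old}} = 1 - \varepsilon = 0.999999 \quad (\varepsilon = 10^{-6}), \qquad p_{\text{new}} = 1 - 1000\,\varepsilon = 0.999 .$$

The shift looks tiny as a probability, but it is a thousandfold increase in the probability assigned to being wrong.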
Pseudonymous, I confess that it is only a guess, just a more plausible guess than the American one. And Jack the Ripper might well have been a monster—there are such things as sociopaths.
Robin, I’d say “recalculate your reasons” not “weaken your confidence”. You can’t literally shift a probability estimate because it makes you feel proud.
Why ever not? It feels as if a belief-forming robot probably shouldn’t. But what if pride were its label for ‘probable flaw in calculation detected’?
I should mention my old short essay, “Are Beliefs Like Clothes?”
Eliezer, it seems to me that you can and should shift your probability estimate because it makes you feel proud. Of course you might do even better than that by recalculating your reasons, but that approach will often not be cheap or reliable.
it would be like dressing up as a Nazi for Halloween.
“So remember kids, dressing up like Hitler in school isn’t cool.”
Eliezer, your lack of familiarity with “the other side” on the topic of terrorists is all too obvious from your crude attempt at a characterization of it. All you appear to know about it is a few platitudes. Often I get the feeling from this site that it is not so much about overcoming your own biases as it is about coming up with new excuses to dismiss views that you don’t agree with by applying the genetic fallacy over and over (e.g. “suchandsuch belief is a product of suchandsuch bias”).
Eliezer, Brilliant post, in my opinion. Clarifying and edifying. I’m looking forward to where you’re going to go with this analysis of how bias and belief operate.
Silas, My opinion: you seem invested in using “muslim terrorists” for in-group/out-group construction, and I think it’s coloring (biasing?) your analysis.
Michael, great criticism of an element of Eliezer’s post.
Hopefully_Anonymous: You seem invested in labeling people as using “muslim terrorists” for in-group/out-group construction, and I think it’s coloring (biasing?) your analysis.
(???)
Constant, this blog has warned against the genetic fallacy before. What do you think would be a good characterization of the “other side”? Eliezer’s characterization describes a large minority of Americans very well. (He’s clearly not intending it to be descriptive of everyone who thinks Islamic extremism is a serious threat, if that’s what you’re thinking.)
I think questioning the Alabama bar analogy is useful within the context of this post. Whose attire is a belief in the value of giving primacy to skepticism, critical thinking, etc.? According to Eliezer’s performance in the OP, it’s not the attire of either Alabama bar patrons or “muslim terrorist suicide bombers”, and both of those may signal, more generally, the losers of the American Civil War and non-white brown people. In short, I think there may be a gentrification of critical thinking: it’s reserved for an in-group, perhaps in particular northeastern anglo-saxon and ashkenazi jewish male intellectuals, or an even more narrow archetypal definition that might be reducible to zero actual people. I’m interested in the degree to which our behavior might be governed by aligning with and contesting these archetypes. Including which beliefs as attire to wear (it’s perhaps an archetype alignment for Stephen Hawking and Richard Dawkins to claim to be skeptical about religion. It would probably not be an archetype alignment for Oprah to publicly wear such belief attire, even if in fact she was a crypto-skeptic).
This post may meander a bit but I think Eliezer’s post (and some of the criticisms of it) are thought provoking and may be getting us closer to a more real world, real time model of how bias and belief is operating in the world we live in.
I know if I were in an Alabama bar, and the conversation turned to how “terrorists hate our freedoms”, I’d certainly phrase things such that they didn’t contradict what everyone in the room was yelling about.
Bonus points if I were clever enough to disagree with them in a way that seemed like I was agreeing with them.
Either way, I’d be wearing a belief I most certainly did not actually believe and did not in any way believe I believed, and I would do so entirely for my own preservation.
If you don’t transmit your disagreement, why bother expressing it? Outwardly agreeing with them would accomplish the same thing with less effort.
One reason is because dog-whistles can work: I have from time to time had the experience of expressing my opinion about a subject in a way that causes the minority who agree with me to recognize me as a potential ally without triggering reprisal from the majority who disagree with me.
Another reason is to preserve some credibility in case of a future discussion where I’m more willing to deal with the consequences of public opposition. Rather than having to say (for example) “Well, yes, I know I said policy X was a good idea, but I didn’t really mean it; I was lying then, but you should totally believe me now because I’m totally telling the truth” I can instead say (for example) “I said that policy X is an efficient way of achieving goals Y and Z, which it absolutely is. But I don’t endorse maximizing Y and Z at the cost of W, which policy X fails to address at all.”
Yet another reason is to use plausible deniability as a way of equivocating, when I’m not sure whether to come out in opposition or not. That is, I can disagree while maintaining a safe path of retreat, such that if the degree of reprisal I get for disagreeing turns out to be more than I feel like suffering, I can claim to have been misunderstood and thereby (hopefully) avert further reprisals.
That already goes by the name “politician-speak”.
Good points.
It’s being more honest with yourself and your own beliefs, though it certainly isn’t more honest with your fellow bar patrons.
If you have a thing against lying (and I do), it’s the lesser of two evils.
The inspiration was from Professor Robert Thornton of Lehigh University, who came up with a creative way to write student “recommendations” that, if read literally, said quite directly that hiring this particular student was a very, very bad idea. If read figuratively, however, they sounded like glowing reviews, and indeed if you were expecting a good review you would think it was an absolutely wonderful review.
This was necessary because as a professor he was obligated to give students recommendations for their employers, but negative reviews have resulted in serious lawsuits in the past. Unwilling to compromise his morals, he got very creative with the English language instead of lying.
In that case, the reviews weren’t meant for the student to ever see, but that is often unavoidable. He certainly did hope that the student’s potential employer was capable of reading between the lines and comprehending the message.
He called his system L.I.A.R., if you want to search for it. They are pretty funny, and really do sound like positively glowing reviews until you look at exactly what they are actually saying.
Seems to exist mainly as a book: http://www.amazon.com/Lexicon-Intentionally-Ambiguous-Recommendations-L-I/dp/1402201397/
Some brief samples available at http://www.avdf.com/feb96/humour_liar.html
Dead link :(.
Archived version.
Thank you. I tried using http://archive.fo/ , but no luck.
I’ll add https://web.archive.org/ to bookmarks too.
Empathy is hard. Cultures differ. We Americans (especially secular Americans?) really don’t have a clue what it feels like to (for instance) feel an obligation to kill our daughters or our sisters in order to preserve our family honor. Some actions in the name of causes may be psychologically modular, but some really aren’t. What’s the parallel for honor killing? Pressuring one’s schizophrenic philosophy post-doc son to go to law school where he thinks he’ll be miserable for the bragging rights? Sending your kids to Hebrew School or Day Camps they hate because your parents made you do it? It just doesn’t work. Even within a culture, I have no idea what it’s like to identify with a sports team and very few people can relate to the horror that I feel at some Psychological data or philosophical ideas. You once pointed out that most of us can no longer even understand why the Psycho shower scene was once considered terrifying. I would recommend Sylvia Plath’s diary for what are to me stranger attitudes than those.
Why was that? I can’t find it with a search.
Actually, I think that much of www.xkcd.com, including the current one, can be thought of as an enumeration of feelings that people who aren’t either quite young or quite nerdy have no analogues to. Philosophy is full of others, such as existential despair and satori.
xkcd’s archive page doesn’t include dates. According to Archive.org, the current comic as of 05:58:11AM (whatever time zone) on 05 August 2007 (a Sunday) was Tesla Coil, so I’m guessing you were talking about the previous one, Lisp Cycles.
“Eliezer’s characterization describes a large minority of Americans very well.”
All I see there are familiar platitudes, not a description of anybody who thinks about things. All I see, in fact, are familiar formulas employed by politicians. Nor are the formulas necessarily wrong. It should not be hard to see what is cowardly about most terrorist attacks.
American Heritage has a fairly good definition of cowardice: “Ignoble fear in the face of danger or pain.”
The ignobility is an important factor which other dictionaries tend to miss. But American Heritage misses something that Cambridge has: “a person who is too eager to avoid danger, difficulty or pain”
It does not have to be danger and pain; it can be difficulty. So in a nutshell, a coward is someone who commits a discreditable act in order to avoid a difficulty (which might be pain or danger but might be something else). In the case of terrorists, the discreditable act is an attack on civilians, and the difficulty thereby avoided is the difficulty of engaging the enemy’s armed forces. Similarly, it is cowardly to break certain Geneva conventions: disguising yourself as, and mixing with, civilians in order to shield yourself from the enemy is cowardly, because you are committing a discreditable act (using civilians as shields) in order to avoid difficulty (greater exposure to enemy fire).
I will quote an old essay on the topic and answer some key points.
http://www.slate.com/id/1008268/
“Perhaps the idea is that it is cowardly to make a sneak attack, especially on a defenseless civilian target, rather than confront an armed enemy face to face. But no one seriously expects Osama Bin Laden to invite the 101st Airborne to fight his terrorist organization on equal terms.”
The first sentence is a fair summary of the point I just made, but the second sentence is no answer. Compare the above with the following:
“Perhaps the idea is that rape is forcible sexual intercourse. But no one seriously expects Ugly Albert to get sex any other way than by forcing the girl.”
The fact that the only possible way to succeed is discreditable or illegal or immoral, is no answer to the point that it is nevertheless discreditable or illegal or immoral. It is still what it is, even if it is the only way. If the only way to make a mark is a cowardly way, that makes it no less cowardly.
“And besides, the reason we usually consider it cowardly to make a sneak attack is because the attacker avoids facing the consequences.”
Not necessarily. As the Chambers dictionary correctly recognized, what is necessary is an avoidance of a difficulty. It does not have to be specifically “facing the consequences”.
So it should be fairly easy to see that it is not incorrect to say that the terrorists are cowards. It is furthermore, then, not incorrect to say that if someone says the terrorists are not cowards, then he is wrong.
But backing up, even though I have defended the familiar platitude that terrorist attacks are cowardly, nevertheless I do not think this accurately reflects man in the street thinking on the topic. Rather, it represents an old political formula that has caught on and that hardly makes a ripple. It’s about as meaningful as saying “good morning”. It is not significant to say it; it would be significant to stop saying it. Same as “good morning.” We say that in order to avoid doing anything significant. It is a distant cousin of the “dead metaphor”—a metaphor that has lost its force through overuse. But like the dead metaphor, its overuse does not mean that it is not valid.
Setting this aside, there is also the matter of the habit that some intellectuals have of shocking the bourgeoisie. If you say that the bad guys think that they’re the good guys and we’re the bad guys, you probably won’t raise any eyebrows. But if you make the statement in a way that implies that you agree with the bad guys’ assessment, or that you are positioning yourself as a neutral party who favors neither side, then you will probably raise some eyebrows. And based on my own experience, an awful lot of people like to present the rather familiar and tired and unremarkable view that the bad guys think that they’re the good guys, in just such a way, so as to maximize their effect on their listener. This seeming undercurrent of support for the enemy is something that can be easily avoided without changing the factual content of what you’re saying, but it is in my experience often not avoided; indeed, it seems to be sought out and nurtured. And then, when the predictable reaction occurs, like clockwork Mr. Epater-les-bourgeois loudly complains about the impossibility of making obviously true statements in front of the foolish masses.
I agree with your point about “difficulty of engaging the enemy’s armed forces”. But I still understand the frustrations of suicide bombers, because of the difficulty of significantly or meaningfully engaging some enemy’s armed forces. Especially if you respect warriors, but not their guidance.
What is the brave action to take in that case? Simply suicide, and not suicide-attacks? Or better-targeted suicide-attacks? I am befuddled.
I am far more comfortable condemning suicide-attacks as irrational than cowardly.
Michael, I think your example is interestingly rooted in an implied in-group/out-group construction that constructs Americans in a flattering way. Consider that you contrast honor killings with forcing kids to go to law school or day camp, which won’t necessarily result in their death. It’s a flattering contrast that I think constructs America as Western and honor killers as culturally Middle Eastern. But, if we contrast cultures that approve of state-sanctioned killing of people for moral transgressions, America and the nations of the honor-killers are now in the same group, with Western Europe (and much of the rest of the world) in the other group. Incidentally, I’m not opposed to state-sanctioned killing, but I think it would be more rational for the penalty to start with doing it to individuals to the extent it will prevent future great economic waste/increase in existential risk, rather than to punish premeditated murder of a small number of people or purported extramarital/premarital sex.
HA: I chose my examples carefully to try to match as closely as possible as many of the categories, relationship types, motivations, etc., as I could, and the examples I came up with are both pervasively American and truly ugly from my perspective in the closest way that I could think of (matching type of motive, e.g. content of emotional state, not degree of emotional state or degree of ugliness) to honor killings. My point was that we don’t have any very close matches. Your examples still don’t match the intensity of honor killings, but more importantly, the emotional quality is utterly utterly different. Anger, retaliation, maintenance of public order, prevention of repetition, deterrence: the motivations for execution are simple and easy to understand, as is the balance calculation which compares the costs and benefits implicitly, even if a more careful calculation would disagree. At the most visceral level, honor killings are not in retaliation for some harm nor are they motivated by preventing a harm. This article is probably worth looking at for everyone who is still reading this thread, by the way.
http://dangerousintersection.org/?p=1445
By the way, honor killings != state-sanctioned killing.
Michael, how about the point that you’re (rather explicitly now) picking a point upon which to manufacture in-groups and out-groups. In-group: those of us who get motivations for execution. Out-group: those who get honor killings.
The in-groups and out-groups change if the point to get is Abrahamic monotheism, or if the point to get is state-sanctioned punitive killings. It seems to me that you’re picking one that’s particularly salient either to you or to what you imagine your audience to be. I think this gets to the belief as attire/beliefs as cheers for teams. It’s an attempt to pick teams, but I think the implied in-groups and out-groups are at least in theory contestable.
A bit tangentially, I think teams themselves can be an effective (the most effective?) way to construct hierarchical privilege. The people on the field vs. the people on the bench (or the people relegated to the audience) of the two teams.
In terms of overcoming bias, I think understanding and when necessary countering these phenomena is important primarily to the degree that they warp decision-making or increase economic waste/existential risk.
HA: It seems to me that you think I have changed the topic. I agree with all of the sentences of your most recent comment, except for the first, but they don’t seem to be about what I was saying.
Likewise, I agreed with Eliezer’s post, but I thought that his analogy was, well, lacking in appreciation of the difficulties involved in analogy.
Basically, I think that Douglas Hofstadter’s writings on the difficulties of natural language translation, the proper translation of literature, etc., are all relevant to the issue of the translation of inferred emotion. Evolution provides a scaffold for our brains to develop, biasing us, strongly or weakly, in certain directions, http://compbiol.plosjournals.org/perlserv/?request=get-document&doi=10.1371%2Fjournal.pcbi.0030147&ct=1&SESSID=08a65728475c8bb59fcfb7e7aff1cfde sometimes strongly enough that translation is at least almost always possible (maybe not with the Piraha?) and we can speak of human universals, and sometimes weakly enough that we can talk about Liberal insensitivity to 3 of Haidt’s 5 domains of morality. When the biases are strong or when atypical emotional mixes emerge and propagate memetically, empathy ceases to be useful and “the other” must, in some respect or another, be understood empirically, e.g. without the benefit of our specialized social processing capabilities. In such cases, our anticipation suffers, but it suffers even more if we force bad analogies and continue to use our social processing capabilities in inappropriate circumstances.
All curiosity exists to destroy itself; there is no curiosity that does not want an answer.
Vassar, it seems important to you that you not be able to understand certain acts—a badge of pride. I don’t think I’m having trouble understanding an honor-killing. Someone else rapes your sister, it stains the family honor, she has to die, QED. It’s not the way I think, but that doesn’t stop me from modeling it.
In proof of this, I ask you, what virtuous mode of thought, or even mode of thought that you are not particularly indignant at, do you think yourself unable to understand across cultures?
Eliezer, I have thought of another sort of belief that is not an anticipation-controller. Sometimes, I hear quite smart young people (who don’t just wear beliefs as attire) profess to a belief in physicalism about qualia, or in libertarianism, or in the virtues of the scientific method, or in anti-pseudoscience (a la Martin Gardner), or in global-warming skepticism (a la Bjorn Lomborg), or in consequentialist egoism, or some similar broad philosophical or political doctrine. When I talk to these people, I find out that they can give a number of good arguments for why someone should follow their position, but that they have little to say in response to arguments for why people should follow alternative positions. For example, they might be able to clearly state various arguments for libertarianism, and to respond well to counters to those arguments. Yet when I tell them various arguments in favor of alternative positions (e.g. democratic socialism), their attempted rebuttals are much weaker in quality than their positive arguments for the position they claim to hold.
This usually occurs because these people have read good books or articles advocating physicalism, libertarianism, and egoism, and have learned (and been convinced by) the arguments contained therein. After reading these books, these people want to talk to others about what they’ve read and show that they can understand and reconstruct difficult arguments. They could just say to their acquaintances something like “I’ve read this book and I’d like to discuss some of the arguments in it with you”. But, for various reasons, they often instigate such conversations by saying: “I don’t think there should be any income tax at all. nor any other taxes. I’m a libertarian.” From here, a heated discussion may ensue about the merits of libertarianism, in which the neophyte can relate all his carefully reconstructed arguments to his audience. This allows the new libertarian to look clever (since he can relate good arguments) and well-read (since he can quote Nozick’s views on politics). It also provides the libertarian with practice in thinking and arguing on the spot, and in articulating difficult ideas.
I don’t count this as belief as anticipation-controller. [I’ll leave aside questions about whether beliefs in political or ethical doctrines can have empirically testable consequences. The point I’m making works just as well with global-warming skepticism as it does with libertarianism or dualism.] The person who calls himself a libertarian or a global warming skeptic after reading a couple of books and a few articles arguing for libertarianism or global-warming skepticism will often acknowledge (if honest) that if he’d started by reading books advocating alternative views, then he would not have come to be a libertarian or global-warming skeptic. He knows that he hasn’t made an attempt to hear views from both sides of contentious issues, despite there being very smart and thoughtful people advocating opposing positions.
Yet this lack of balance in his reading is not a problem for him. He is not actually going to act on his belief in any serious way. He’s not about to give his money and time to support global-warming skepticism or libertarianism. He would probably not bet money on these doctrines being true, unless the bet was a small enough fraction of his wealth that it would be worthwhile to garner more attention. When he says “I’m a libertarian”, what he means is “I can articulate a number of good arguments for libertarianism, along with replies to common objections to these arguments”.
I think that some professional philosophers hold beliefs in a similar way. The philosopher might come up with some clever arguments for position X. Instead of writing up a paper or blog post that simply relates these clever arguments, he will probably write an article or book that gives lots of arguments for position X, including his new clever arguments (e.g. on anti-dualism). In spending lots of time studying all the good arguments for X (and in refining his own clever arguments), he will end up with an impressive ability to make arguments in support of X. Yet he probably won’t have given the same open-minded and lengthy study to arguments in support of alternatives to X (i.e. not-X). Hence, when he says to people that he is an Xist or that he believes in X, what he means is “I can give lots of really sophisticated arguments for X”.
(This is not true of all philosophers, as some seem to strive to criticize their own positions. Also, it is not just philosophers that are guilty of this. Some scientists will spend their lives doing experiments that provide evidence for some view X, and they won’t have invested as much time in learning about the experiments and arguments of people who have been trying to show that not-X.)
Bob, I take it you’re not the deceased kiwi atmospheric scientist Robert “Bob” Unwin. But very high quality commentary. I hope that you start a blog to consolidate your observations under this name/pseudonym (as I have done with mine).
Bob: Great post.
Eliezer: I was not saying that anything cannot be understood, but rather that using our specialized “empathic” capabilities for understanding human behavior in terms of our own hypothetical behavior is counterproductive to understanding many instances of human behavior when the humans in question are from different cultures or otherwise very different from one’s self. It’s easy to model it, possibly even to model it well (Chronicle of a Death Foretold by Gabriel Garcia Marquez tries to), but next to useless to model it by reference to your own feelings. If we couldn’t model it at least somewhat we couldn’t even form the concept and talk about it, but we don’t have the special advantages here that we have when modeling a hungry person eating or the like.
For a less politically charged example, fairly young children (7 or 8, maybe 9?) could learn to model sexual desire, but they can’t empathize. Adults cannot empathize with a child’s enjoyment of TV shows targeting young children even though they have been children and may remember watching the same shows and enjoying them. Actually, since enjoyment is virtuous, that example answers your question, though across ages rather than across cultures. For ‘virtuous’, cross-cultural, and impossible to empathically model (ignoring that to some degree there is a contradiction here, as without the ability to model the state it is hard to be confident of its intrinsic virtuousness, only of the virtuousness of the actions it brings about, which could also have been pursued for utilitarian or other deeply generally human reasons such as caring. The symmetry with abhorrent actions is broken in this respect) there are surely many different types of meditation or other altered mental states, enjoyment of a vast number of foods (I can understand liking kumiss via a “comfort food” schema, but not *kumiss AS kumiss), entertainments, and art forms, and probably more subtle and general feelings having to do with attachment to the land, etc, though I can model these empathically to some degree.
Most generally of all, I already had given examples, in citing Haidt’s 5 moral domains.
Hmm. Criticism of Haidt’s theory. Haidt, and most other people, probably see conservatives and liberals as having an equal lack of understanding of one another. (The point of his theory is that he lacks such empathic understanding, hence the need for an empirically derived theory.) However, his theory suggests that conservatives should easily understand liberals. The magnitude of one’s disagreement shouldn’t be the cause of empathy failure. Rather, empathy failure should follow from the apparent pointlessness of the action being criticized. For instance, it’s probably easy for us to empathize with Joseph Mengele’s actions while still strongly disapproving, as scientific curiosity is a shared motivation and the difference between his actions and actions we would approve of is because of his not applying enough weight to caring/sympathy considerations that we consider important. The opposite is true of one’s experience reading about Isaac Bashevis Singer’s father in “In My Father’s Court” or other good works of anthropology dealing with rich cultures. We are bemused by the apparent pointlessness of all of the ritual details that this silly man takes so very seriously, but he is harmless and we are not indignant at all.
Bob, a very high quality comment, but at 800 words it is too long for a comment. Please everyone, let’s try to keep comments under about 400 words—longer items should be their own post or essay somewhere, which you can of course link to in a comment.
Re: the Alabama bar, when that same criticism was leveled by Neil Young, the response was, “A Southern man don’t need him around anyhow”. Apart from the fact that it came in the form of hit song, the reply is notable in that it’s not something along the lines of, “them’s fightin’ words!” Though you may be right about the South’s religious and political attitudes, I think you misunderstand how and when violence is used in that culture.
Anyway, back to the issue. The mindset of Mohamed Atta, et al., was elegantly described by Eric Hoffer in The True Believer. I don’t believe it takes any unusual emotional insight to understand Atta’s psychology, if it’s seen in those terms.
Konrad: Not to repeat myself yet again, but no, understanding psychology never requires unusual emotional insight. It takes analytical ability, but it gives a different type of understanding from that which emotional insight gives.
Bob wrote “The person who calls himself a global warming skeptic… after reading a couple of books and a few articles arguing for [such skepticism] will often acknowledge that if he’d started by reading books advocating alternative views, then he would not have come to be a global warming skeptic...” This is one mechanism, but sometimes positions just “feel right” to people, i.e. in agreement with their predisposed visions, or traits.
Also it seemed to me that by asking of people that they examine as many arguments opposed to their view as they examine in alignment with their view, you would also be demanding a similar objectivity from scientists. But as has been said often, scientists are only human. They pursue their hunches (conjectures); and natural selection knew what it was doing when it made all of us normally tend to do the same.
This is not to strongly discount a goal of overcoming bias, but is to confirm a point doubtlessly made here before, that not only does bias exist for a reason but can in many instances be optimal for achievement or survival. Admittedly, truth seeking and achievement may be at odds with one another at times.
What would falsify that model of belief as attire?
I am in the process of working through these delicious posts so apologies in advance if my comments are redundant.
Perhaps group membership of a mutually supportive tribe has the greatest value (for example from both a psychological and survival perspective). If this is the goal, what is the most rational course of action? Will a rational person inevitably run into problems where the tool they are using to solve their problems becomes their primary source of problems?
I like this site for the very reason that it represents a community where my natural problem solving inclinations are not compromising my sense of being similar to those I interact with. But as with all communities, I step with trepidation for fear of violating a social taboo which may be rationalised but is not reasonable (belief as attire). If we choose to be irrational because rationally we have decided it is the most rational course of action, are we still rational?
Can we truly choose to be irrational, though? Recognizing the irrationality of a belief, and valuing reason, the most we can do is act as if we hold others’ irrational beliefs. I’m sure there are many people who have done this throughout time; the tragedy is that each of these people may have “come out” as nonbelievers if they were aware of the others’ presence.
While I personally think that a person compromises his integrity when he acts contrary to his beliefs, there are certainly many instances in which this course of action has survival value, and so can be said to be rational.
Using drugs, we could probably make ourselves less rational on purpose. Drinking lots of alcohol, for instance.
I appreciate the effort to sort out “improper beliefs”. As a philosopher with a background in distinguishing surface-level propositions from speech acts with goals that may be masked by those propositions as such, I am inclined simply to say that “improper beliefs” are NOT beliefs. I prefer reserving “belief” for the anticipatory dispositional beliefs that you call “proper”.
This is so far just a semantic difference, but the real difference comes out when you say that people have to “convince themselves they are passionate”. From my perspective, no such “convincing” is necessary when a person moves from literal to nonliteral interpretations of mythic language, because the esoteric perspective can be as exciting and full of significance as the exoteric. People can be passionate about the real, positive benefits of religious practices: psychological well-being, social connectedness, aesthetic sensibility, self-respect, etc. Discovering these benefits as the real meaning of myths can be as eye-opening as the adoption of a counterfactual, mythic perspective.
But most people clearly DON’T treat these things as meaningless speech acts.
How do I know this? Because if you say something like “Right, because that’s just a meaningless speech act” in response to some absurdity of religion like “virgin birth” or “transubstantiation”, people will get VERY ANGRY at you. They will not respond as though they are playing a game of words; they will respond as though you have accused them of lying. And if improper beliefs are precisely non-beliefs trying to make themselves look like beliefs, then you HAVE just accused them of lying.
The only way this comment makes sense to me is if I assume that you believe that (for example) humans reliably fail to become angry when their tribal attire is challenged, unless that tribal attire also happens to be a meaningful belief.
Do you in fact believe that?
If so, can you expand on your reasons for believing that? It seems implausible to me, and inconsistent with my observations of human behavior.
The only way this comment makes sense to me is if it was written without reference to its grandparent.
In all fairness, I think Islamic fundamentalists really do hate our freedom. They hate our entire way of life, and this freedom is a part of that.
Hating the freedoms of western society doesn’t preclude one from committing brave, selfless acts, though. Unfortunately for us.
To paraphrase, there’s a difference between resenting someone for having freedoms that you do not, and disliking the concept of “freedom”. And these get mixed up on occasion.
Clearly they DO hate our concepts “freedom of religion” and “freedom of speech”. (They will explicitly say so!) There may be some freedoms that they would value… though actually maybe not. Maybe they value deference to Islam so highly that any kind of individual freedom would entail the freedom to violate Islam and therefore be evil.
The biases of Rationalists are showing in this article.
It’s peculiar to have a sequence on Korzybski’s “The Map is Not the Territory” followed shortly thereafter by a post making a purely intensional distinction between “proper” and “improper” beliefs.
What’s “improper” about achieving in-group identification? It’s often quite handy. Casting intensional aspersions on all the values we might derive from our beliefs other than predictive utility does not strike me as rational.
I think that the word improper is being used in the post in the same way that mathematicians use it in the phrase improper integral.
This usage is not pejorative, but marks a delicate extension of the basic concept. In mathematics the delicacy arises from the need to take a limit, which might not exist. In the case of an improper belief the believer is opened up to conflicts, perhaps because they belong to multiple groups with conflicting identities, perhaps because predictive utility competes with group identification in practical importance.
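For readers who haven’t met the mathematical term, a standard textbook pair of examples illustrates the “delicate extension”: the extended object is defined as a limit, and that limit may or may not exist.

$$\int_{1}^{\infty} \frac{dx}{x^{2}} = \lim_{b \to \infty} \int_{1}^{b} \frac{dx}{x^{2}} = \lim_{b \to \infty}\left(1 - \frac{1}{b}\right) = 1, \qquad \int_{1}^{\infty} \frac{dx}{x} = \lim_{b \to \infty} \ln b \;\text{(diverges)}.$$

On that reading, an “improper” belief is likewise a legitimate extension of the basic notion, but one that only holds together under extra conditions.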
But this is one of my issues with what I have seen at LessWrong: the privileging of predictive utility over other forms of utility, of epistemic rationality over instrumental rationality. Epistemic rationality is another form of instrumental rationality, but where rationalists gather, it gets privileged as if it were the only true rationality, or at least a better rationality. It’s a mistake, and really impairs the ability of rationalists to understand other people who do not privilege epistemic rationality to the same degree, if at all.
You say improper is not used in a pejorative sense, but clearly the normal usage of “improper” is pejorative. And when an epistemic utility competes with another instrumental utility, why doesn’t that equally make the epistemic utility improper?
Further, the non-epistemic beliefs are described as
Time and time again, epistemic rationality is set up as the real, better, higher, truer, shinier rationality.
Just to be clear, I’m not here to trash the idea. I came to the site from reading EY’s Harry Potter fan fiction, which is just awesome; I’m dying for the next chapter. Between the book and the sequences, I’m busy reading a guy making all my arguments and more, drawing on many of the key books I read years ago in graduate school. Korzybski and Jaynes are at the top of my pantheon (with Stirner, from whom I don’t see a lot of influence here). So I’m here because of some very specific and fundamental shared methodology.
I don’t say “me too” to all that I agree with, unless it is something new to me or I have a refinement to add. But on this point, I see privileging of epistemic rationality, and I think it’s a mistake.
You would put instrumental rationality above epistemic rationality?
So if it makes me happy to believe the Moon is made of cheese, I ought to do so?
If making yourself happy is, all things considered, what you want to do. (And then assuming that said belief modification is the most effective way to gain happiness.)
I put winning above predictive accuracy, yes.
As fate would have it, the article “What Do We Mean By Rationality?” is the page that comes up in my Chrome browser when I type “less”: http://lesswrong.com/lw/31/what_do_we_mean_by_rationality/
It’s a peculiar article, because it gives two concepts as a definition for rationality, Epistemic Rationality and Instrumental Rationality, where clearly the concepts are not identical. And yet all sorts of statements are made thereafter about Rationality without noting the difference between the two concepts.
To answer your question in these terms: for all beliefs where the Instrumentally Rational belief is X, and the Epistemically Rational belief is NOT X, I’d rather believe X. I’d rather Win than Correctly Predict, where I have to make the tradeoff.
Their belief, or their cowardice, is not the problem. We must be concerned about their expected behavior. The rest is commentary.
I read this three times. First pass: What? Why? Maybe I missed something. *rereads* Second pass: Oh, would they not get the reference? But why would that be so bad? *rereads* Third pass: It’s certainly plausible that it’s severely overstating it to say they think of us as hideous alien monsters; I can think of other religious feelings that could lead to that level of bravery and dedication, so such a statement might make me seem insensitive and horribly ignorant. But I’m pretty sure random people in an Alabama bar wouldn’t recognize that, so I’m still confused. *reads on*
Well. That’s not good news for me is it?
Eliezer’s probably saying that the patrons of said Alabama bar would be, shall we say, highly unlikely to appreciate the neutral point of view, probably due to ingroup biases. It’s the arguments-as-soldiers thing again, and you’re implicitly putting yourself on the wrong side.
I’ve never been to Alabama myself, so I don’t know whether this is actually true or not. I suspect it wouldn’t be as bad as he’s implying (it might start an argument, but I wouldn’t expect a fight), but that might be my optimism acting up.
Yes, I understood as soon as I read the next sentence. I just felt silly that I couldn’t figure it out myself.
Maybe it’s just because I’m a New Yorker, but trust me that you don’t have to cross the Mason-Dixon line for people to be willing to sock someone who said something even remotely positive about the 9/11 hijackers. Things have cooled down a bit in the last twelve years, but there are still some things you just don’t say. Or imply, in this case.
I know that I would personally have trouble restraining myself if someone expressed actual support for, or tried to equivocate-away, the crimes of terrorists in my presence. It’s absolutely an issue of tribal loyalty, and not even entirely irrational; expressing empathy for an enemy weakens your resolve against them, which is not a particularly wise choice when the only way our tribe can lose is by giving up.
I see a mind being killed.
We’re discussing people’s emotional reactions to these types of statements and why they feel those emotions.
I pointed out that those reactions are typically strong and negative (and not just in Alabama), and that holding them is instrumentally rational.
Since this isn’t preventing me from updating on any evidence presented (I absorbed the “everyone is the hero of their life story” moral years and years ago), I don’t see that I’m particularly mind-dead in this scenario.
I saw mind killing in the particular phrase:
I also have doubts about that instrumental rationality.
My reasoning is… well it’s hard to explain without going 100% RL politics, which is as rude as it is counterproductive. Basically there’s different schools of thought on the strategy involved in asymmetrical warfare and I tend to come down on a particularly unpopular and effective side of the debate. That’s all I’m willing to say in public.
In terms of instrumental rationality, it’s pretty simple; being part of the winning team is generally useful, cheering and wearing the colors shows people you’re on the team, and you cheer a lot more enthusiastically when you actually believe it. Cognitive dissonance gets a bad rap, but it really is a lot easier to compartmentalize than to maintain a lie long-term.
True. However cheering for your team while dehumanizing your opponents is often a poor way to make your team stronger in the long run. Labeling someone a terrorist diminishes your desire to understand their motivations and eventually mitigate further terrorism. Instead one ends up supporting Iraq war-style mission creep resulting in the needless deaths of those on your team.
“One thing is for certain: there is no stopping them; the ants will soon be here. And I for one welcome our new insect overlords. I’d like to remind them that as a trusted TV personality, I can be helpful in rounding up others to toil in their underground sugar caves.”
It seems pretty obvious to me that your tribe can also lose by directing its energy in the wrong direction, resulting in harms to yourselves. As, for example, has already happened with TSA, so I hear. (This doesn’t mean “the terrorists have won” but it does mean you have lost.)
I’ve never been to Alabama, but as I understand it the cultural climate in Alabama shares certain key characteristics with that in rural Massachusetts.
Were I, in a rural Massachusetts bar, to make any public statement to the effect that the individuals who flew planes into the WTC could plausibly be seen as heroes, or that they were comparable in any way to American soldiers fighting and dying for American interests (1), I would expect the locals to view this as a challenge to sacred virtues and to react accordingly.
I would not expect this to necessarily cause a fight (though it depends on how I went about it, and whether and how I backed down when those virtues were upheld by those around me); it wouldn’t even necessarily get me asked to leave (though that’s more likely, especially if I continued to defend that position).
(1) Edit: on further thought, I suspect that just talking about U.S. soldiers fighting for “American interests” (as opposed to “American values” or “America” or some such thing) would raise a suspicious eyebrow or two, as it superficially pattern-matches to a particular mid-1900s stereotypical formulation of Communist propaganda.
From the perspective of an intoxicated rural Southern conservative, the WTC terrorists are literally hideous alien monsters, in that they acted without humanity, they’re from somewhere else, and, being Arab, they look unpleasant (racism is a significant enhancing factor in hating people you already don’t like very much). For you to empathize with those terrorists would be a direct threat to this point of view. It would clearly mark you as not being one of them, and as not sharing their values.
Sayyid Qutb, who was a supplier of ideology for the terrorists who perpetrated 9/11, did see the US as evil for, among other things, the freedom of its women. (E.g., quoting Qutb via Wikipedia: he noted the “animal-like” mixing of the sexes, which “went on even in churches”.) So “hate our freedom” has some truth to it.
I feel called out; I did this at a funeral three months ago, lol. They stared at me like “wtf?” Ah, in hindsight I could have controlled the impulse to say it; now they’ll be brewing some easily avoidable stereotypes about me, which will hinder my quality of life.