Cultish Countercultishness
In the modern world, joining a cult is probably one of the worst things that can happen to you. The best-case scenario is that you’ll end up in a group of sincere but deluded people, making an honest mistake but otherwise well-behaved, and you’ll spend a lot of time and money but end up with nothing to show for it. Actually, that could describe any failed Silicon Valley startup. Which is supposed to be a hell of a harrowing experience, come to think. So yes, very scary.
Real cults are vastly worse. “Love bombing” as a recruitment technique, targeted at people going through a personal crisis. Sleep deprivation. Induced fatigue from hard labor. Distant communes to isolate the recruit from friends and family. Daily meetings to confess impure thoughts. It’s not unusual for cults to take all the recruit’s money—life savings plus weekly paycheck—forcing them to depend on the cult for food and clothing. Starvation as a punishment for disobedience. Serious brainwashing and serious harm.
With all that taken into account, I should probably sympathize more with people who are terribly nervous, embarking on some odd-seeming endeavor, that they might be joining a cult. It should not grate on my nerves. Which it does.
Point one: “Cults” and “non-cults” aren’t separated natural kinds like dogs and cats. If you look at any list of cult characteristics, you’ll see items that could easily describe political parties and corporations—“group members encouraged to distrust outside criticism as having hidden motives,” “hierarchical authoritative structure.” I’ve written on group failure modes like group polarization, happy death spirals, uncriticality, and evaporative cooling, all of which seem to feed on each other. When these failures swirl together and meet, they combine to form a Super-Failure stupider than any of the parts, like Voltron. But this is not a cult essence; it is a cult attractor.
Dogs are born with dog DNA, and cats are born with cat DNA. In the current world, there is no in-between. (Even with genetic manipulation, it wouldn’t be as simple as creating an organism with half dog genes and half cat genes.) It’s not like there’s a mutually reinforcing set of dog-characteristics, which an individual cat can wander halfway into and become a semidog.
The human mind, as it thinks about categories, seems to prefer essences to attractors. The one wishes to say, “It is a cult,” or, “It is not a cult,” and then the task of classification is over and done. If you observe that Socrates has ten fingers, wears clothes, and speaks fluent Greek, then you can say, “Socrates is human,” and from there deduce, “Socrates is vulnerable to hemlock,” without doing specific blood tests to confirm his mortality. You have decided Socrates’s humanness once and for all.
But if you observe that a certain group of people seems to exhibit ingroup-outgroup polarization and see a positive halo effect around their Favorite Thing Ever—which could be Objectivism, or vegetarianism, or neural networks—you cannot, from the evidence gathered so far, deduce whether they have achieved uncriticality. You cannot deduce whether their main idea is true, or false, or genuinely useful but not quite as useful as they think. From the information gathered so far, you cannot deduce whether they are otherwise polite, or if they will lure you into isolation and deprive you of sleep and food. The characteristics of cultness are not all present or all absent.
If you look at online arguments over “X is a cult,” “X is not a cult,” then one side goes through an online list of cult characteristics and finds one that applies and says, “Therefore it is a cult!” And the defender finds a characteristic that does not apply and says, “Therefore it is not a cult!”
You cannot build up an accurate picture of a group’s reasoning dynamic using this kind of essentialism. You’ve got to pay attention to individual characteristics individually.
Furthermore, reversed stupidity is not intelligence. If you’re interested in the central idea, not just the implementation group, then smart ideas can have stupid followers. Lots of New Agers talk about “quantum physics,” but this is no strike against quantum physics.1 Along with binary essentialism goes the idea that if you infer that a group is a “cult,” therefore their beliefs must be false, because false beliefs are characteristic of cults, just like cats have fur. If you’re interested in the idea, then look at the idea, not the people. Cultishness is a characteristic of groups more than hypotheses.
The second error is that when people nervously ask, “This isn’t a cult, is it?” it sounds to me like they’re seeking reassurance of rationality. The notion of a rationalist not getting too attached to their self-image as a rationalist deserves its own essay.2 But even without going into detail, surely one can see that nervously seeking reassurance is not the best frame of mind in which to evaluate questions of rationality. You will not be genuinely curious or think of ways to fulfill your doubts. Instead, you’ll find some online source which says that cults use sleep deprivation to control people, you’ll notice that Your-Favorite-Group doesn’t use sleep deprivation, and you’ll conclude, “It’s not a cult. Whew!” If it doesn’t have fur, it must not be a cat. Very reassuring.
But every cause wants to be a cult, whether the cause itself is wise or foolish. The ingroup-outgroup dichotomy, etc., are part of human nature, not a special curse of mutants. Rationality is the exception, not the rule. You have to put forth a constant effort to maintain rationality against the natural slide into entropy. If you decide, “It’s not a cult!” and sigh with relief, then you will not put forth a continuing effort to push back ordinary tendencies toward cultishness. You’ll decide the cult-essence is absent, and stop pumping against the entropy of the cult attractor.
If you are terribly nervous about cultishness, then you will want to deny any hint of any characteristic that resembles a cult. But any group with a goal seen in a positive light is at risk for the halo effect, and will have to pump against entropy to avoid an affective death spiral. This is true even for ordinary institutions like political parties—people who think that “liberal values” or “conservative values” can cure cancer, etc. It is true for Silicon Valley startups, both failed and successful. It is true of Mac users and of Linux users. The halo effect doesn’t become okay just because everyone does it; if everyone walks off a cliff, you wouldn’t too. The error in reasoning is to be fought, not tolerated. But if you’re too nervous about, “Are you sure this isn’t a cult?” then you will be reluctant to see any sign of cultishness, because that would imply you’re in a cult, and It’s not a cult!! So you won’t see the current battlefields where the ordinary tendencies toward cultishness are creeping forward, or being pushed back.
The third mistake in nervously asking, “This isn’t a cult, is it?” is that, I strongly suspect, the nervousness is there for entirely the wrong reasons.
Why is it that groups which praise their Happy Thing to the stars, encourage members to donate all their money and work in voluntary servitude, and run private compounds in which members are kept tightly secluded, are called “religions” rather than “cults” once they’ve been around for a few hundred years?
Why is it that most of the people who nervously ask of cryonics, “This isn’t a cult, is it?” would not be equally nervous about attending a Republican or Democratic political rally? Ingroup-outgroup dichotomies and happy death spirals can happen in political discussion, in mainstream religions, in sports fandom. If the nervousness came from fear of rationality errors, people would ask, “This isn’t an ingroup-outgroup dichotomy, is it?” about Democratic or Republican political rallies, in just the same fearful tones.
There’s a legitimate reason to be less fearful of Libertarianism than of a flying-saucer cult, because Libertarians don’t have a reputation for employing sleep deprivation to convert people. But cryonicists don’t have a reputation for using sleep deprivation, either. So why be any more worried about having your head frozen after you stop breathing?
I suspect that the nervousness is not the fear of believing falsely, or the fear of physical harm. It is the fear of lonely dissent. The nervous feeling that subjects get in Asch’s conformity experiment, when all the other subjects (actually confederates) say one after another that line C is the same size as line X, and it looks to the subject like line B is the same size as line X. The fear of leaving the pack.
That’s why groups whose beliefs have been around long enough to seem “normal” don’t inspire the same nervousness as “cults,” though some mainstream religions may also take all your money and send you to a monastery. It’s why groups like political parties, which are strongly liable to rationality errors, don’t inspire the same nervousness as “cults.” The word “cult” isn’t being used to symbolize rationality errors; it’s being used as a label for something that seems weird.
Not every change is an improvement, but every improvement is necessarily a change. That which you want to do better, you have no choice but to do differently. Common wisdom does embody a fair amount of, well, actual wisdom; yes, it makes sense to require an extra burden of proof for weirdness. But the nervousness isn’t that kind of deliberate, rational consideration. It’s the fear of believing something that will make your friends look at you really oddly. And so people ask, “This isn’t a cult, is it?” in a tone that they would never use for attending a political rally, or for putting up a gigantic Christmas display.
That’s the part that bugs me.
It’s as if, as soon as you believe anything that your ancestors did not believe, the Cult Fairy comes down from the sky and infuses you with the Essence of Cultness, and the next thing you know, you’re all wearing robes and chanting. As if “weird” beliefs are the direct cause of the problems, never mind the sleep deprivation and beatings. The harm done by cults—the Heaven’s Gate suicide and so on—just goes to show that everyone with an odd belief is crazy; the first and foremost characteristic of “cult members” is that they are Outsiders with Peculiar Ways.
Yes, socially unusual belief puts a group at risk for ingroup-outgroup thinking and evaporative cooling and other problems. But the unusualness is a risk factor, not a disease in itself. Same thing with having a goal that you think is worth accomplishing. Whether or not the belief is true, having a nice goal always puts you at risk of the happy death spiral. But that makes lofty goals a risk factor, not a disease. Some goals are genuinely worth pursuing.3
Problem four: The fear of lonely dissent is something that cults themselves exploit. Being afraid of your friends looking at you disapprovingly is exactly the effect that real cults use to convert and keep members—surrounding converts with wall-to-wall agreement among cult believers.
The fear of strange ideas, the impulse to conformity, has no doubt warned many potential victims away from flying saucer cults. When you’re out, it keeps you out. But when you’re in, it keeps you in. Conformity just glues you to wherever you are, whether that’s a good place or a bad place.
The one wishes there was some way they could be sure that they weren’t in a “cult.” Some definite, crushing rejoinder to people who looked at them funny. Some way they could know once and for all that they were doing the right thing, without these constant doubts. I believe that’s called “need for closure.” And—of course—cults exploit that, too.
Hence the phrase “cultish countercultishness.”
Living with doubt is not a virtue—the purpose of every doubt is to annihilate itself in success or failure, and a doubt that just hangs around accomplishes nothing. But sometimes a doubt does take a while to annihilate itself. Living with a stack of currently unresolved doubts is an unavoidable fact of life for rationalists. Doubt shouldn’t be scary. Otherwise you’re going to have to choose between living one heck of a hunted life, or one heck of a stupid one.
If you really, genuinely can’t figure out whether a group is a “cult,” then you’ll just have to choose under conditions of uncertainty. That’s what decision theory is all about.
Problem five: Lack of strategic thinking.
I know people who are cautious around ideas like intelligence explosion and superintelligent AI, and they’re also cautious around political parties and mainstream religions. Cautious, not nervous or defensive. These people can see at a glance that singularity-ish ideas aren’t currently the nucleus of a full-blown cult with sleep deprivation, etc. But they worry that it will become a cult, because of risk factors like turning the concept of a powerful AI into a Super Happy Agent (an agent defined primarily by agreeing with any nice thing said about it). Just because something isn’t a cult now doesn’t mean it won’t become a cult in the future. Cultishness is an attractor, not an essence.
Does this kind of caution annoy me? Hell no. I spend a lot of time worrying about that scenario myself. I try to place my Go stones in advance to block movement in that direction.4
People who talk about “rationality” also have an added risk factor. Giving people advice about how to think is an inherently dangerous business. But it is a risk factor, not a disease.
Both of my favorite Causes are at risk for cultishness. Yet somehow I get asked, “Are you sure this isn’t a cult?” a lot more often when I talk about powerful AIs than when I talk about probability theory and cognitive science. I don’t know if one risk factor is higher than the other, but I know which one sounds weirder . . .
Problem #6 with asking, “This isn’t a cult, is it?” . . .
Just the question itself places me in a very annoying sort of Catch-22. An actual Evil Guru would surely use the one’s nervousness against them, and design a plausible elaborate argument explaining Why This Is Not A Cult, and the one would be eager to accept it. Sometimes I get the impression that this is what people want me to do! Whenever I try to write about cultishness and how to avoid it, I keep feeling like I’m giving in to that flawed desire—that I am, in the end, providing people with reassurance. Even when I tell people that a constant fight against entropy is required.
It feels like I’m making myself a first dissenter in Asch’s conformity experiment, telling people, “Yes, line X really is the same as line B, it’s okay for you to say so too.” They shouldn’t need to ask! Or, even worse, it feels like I’m presenting an elaborate argument for Why This Is Not A Cult. It’s a wrong question.
Just look at the group’s reasoning processes for yourself, and decide for yourself whether it’s something you want to be part of, once you get rid of the fear of weirdness. It is your own responsibility to stop yourself from thinking cultishly, no matter which group you currently happen to be operating in.
Cults feed on groupthink, nervousness, desire for reassurance. You cannot make nervousness go away by wishing, and false self-confidence is even worse. But so long as someone needs reassurance—even reassurance about being a rationalist—that will always be a flaw in their armor. A skillful swordsman focuses on the target, rather than glancing away to see if anyone might be laughing. When you know what you’re trying to do and why, you’ll know whether you’re getting it done or not, and whether a group is helping you or hindering you.5
1Of course, stupid ideas can also have stupid followers.
2Though see the two cult koans, “Why Truth?” (in Map and Territory), and “The Twelve Virtues of Rationality” (http://www.lesswrong.com/rationality/the-twelve-virtues-of-rationality).
3On the other hand, I see no legitimate reason for sleep deprivation or threatening dissenters with beating, full stop. When a group does this, then whether you call it “cult” or “not-cult,” you have directly answered the pragmatic question of whether to join.
4Hence, for example, the series of essays on cultish failures of reasoning.
5PS: If the one comes to you and says, “Are you sure this isn’t a cult?” don’t try to explain all these concepts in one breath. You’re underestimating inferential distances. The one will say, “Aha, so you’re admitting you’re a cult!” or, “Wait, you’re saying I shouldn’t worry about joining cults?” or, “So . . . the fear of cults is cultish? That sounds awfully cultish to me.”
So the last annoyance factor—#7 if you’re keeping count—is that all of this is such a long story to explain.
You should ask those people what a cult is. They won’t be able to answer, and they may just realize that their question was nonsense to begin with.
Not being able to define a term to someone else’s satisfaction on the spot doesn’t mean you’re talking about nothing rather than something. (It really helps to be able to, of course, but there’s a reason it’s not a convincing argument in practice.)
Still, if they try to define what a cult is (even if they do that later, after the conversation), that alone can help them answer their underlying questions.
Maybe they’re asking so nervously because they were planning to set up a cult around the very same idea?
The Church of Frozen Heads. Come worship the meat popsicle.
This is a truly excellent post, thanks very much. My mind was leaping ahead to the last paragraph but three long before I got there.
You think it could be a cult? Put aside for a moment the question of whether or not their Big Idea is the real thing. Are they acting in a requisitely rational manner? If so, by all means put on a robe. If not, smile and back away.
To that end, does anyone have an example of some followers of an Irrational Big Idea conducting their society/group in a comparatively rational/non-cultish manner (aside from that belief)? If there’s no cult-essence that comes with Faith-In-Big-Idea, there must be a couple of notables. (And no, you get no points for any of: Singularity, Transhumanism, Life Extension, or Head-Freezing!)
I know this one is very old, but it deserves an answer. Yes: phrenology. Some time ago, a bit before Bram Stoker wrote Dracula, some physiologists noticed that regular use of specific parts of the brain leads to a change (swelling, I think) in that part. A dead pianist would have a larger part associated with manual dexterity and rhythm, for example. They also believed this change in the cerebral tissue was enough to affect the skull, so that they could tell a person’s personality/abilities/preferences/etc. by measuring the relative sizes of parts of their skulls.
And they were, as far as I know, very rational about it. Their experiments and bookkeeping are actually examples of excellence in those areas. It’s just that the base theory was rubbish.
N Rays deserve an honorable mention.
Blondlot was very scientific (in appearance), and followed by some scientists (of the same nationality).
Other good candidates today would be: nanotech, the space elevator, anything that sounds too futurist.
Yes it’s going to happen some day, no it won’t be like we imagine.
Eli: great posts, but you are continuously abusing “the one”, “the one”, “the one”. That’s not how the word “one” is used in the way you are trying to use it. Proper usage is “one”, without “the”.
Furthermore, when the pronoun needs to be repeated, the nicer and more traditional usage is “one … one’s … to one”, and not “one … their … to them”.
See Usage Note here.
You’re such a lion against religion; I admire that. So I’m surprised you would say that living with doubt is not a virtue. You know about incommensurability, right? You know about perspectivism? There is no “view from nowhere” that can make perfect objectivity possible.
Therefore: doubt. To live with doubt makes room for learning. Lose doubt and you also lose inquiry. Some doubts are annihilated by inquiry, but as Richard Feynman said, “science is the belief in the ignorance of experts.” He said we need a well-developed theory of ignorance to protect the future from our misconceptions of the present.
Doubt is difficult to live with. I’d love to say with certainty that Christianity is false. I’m constrained to saying that I have no better reason to accept Christianity than to accept the Spaghetti monster theory. The guy who came up with the Spaghetti monster did so as a parody—but maybe the Monster Himself placed the ideas in his head to spread the good word of Spaghetti.
Bayesian rationality doesn’t solve doubt, because nothing tells you how to identify the system and its factors that must be modeled. So, you’re still stuck with having to define your premises, and doubt comes in with the premises.
Doubt is like an anti-oxidant that protects against cultishness. Of course, a cult can use fake doubt to throw people off its scent.
You’re using different definitions for doubt here, and that is the issue. EY uses “doubt” in the sense of a suspicion that not enough knowledge is currently had to evaluate a specific claim, while you are using it as the opposite of “certainty” (though not consistently, somehow). In saying that doubt should not be lived with he was referencing his previously posted explanation of how these specific suspicions by nature are meant to annihilate themselves. Either you find the evidence you thought was missing or you conclude after some searching that finding it would be a waste of energy and make your judgment based on the evidence you already have, and either way, that doubt is gone.
If you still harbor doubts, in his sense, that Christianity may be true, you should search for that missing evidence immediately, or conclude that the effort to find it isn’t worth it and assign the claim the ridiculously small probability it deserves. Notice that I did not say that you should claim with certainty that Christianity is false; predicting anything with true 100% certainty is, for a Bayesian, truly stupid, because on the absurdly small chance that you’re wrong, you lose the game, having just conceded that you assigned your life a likelihood of 0%.
James Bach, if science is the belief in the ignorance of experts, science isn’t a good in itself. If the experts aren’t ignorant, then we don’t need science anymore. If we know all the answers then why in hell do we need to learn?
Learning is good because it destroys doubt, doubt isn’t good because it enables learning. That perspective is incredibly wrongheaded.
Besides being a singulitarian, life-extensionist, transhumanist, and cryonicist, I am a believer in the intrinsic divinity of koala bears.
Not sure why EY redefined the debate in terms of cultishness. Was anyone under the illusion they were being asked to pack their things for Guyana?
Doubts about the objectives of the SI arise more from the seeming contradiction between the professed rationality of its members (Bayesian rationality, weighing the risks, putting all the “Friendly” safeguards in place, etc.) and the passion with which, in their writings, they seem to hail the Singularity and radical life extension like the Second Coming. Which leads one to fear a certain bias. Fear only, mind you. My slovenly and inadequate heuristics don’t push me into a superhuman effort to get involved.
BTW, the very abuse of the term Bayesian, except humorously, is in itself worrying. It’s only a statistical method, for Chrissake. Very useful in well-defined scientific investigation, of no use at all in areas where the priors are (a) innumerable and (b) inestimable, like in all areas of the “humanities.”
BBTW: The word “Singularitarianism.” Any word ending in “-arianism” denotes a belief system, no? So using that word does indicate that its users have gone beyond the domain of ideas and are in the domain of beliefs.
The real fear is not that Singularitarianism is a cult, but that it is pseudo-science (with certain practical consequences), like ESP, Velikovskianism, or certain false nutritional beliefs.
A commonly proposed solution is to look at the evidence with a scientific (Bayesian?) mindset, but most of us are woefully unqualified to judge most scientific fields without an intensive study that we are not about to engage in.
Dilbert: “Dogbert, I don’t understand why you, or anyone else, would become a vegetarian.”
Dogbert: “You mean, why don’t I take dead animals, cook them until they become carcinogenic, then eat them instead of something nutritious? Is that your question?”
Dilbert: “Exactly. Is there any good reason? Have you joined a cult?”
Dogbert: “Apparently.”
Oh, and the TV Tropes Wiki is definitely a cultish Happy Big Idea. They even admit that TV Tropes Will Ruin Your Life. ;)
denis bider, I thought Eliezer’s use of “the one” was a deliberate echo of a rabbinical or Talmudic idiom, though I’m not sure how I got that idea and my google-fu isn’t sufficient to verify or refute it. … Ah, but take a look e.g. at page 8 of this book.
That was a really good post.
However: I suspect people don’t really mean “is this a cult” when they say “is this a cult.” And they don’t mean “please give me reassurance of my own rationality” either.
Rather—and I’m introspecting here, so these intuitions might not generalize—it seems like “is this a cult” means “is this a really tricky system of self-supporting irrational beliefs?” Or at least that the question “is this a cult” could mean that, if we interpreted it charitably.
If that’s correct, it’s not a question about the behavior of the people involved, nor about the presence or absence of certain kinds of biases (directly) but about the way the beliefs interact. For example, one belief that a lot of cults encourage is the belief that outsiders who deny the belief are trying to persecute the cult. That belief obviously lends strength to attempts by humans to hold all the other beliefs, just as the other beliefs (e.g. that the beliefs were given by revelation) lend strength to the attempt to hold the persecution belief.
Just a random speculation I’d like to toss out.
In more condensed language, with a different spin, I think what people are worried about is “If I append this group to my identity, will it cause people to dismiss my thoughts and arguments?”
If you tell most people you’re Christian, it doesn’t cause them to tune you out immediately (except in certain subcultures) because Christianity is an accepted influence on our culture. If you tell them you’re anticipating the Singularity . . . well, all bets are off.
So I suspect most aren’t fearing being wrong, they’re fearing no longer being credible to the people they normally interact with.
“That’s why groups whose beliefs have been around long enough to seem ‘normal’ don’t inspire the same nervousness as ‘cults’, though some mainstream religions may also take all your money and send you to a monastery.”

You can make some inferences about a belief system that has been around for a thousand years, compared with one that was invented last week. At minimum, the former is not likely to kill off a large portion of its believers, whereas something new could easily turn out to be the People’s Temple or Heaven’s Gate. With a time-tested brand name, you can tell in advance what the likely outcomes are going to be (assuming constant conditions, of course). In fact, if your society and all your ancestors managed to survive by holding on to their particular set of beliefs, it might be quite dangerous to depart from those beliefs. This is the conservative rationality-of-irrationality argument, which goes back to Edmund Burke (at least).
I’m not a conservative, but having reached an age where I should be turning into one I can at least appreciate the argument. Fear of the new, strange, and untried seems like a very useful survival heuristic, which is not lightly tossed aside. Yet nobody with a functioning brain in today’s world wants to be a Burkean conservative. So we are all trying on new ideas for size, often by joining up with others who have commitments to these ideas. Naturally the first thing you want to do when faced with this step is to try and figure out the nature of those commitments.
Asking a member of a group if it’s a cult seems a bit weird to me, but maybe it’s a good probe—they could get indignant; they could patiently explain that no, they simply believe in the Truth; they could get surprised if the thought hadn’t occurred to them; or they could laugh and say “well, it has some cultish elements, but...”. I think it’s only the last reaction that would make me comfortable joining up.
I think we can synthesize what Burke was saying into a more comprehensive theory that doesn’t lead us into stasis, simply by saying something like this: “We must come up with new ideas that are better than the old ones—but we must MAKE SURE they are better before we implement them on a large scale.”
ESY plays Go! Cool! Now I’m super-certain that his Words of Wisdom are the Font of All Knowledge.
Sorry, I don’t play Go in the sense of having played enough games to be anything remotely like good; I only play Go in the sense of knowing the rules and having tried my hand at, oh, maybe fifty games total. This is one of the main things I would do more of, if I thought I actually owned my time.
I’ve played a lot of Go. You may be missing out.
Go is a great brain exercise, with lots of cross-domain lessons in it—though most are to do with zero-sum games. On the other hand, as with many good things, it can be kind-of addictive.
Are nations cults? It depends, as with other sorts of organizations, but I think any group which uses “serving” to describe taking a serious risk of death or damage by following orders from the group, while those near the top of the hierarchy get a sweet deal, is looking a little fishy. Likewise, it’s suspicious if the sort of group that makes the most effective claims on people’s primary loyalty, and is the sort of group most likely to get people killed, doesn’t even show up in an intelligent discussion of which conventionally accepted groups might be cults.
Nancy, I’m confused by your sentence about “serving”; are you talking about how both soldiers and politicians are said to “serve”? or are you talking about how people get status points for becoming soldiers? (or police or...)
I think the main use of the word “cult” is something like “illegitimate source of authority.” This explains both why “legitimate” sources of authority are similar and why no one wants to call them cults.
But they’ve got a big supply of legitimacy, so they don’t have to do as much nasty stuff as cults. Yes, nations kill a lot of people, but not that many per member. Joining the military is probably a better idea than joining a cult.
Most people don’t want to be weirdos. They care what other people think, even when they know those other people are wrong. Even the Hermiones of the world. Harry shouldn’t let it get his knickers in a twist.
A funny thing I noticed about myself when reading the last part of this article. When I read this sentence:
I momentarily thought “And rational behaviour does not; therefore, it is not a cult.” Less than a quarter of a second later, I fell back in my chair, holding my head and screaming as if it hurt (I think it really did, but it was just a sort of placebo/nocebo effect). When the not-pain released my head, I realized that I had not understood the point of the post on a gut level, if I’d allowed myself to think like that even for a moment. And if my brain plays tricks like that, which I noticed only because I had read about them very recently, then how can I be sure it won’t play them when I encounter an actual nonobvious cult next year?
Really, you can’t be sure. We run on corrupted hardware.
That said, I find that quasiregularly asking myself why I believe what I believe does help manage the uncertainty.
Five and a half years later… ;-)
I’m an atheist. I received a fairly good training in math and science through the end of high school and am majoring in biology. I spend a lot of time with a Christian group of people near my college campus because I experience my school as incredibly uninterested in building community, and I can’t get most people to talk to me consistently even after approaching them, being friendly, reaching out to them, making efforts to go to activities in which we have common interests, etc. (It’s also hard because I have disabilities of a sort that make socializing somewhat difficult.)
The people in this Christian group are by far the kindest people I’ve met, amongst themselves and to other people. But it is very jarring to end every meeting with “In Jesus’ name I pray,” and it makes me nervous that I might end up sacrificing my rationality if I spend too much time among them. (This is especially the case after, during a really difficult period in my life, I had a month where I believed that the Christian version of God might be real, and didn’t notice anything concerning about it until a chance event cracked the belief slightly. After that, I forced myself through a probing crisis of faith for several hours to remember why that actually doesn’t make sense, given my own beliefs about how the world works.)
This article is really helpful, in terms of outlining what to be aware of, and what might indicate that the group actually has too many cult attractor properties for me to continue with it. I first read it in 2011 as a young adolescent, and I’ll admit, I never suspected back then that it would be relevant to me. (Which raises the question of why I thought it was a good use of time to read through all of it, but eh, the rationality was not yet strong with me.)
Christian groups are usually pretty hit-or-miss. If you tear the religion down, crack open its bones, and scoop out the marrow, you’ll find a lot of the same lessons as are discussed here. It’s old, often obtuse, and it’s obvious that the writers and compilers weren’t sure why it was this way, only that it is. Jordan Peterson, for example, has some excellent dissections of various parts of Christianity and what it tries to achieve as viewed through the lens of modern psychology, and it’s hard to look at any of the pieces and say that they are bad. Because they work.
But a lot of churches don’t do that. They get caught up in the mysticism and never look further. The really bad ones will criticize anyone who even notices the practical, good effects for being “worldly”. Don’t waste your time with those ones. They’re just a mutual admiration society and any actually beneficial effects on their lives are purely incidental.
This seems like a good place to mention the Bonewits scale (devised by a guy named Bonewits, whose name is perhaps too perfect for this) for evaluating the danger level of cultlike groups. It rates an organization against 18 criteria such as “censorship”, “isolation”, and “dropout control”; higher scores indicate a more dangerous group.
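The scoring logic is simple enough to sketch. A minimal illustration, assuming each criterion is rated 1–10 and the ratings are averaged (the three criterion names below come from the list above; the full scale has 18, and the exact aggregation is an assumption here, not a quote from Bonewits):

```python
# Hedged sketch of a Bonewits-style danger score: rate each criterion
# 1-10, then average. Only three of the 18 criteria are shown; the
# averaging step is an illustrative assumption.

CRITERIA = ["censorship", "isolation", "dropout control"]  # ... 18 in the full scale

def danger_score(ratings: dict) -> float:
    """Average the 1-10 criterion ratings; higher means more dangerous."""
    return sum(ratings.values()) / len(ratings)

score = danger_score({"censorship": 2, "isolation": 1, "dropout control": 3})
print(score)  # → 2.0
```

The point of averaging rather than summing would be that a group can’t look safe merely by being unrated on some criteria; either way, the output is a relative danger ranking, not a verdict.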