How to talk rationally about cults
In 1978, several hundred people in Jonestown drank poison and died, because their leader told them to. Not everyone who died there was a volunteer; some members of the group objected, and they were killed first by the more dedicated members, who killed themselves later. Also, many children were killed by their parents. Still, hundreds of people died voluntarily, including the leader of the group, Jim Jones.
This is an extreme case. There are many more groups that create a comparable level of devotion in their members, but use it to extract money and services from them. Groups where new members change their personalities, becoming almost “clones” of their leaders; then they break contact with their families and former friends, usually after trying to recruit them for the group, or at least trying to extract from them as many resources as possible for the group. The new personality typically has black-and-white thinking, responds to questions with thought-terminating clichés, and doesn’t care much about things unrelated to the group. Sometimes the membership in the group is long-term, but commonly the members are worn out and leave after a few years, replaced by the fresh members they helped to recruit; so although the individuals change, the group remains.
Talking about cults is difficult. It is not simple to provide a definition—in different contexts, finding new friends and growing apart from the old ones, spending money on hobbies or causes, and gradual changes in personality could all be part of normal life. Even if all of this happens much faster and more extremely, is it just a question of degree (and how much exactly is too much?), or is there something else?
Of course, a group accused of being a “cult” will object to the label and point out the differences between itself and some archetypal cult it is being compared with; or insist that there are no cults, only people intolerant of different lifestyles. They will emphasize that their members are in the group voluntarily, so no one else has a right to decide what would be better for them. And for every strange norm or behavior, there is always a good explanation (or a denial).
On the other hand, people will use the label to call any group they dislike a “cult”. Some will use it as a synonym for “a religion other than my own” or “people who have a weird hobby”. And of course any objection will be met with: “That’s exactly what a cult would say.” (Which could actually be true, but that does not exclude the possibility that both sides are focusing on the wrong thing.)
In the absence of clearer criteria for what constitutes a “cult”, or more precisely, what kind of personality change caused by group influence is the kind we worry about… what kind of influence can turn normal people into blind servants of the group without any obvious coercion… the debate will be very confused.
Also, for people who have no experience of being a cult member, it may be difficult to imagine what it is like. Many people imagine that being in a cult is a thing that can only happen to other—read: stupid and gullible—people; they underestimate the emotional forces involved. (Ironically, if they later become cult members themselves, the same belief will make them argue that their group is not really a cult.) But even former cult members may have trouble pinpointing the essence of the problem, instead focusing on some accidental attribute of their former group. It is not rare for former cult members to join a different cult later.
Luckily for our debate, there are relatively objective criteria for the cultish form of “mind control”. There are a few “red flags” you can learn to notice and evaluate. The list, called the “eight criteria of thought reform”, was compiled by Robert Jay Lifton, an American psychiatrist who studied the “brainwashing” of prisoners of war and indoctrination in totalitarian regimes.
As a rule of thumb, it is better to ignore the specific beliefs of the group, and focus on its behavior. Debating the beliefs is a red herring. There could be two groups worshiping the same sacred scripture, and yet one of them would exhibit the dramatic changes in its members, while the other would be just another mainstream faith with boring compartmentalizing believers; so the difference is clearly not the scripture itself. There could also be two groups with nominally different beliefs, but otherwise strikingly similar behavior. The beliefs don’t even have to be religious per se; the group can also believe in a political reform, or in economic success through the latest pyramid scheme.
It is best to pretend to be an alien zoologist studying the species Homo sapiens, unconcerned with object-level human beliefs. Notice when the members wear the same dress, but ignore what specific dress it is. Notice when the members are required to repeat specific words every day, publicly or privately, but ignore what those words mean. Notice when the members consistently behave remarkably friendly to a new recruit, and avoid contact with former members, but ignore their explanations of why exactly they are doing that. Et cetera. Observe what they are doing, not what they believe they are doing.
Sometimes a group matches a given criterion partially. Sometimes the group matches some of the criteria, but not all of them. There is no exact line unambiguously separating “cults” from “not cults”. But that doesn’t make the concept worthless. There are groups that exhibit the typical cult dynamic, with all the toxic aspects reinforcing each other. There are also groups that have some unhealthy behavior, but it doesn’t create the full pattern. Even the same group can change over time. To evaluate a given criterion, it is good to have an idea of how far the scale goes in a typical cult (to avoid the “medical student syndrome”, where people find every symptom in everything). Probably the best way to get that idea is to read a few biographies of former cult members.
So, here are the criteria:
Living in a controlled environment
There are rules for how you should spend your time. Some groups insist on wearing a uniform; some groups want you to spend the whole day in the presence of at least one other group member, to make sure you don’t privately break the rules. Sometimes there is a lot of work to keep you busy—whether real work, or just repeating a mantra a thousand times a day—so you don’t have free time to reflect on the group and your membership. (People sometimes spend months or years in a cult without actually making a conscious decision to join it permanently; they just decide to hang out with the group for a while to learn more about it, and receive a ton of “urgent” work with a promise that all their questions will be answered later, when there is more time. Somehow, that never happens, and only after a few months or years do these people realize they were duped and leave.)
More important than controlling the physical space is controlling the mental space. All such groups agree that you should avoid reading materials critical of the group, and talking with people critical of the group (unless you are trying to convert them; or unless you are exposed to them strategically, because they happen to be stupid critics, so the real goal of the exposure is to teach you that this is what all critics of the group look like). Most of all, you must avoid talking with former members (they know too much).
Of course, this is all for your own good. People opposing the group are the worst people of all; it is rational to avoid spending your precious time with them. They are wicked heretics, slaves to sin. They are racists, sexists, capitalists, or communists; the evil ones who keep ruining our society. They are brainwashed sheep who are too scared to think outside the box, join our latest pyramid scheme, and make millions. The former members were too sinful or weak to stay in the group; you had better avoid them, because the weakness might be contagious; and don’t believe anything they say!
This also applies to your internal environment. If there is a voice in your head telling you to slow down, to take an outside view, to consider the alternatives… you are supposed to fight against that voice, not listen to it.
Sometimes the protection against outside information happens on two levels. There is information that the group will officially admit they are trying to protect you from, such as pornography or harassment. They may offer to install a group-sanctioned web filter, or otherwise let you outsource the information filtering to them. If you let them, you usually get more than you signed up for. For example, the anti-pornography or anti-harassment filter will also filter out information and people critical of the group, without telling you it did so. (Now good luck trying to convince your employer or university to remove those filters again.)
Guided by an intelligence higher than you
There is something special about this group and its leadership, which makes them unique in the universe and in history. If you tried to make another group like this, it just wouldn’t be the same. You couldn’t even change it significantly, because it was designed exactly the way it needs to be.
If this is a religious group, the answer is obvious: it is the supernatural power on your side that makes the difference. But even nominally non-religious groups often believe in things that are supernatural in anything but name: the inevitable flow of history, the spirit of the time. The non-religious leaders may nominally be mere mortals like you, but they are mortals endowed with a unique understanding of the forces of the universe, which makes them incomparable to you; you couldn’t replicate (or surpass) their wisdom no matter how much you studied. Similarly, you couldn’t devise your own pyramid scheme, or invent a better product to sell. It is insane to even think about such things.
When your group is guided by a higher intelligence, the only rational thing to do is to obey it without question. Even the weird things that happen (for example, one day your leaders predict the end of the world, the next day they deny ever doing so) happen for a purpose; you are just too stupid to understand it.
What do we want? Perfection! When do we want it? Now!
Perfection is not only achievable right now; you are also a horrible person for not having achieved it already. What stops you? Everything is either completely good or completely bad; just throw the bad things away! Unless you secretly prefer to be evil.
(Of course, deep inside you will always be aware of your own imperfection, as measured by the group’s standards. Just like every other member of the group. But that’s not a bug, that’s a feature! Feeling guilty will make you work harder.)
The same black-and-white standard is also applied to everything outside the group, usually by finding a flaw and concluding that the flawed thing is too horrible to interact with. This creates a perfect excuse to avoid anything or anyone inconvenient for the group leadership. It also guarantees a steady flow of perceived enemies, because no microaggression is too small to make a mountain of.
Your mind belongs to the group
There is no privacy even inside your head. You are supposed to confess your sins to the group. (Note: There is always something to confess. Believing you have nothing to confess is itself a great sin.) Confession is best done publicly, in front of the whole group. If you don’t volunteer enough sins, the group is supposed to call you out. Snitching on each other is a valuable spiritual service to your fellow members. The most powerful confessions are the ones that leave the victim broken and in tears. Then the victim swears to sin no more, and the group provides absolution. This creates a strong emotional bond between members.
A sacred science
The teachings of the group come with the authority of science. Except this is a special kind of science, one that does not allow doubts. The truth is not determined by experimentally comparing competing hypotheses; it is revealed. The leaders of the group got everything right on the first try. Their work is perfect. There are no alternatives to consider. Skepticism is evil.
Redefining the language
You can either invent new words, or redefine the meaning of the existing ones. This serves multiple purposes. On the outside, it creates yet another barrier between members and non-members. It is impossible to get a second opinion on group teachings, if non-members don’t even understand what those words mean. (And if they don’t understand, that obviously makes them less smart, right? More words mean more wisdom.) Every moment a member spends with non-members, these artificial barriers in communication keep reminding them they are in the outside world. The group becomes the only place where some concepts can be discussed meaningfully.
Redefining the language also brings specific advantages in teaching the group beliefs. You don’t really have to prove anything if you simply make it a part of the definition. For example, if you redefine “heresy” to mean “irrational disagreement with the beliefs of our group”, you don’t have to prove that people who disagree with your beliefs are irrational. It’s right there, in the definition! Or you can redefine words “X” and “Y” to mean “X, but only when our members do it” and “Y, but only when non-members do it”, and suddenly it is obvious that when both members and non-members seemingly do the same thing, it’s actually not the same thing.
Short, ready-made phrases can be used to stop thinking about complex topics (the thought-terminating cliché).
Map over the territory
The group doctrine says something can’t happen. But it happened to you, maybe years ago, maybe right now. Guess what: You are wrong! It didn’t happen. And you better change your memories; either find an interpretation that fits the group teachings, or just deny the whole thing. Because the group is always right.
Often, among the memories that need to go is anything good that happened to you before you joined the group. The official story usually requires you to tell (and believe) that you were a horrible and worthless person before you saw the light.
If it’s not here, it’s not real
The worlds inside the group and outside the group can’t really be compared, because one of them is not even real. People inside the group are saved. People outside the group are doomed, and they deserve it; they are evil and weak. The people outside are only valuable as potential future members.
This of course creates the threat that if you ever lost your group membership, you would stop existing, too. Not literally disappear, just stop existing as someone who matters.
The important thing here is to realize that these eight points aren’t just eight arbitrarily selected things, but rather eight aspects of the same complex mechanism that hacks the human brain: it makes the brain value the group highly, makes one feel dependent on the group, and builds a barrier of fear against leaving the group. They often reinforce each other: The argument from higher intelligence can be used to explain why the rules exist; the demand for perfect purity includes requiring perfect obedience to the rules; the group confession is there to detect transgressions; and the controlled environment even removes the opportunity for transgressions. Similarly, black-and-white thinking classifies your former friends as evil; the group confession (where you must report the “sinful” topics you talked about, such as your former hobbies, or how the group membership keeps changing you for the worse) makes meeting them more costly; the new language makes communication more awkward; rewriting your memories requires you to give up the good memories you experienced with them; and pretending that the world outside the group is not fully real lets you rationalize that nothing of value was lost.
When people without personal experience talk about cult membership, they sometimes describe it as a rational preference for things the group offers its members, such as the social environment or a feeling of meaning. While this contains a grain of truth (a group offering its members literally nothing probably wouldn’t be popular), focusing on the value provided by the group ignores the more important part of the picture: how the members are strategically deprived of potential value coming from all alternative sources, so that the group becomes a monopoly on value, threatening to take it all away as a punishment for disobedience. Of course, surrendering to the group’s demands only makes one more vulnerable in the long term.
How do such well-designed abusive groups appear in the first place? I believe part of the answer is selection bias: abusive groups with worse designs remain small or gradually disappear. Another part is that some of these techniques come naturally to abusers (such as isolating the victim from their family and friends, or accusing the victim of a hundred made-up “crimes” so the debate shifts to what the victim did wrong), and the only technical problem is building a group structure that makes the other members play along with the abuse (especially considering that they will receive the same treatment themselves). And yet another part of the explanation is that successful groups copy these “best practices” from each other, because they see greater group cohesion as a worthy subgoal. (Here is a nice video.)
So, what can one do to avoid a similar outcome? Some people suggest full reverse stupidity: a sufficiently safe group should consider itself and its members completely worthless, should avoid having any specific opinion on anything, and preferably shouldn’t even exist. Anything else would be the first step towards the inevitable horrible outcome. But looking at the eight criteria, I believe there are ways to have a functional, yet non-abusive group, if we simply avoid scoring too high on any of these points.
A surprisingly large part of this could be classified as some aspect of “free speech”—being able to talk to anyone (even critics of the group, and former members), about anything (even doubting the group’s beliefs, debating personal experience that contradicts those beliefs, expressing doubt about the ways the supposed higher intelligence guides the group, and explaining group jargon to outsiders), and keeping the content of such debates to yourself. The rest could be described as quantitative thinking (accepting that some things can be better or worse than others, without that inevitably making them perfect or horrible), and cooperating with people outside the group. As a rule of thumb, a group that mocks the idea of free speech is an abusive group; some of the things you are not supposed to discuss are about how they treat people, including their own members. On the other hand, a problem you can talk about is a problem you can try to solve.
From Funereal-disease on tumblr, in a previous discussion: It is usually better to talk about “spiritual abuse” rather than “being a cult”. It emphasizes that the techniques of successful cults are the techniques of successful abusers, and is better at being something that happens to a greater or lesser degree; “cult” is more binary.
I might prefer “social abuse” or “community abuse” to make clear that non-religious forms are possible.
I agree. This kind of abuse is a matter of degree, and it also exists outside of religious communities. You can have a political cult that is explicitly atheist, or an economic cult that is nominally about making money (i.e. an anti-religious or an a-religious group), and it would still follow pretty much the same template.
I don’t want to debate the labels here. (Not because I deny the importance of good labels, but because such a debate could take us far from the original topic, e.g. into discussing the trade-offs between labels that fit better but have to be explained to everyone vs. labels that only point approximately in the right direction but are quickly recognized, etc.) But I’d like to mention that Robert Jay Lifton, whose model I used here, calls it “thought reform”.
Previously on Less Wrong: gwern’s “Notes on Brainwashing & ‘Cults’”. The summary in his own words:
Thanks for that link! I missed that one, and it is pretty good.
I think Nancy’s response is the one I personally resonate with.
This point is more about framing, and it connects, for me, to covid and so on. The central issue in both cases seems to be that some people consider “the herd” to be the only object of value and the only way to really justify “policies or norms or advice”.
By contrast, I think a herd that doesn’t serve its members in a clean and honest way is simply not a good herd. Either a “functioning herd” unpacks logically into a large number of small observable countable “happy lives full of happy events” or else the herd is shit. Make the lives good, and the herd is good. Done.
I. For Example: Masks Working Locally But Not Globally
Do “masks” work for covid?
On one level, the level of keeping viruses away from “one’s own” personal airholes… masks just OBVIOUSLY work to keep your body momentarily safe from being infiltrated by viral particles larger than the very tiny holes in the mask that let through individual O2 and N2 molecules (but not larger things like sooty carbon-based smoke particles (which are themselves smaller than even the bigger viral particles)).
On a bus full of coughing people: properly manufactured and tested and worn masks are obviously selfishly useful (and make a positive contribution in a suite of similar precautions).
Now suppose someone only cares about the herd, and their only possible intervention is to go on TV and say “Yay for masks, m’kay?”
Then we look at whether the words being said on TV cause covid to not be endemic… and we expect a negative result. So “saying yay” here… might not be a cure-all for society. Why will it not work? Many reasons.
Some people don’t even have a TV to hear the message, and some people rightly distrust most things the TV tells them (it is full of lies and ads and partisan bickering and infomercials, after all), and some people just don’t believe in the germ theory of disease (weirdly, many microbiologists don’t believe it in practice), and some people never get to Piaget Level 4, where they could “understand and believe ANY theory” in a meaningful way that links abstractions to concrete details in a rigorous fashion.
So there will be failures of adherence to adequate protective practices in “parts of the herd of people”… and then the thing is infectious, so it races through some of the herd, but maybe not all of it, focusing mostly on germophobically incapable folk.
Then it could persist in cliques of people that eventually contain a member whose carefulness is not completely adequate… and then hop from clique to clique forever, with spikes. And in general the herd will NOT be protected merely because the statements on TV were uttered.
On this reasoning, “masks don’t work” is actually mechanistically sensible… if you think of sociological predictions as a mechanistic practice.
(To actually eradicate covid would take roughly two interventions: (1) regular mass testing and (2) involuntary quarantine of the infectious. Then maintenance involves testing and quarantine at ports. Masks aren’t on this list. Neither are vaccines. Those two are just ways to selfishly cope with covid in an endemically infected region of the world whose government can’t do the two things that actually would work.)
II. Totalizing Social Groups Devouring Individuals But Not The World
I think something similar to the covid model is likely true with high-demand (totalitarian? totalizing? monopolizing?) social groups that mostly recruit a small percent of the mentally vulnerable and use them up after two to five years.
These high-demand groups are not dangerous to the herd necessarily.
If herd level interests and actions are your only lens for thinking about goodness and badness, then gwern’s position is that “cults are not bad” (and I hear this with the coda ”...in the same way that masks are not good”).
But in the meantime, cults are probably not safe to join for actual people whose lives are individually worthwhile, and who deserve the modicum of practical caring concern that each individual human naturally deserves just for being a human in a healthy and functional society (which ours is maybe not (because many don’t get such treatment)).
If the cult said on its tin what it does, and then did what it said on the tin, would the people have joined? Probably not.
So what happened might not have been a LIE, but it was literally “mis-leading”. The people were led… BADLY. They trusted someone to guide their behavior… and then regretted it.
III. Overreacting To Extremes When Normal Badness Is Common
A deeper thing: I’ve been told by two different CEOs of two different startups I worked at, in speeches they gave to the entire company, that working at their startup was “the most important thing that any of us will ever be involved in”.
At least one of them was wrong, but probably both of them were wrong, and this kind of thing seems to be part and parcel of very much of modern American culture… which just happens to be rife with insanity and lies.
When I heard this line, both times (but more the second time) I just rolled my eyes and remembered to cash my paycheck.
In retrospect I feel that I learned lots of TECHNICAL skills on the job that I wanted to learn (probably because wiser people than I was at that time all passed on these jobs), but in addition to this I learned a lot about business functioning and dysfunction by direct observation (hanging out just below and outside the C-suite, working as a data scientist)...
...but I imagine that a lot of people (perhaps less prone to going meta and making theories) do see glimmers of these same general kinds of “crazy CEO deals” in their own lives, and vaguely understand how broken a lot of things are in our post-post-modern “normally accepted everyday life”...
...and then they react to slightly more extreme things than what they see in their daily experience (which has only “normal amounts of brokenness” in it) by over-reacting to the specific extreme instance… which seems like maybe a sort of way to try to cope with the totality of an imperfect global experience?
Maybe overreacting to new vivid instances of a bad trend is useful sometimes? Who knows! Not me. Not for sure ;-)
What is the word “rationally” doing in the title? What does it add to “How to talk about cults”?
If it adds nothing, perhaps it would be better gone.
And what is “talk” doing in the title? The article is mainly about cults on the object level. It is a little bit about the difficulties of language and categories, but that applies just as much to thought as to speech. And a little bit of the article is about the adversarial nature of some conversations, which is what I expected from the title. This relates back to the conflicting uses of “rational” — belief and action.
I guess a better title would be something like: “Towards a gears model of ‘cults’”.
The idea is that when people talk about ‘cults’, there is a legitimate substance they want to address (some kinds of abuse happen in some groups, in ways similar enough to suggest that there is an important cluster in the thingspace that deserves our attention), but they typically pick some accidental attribute and blame it for the whole outcome.
For example, people can use “they talk about religion/spirituality (outside of the mainstream religious framework)” or simply “they are weird, and they meet in private” as a predictor for that kind of abuse. Which is wrong in two ways:
First, various non-abusive groups get accused of being “cults” (which connotationally means: “if they are not abusing their members already, it is just a question of time”) merely for being weird, meeting in private, and talking about something outside the mainstream; such as Dungeons and Dragons players. That creates additional social pressure against doing anything out of the ordinary. Also, see the paranoid reactions to the proposals of the Dragon Army project, or even the Solstice celebrations.
Second, various truly abusive groups can use the wrong model to deflect legitimate suspicion. If they are not religious—or if they can relatively plausibly deny being religious—they can say “we are not religious, therefore by definition we can’t be a cult” (and then the attention turns to debating definitions, which is useful to them, because it turns attention away from the actual evidence of abuse), or they can spend some money on PR to stop being “weird”. Or they can point to the former category and say: “some people call Dungeons and Dragons a cult, but that is crazy; people also call Scientology a cult, and obviously, that is the same kind of crazy.”
So why do people use the wrong models? Because they don’t have a better model, duh. (Most people don’t have an experience of being a cult leader, or talking with former members of various different cults, to get the full picture of how it works when you abstract away all the accidental details.) More meta, people usually don’t understand the difference between better and worse models. Is “cult” a mysterious answer, or is there an actual mechanism proposed?
This article is an attempt to describe the mechanism. To answer your question, the mechanism itself, that’s the “mainly about cults on the object level”. But the idea of using a mechanism at all, that is the “rational” part of talking about the cults.
So call it “what behaviour people worry about when they are worried about cults”.
Since downvoting is disabled, I’ll criticize you instead.
You’re presenting the classic anti-cult narrative that has been repeated since the eighties and that is available on the web in thousands of places. In fact, I would not be surprised if it turned out you copied and pasted much of this. It has no obvious relevance to LessWrong, and your attempt to restate this outdated narrative in LW lingo does not change that.
A few more substantial criticisms: Jonestown, your only actual example, has always been the extreme exception (in modern times), the 9/11 of cults. There are a few other, much smaller examples of cult violence, but most cults are very different from that and much less extreme than you describe. They are really mostly a waste of time that people stay in because of the sunk cost fallacy. Since the narrative you copied was created, the number of cults has gone down noticeably and their members’ average age has gone up. The ones that remain perpetuate themselves mostly by having children, rather than by “brainwashing” new members, much like other religions do. And leaving is generally easy, except if you have other family members inside.
This sentiment indicates to me that LW needs a bit of a culture change. It’s a decent article, and I feel informed by having read it. Probably not perfect, and not a Yvain-level insight about the world. But why do you want to downvote it? Couldn’t you just not upvote it?
I want to downvote it because it lazily rehashes outdated clichés.
This type of description of “cults” has always had a bunch of problems. Let’s be generous and disregard the “cult” label (although it is entirely discredited in the scientific study of what is now referred to as New Religious Movements), because we can replace it with some other word. Still, this does not look at actual existing cults at all. People’s Temple self-destructed almost 40 years ago. There are thousands of other cults (tens of thousands if you include Asia), and this description disregards all of them. It has no basis in data whatsoever.
What it has is a “checklist” of criteria that are very fuzzy and offer no clarity on what is or isn’t a cult. All these do is provide a lot of threatening language to reinforce the idea that cults are dangerous. Which is not a proven fact. There’s solid evidence that certain specific groups have certain specific dangers—Scientology is the big one. But “cultishness” in general, i.e. basically religiosity with heightened tribalism, is not established to be dangerous. [Edit: Not established to be more dangerous than mainstream religion.] And this type of “cult checklist” narrative distracts from this simple fact by piling vague threatening assertions onto vague threatening assertions.
I would downvote this anywhere, but on LW, where we’re supposed to think critically, check our sources and believe only what we have good reason to believe, it seems particularly inappropriate.
I agree with your definition of “cultishness” as “religiosity with heightened tribalism.” I think it is very, very obvious that this is more dangerous than mainstream religion and not something that needs some special method to “establish.”
Well, that depends what you mean by “mainstream religion” then, doesn’t it? I mean, obviously Taoism, Buddhism (most varieties thereof, at least) and even Sufi Islam are not particularly dangerous, but some mainstream religions are in fact intensely tribal.
Many propaganda pieces make a person feel informed by reading them.
They write propaganda, you spread awareness, I fact-check. Is it possible to rigorously define the difference between these, or do they mainly vary by connotation? If the latter, perhaps it’d be better to stick to labels like “true” and “false”.
There’s writing that makes a person feel informed after reading it by giving them easy answers to complex questions, and there’s writing that tries to communicate complex facts about reality. Both can be right or wrong.
The standard of “feeling informed” is a bad one for judging the quality of a political argument. Plenty of people feel informed after watching Zeitgeist.
You can make people feel informed by telling them it’s all due to the Jews, but the created feeling in no way justifies the political speech that’s used for persecution. And the political narrative in the OP is presently used in France for the persecution of organisations like the Landmark Forum.
It’s possible to fool people’s sense of “feeling informed”.
For instance, LSD seems to often induce a sense of insight and significance … including sometimes attributing cosmic meaning to the patterns perceived in the pebbles in a concrete wall.
Or, for that matter, as some of the psychological studies described in Cialdini’s Influence or Kahneman’s Thinking, Fast and Slow appear to have failed to replicate, what is there to say about the sense of feeling informed that accrued to many of us who took them to be insightful?
I made a summary of what is written out there; I didn’t try to invent anything new, other than using words that felt more natural to me. Maybe it’s too obvious and well-known, but I don’t remember it being mentioned here in the past, when it was debated whether LW or the solstice celebration or whatever was cultish.
I know a few people who got involved with cults after their boss converted to a cult, and they then received the option to either convert too, or lose the job. That happened about a decade ago.
These days, I believe political cults are the most popular among young people. SJWs have all the red flags, including the disfellowshipping of former friends who disagree with their sacred beliefs. I believe it is useful to remind people that what they see is actually just another instance of an old pattern.
Maybe in the West. But Da’ish and other varieties of militant Islam are basically doomsday cults, and have all the usual marks of same. Note that there are in fact flavors of “strict religion”, even in Islam (consider the “quietist sects”), that are not nearly as dangerous in practice, either to the individual or to the surrounding community—and the “red flags” seem to make all the difference there. A “quietist” Muslim might know that he’s supposed to “pray five times a day every day, no matter what you were doing at that time” and “shun the infidels” at some level, but he won’t take these things nearly as seriously as someone who’s actually dangerous—the “cult” is not totalizing for him and real-world concerns will obviously take over at some point.
Heh. When chaos said “the 9/11 of cults” I thought “wait, wasn’t 9/11 the 9/11 of cults?”
Is your objection really that the topic has no relevance to LW, or that it has no relevance because the information is found in so many other places?
I appreciate summaries on LW even if they can be found elsewhere, because they provide for comments and discussion from a very particular group whose input I prioritize (over that of other internet strangers). I often do a quick search on LW for new ideas I am exposed to, to get the LW spin. Say you just discovered this forum and you decided you like how everyone aspires to be a rationalist, but you have gaps in your knowledge about cults; this article might be far more informational than what you can find in a Google search. A Google search on cults leads to lots of websites on Christian apologetics, not exactly the places I would encourage people to go to find truth. The information can be found in thousands of places, but the places matter: a rationality-oriented forum vs. a website whose motives you are not quite sure of.
That’s exactly my point. The information posted here is a reformulation of exactly the type of material at Christian apologetics sites. It does not deserve to be in a place where you would encourage people to go to find truth.
I don’t read Christian apologetics sites per se, but I have read some cult-related materials published by Christian organizations, and the models they produce are quite different.
Instead of focusing on behavior, their explanations are theological. Their model seems to be: “These people worship a wrong god, or worship the right god in a wrong way, and that causes the abusive behavior. To avoid abuse, stay within our officially approved religious organizations.” Even the economic cults are shoehorned into this model, by saying they “worship money” and then explaining why that is a sin.
Sometimes, however, those Christian organizations also quote a behavioral explanation. But if that quote is followed by their own words, they usually put it in the “proper” context: that all that behavior is a consequence of choosing the wrong theology.
tl;dr—it’s not me quoting them, it’s both of us quoting the same sources; their model is actually different
I understand your criticism much better now.
I appreciate it when people take the time to read and summarize material for me.
To me this article doesn’t feel like an attempt to summarize the views of different authors. It only refers to the criteria put forward by one author (Robert Jay Lifton).
The last good article on this topic on LW was written by Gwern. In it he writes:
Viliam presents the old narrative as the rational way to think about the subject, even though the prior art on LW is that the beliefs associated with the narrative have been debunked by studies. Viliam doesn’t attempt to reference any empirical evidence for why we should accept the narrative, but simply presents it uncritically.
I’m not sure how Gwern’s article is supposed to be taken as a criticism of this post. Its main thrust seems to be that the bulk of “New Religious Movements” do not actually share the “red flags” and “brainwashing” that OP is discussing here—that these are mostly “outdated clichés” as someone else said. And this may well be right, but when we do see groups that clearly use these mechanisms in the real world, it seems quite justified to regard these as exploitative, even when most “new religions” are not.
What LW lingo did he use? I didn’t see it.
Also, I know at least one person who wasn’t born when the Jonestown cult panic ended and got into (and thankfully out of) a cult very much like the one described.
Things I am aware of:
In the original it’s called “mystical manipulation”, but that didn’t feel obvious enough to me, especially if we talk about non-religious groups. Members of those groups would probably object that there is nothing “mystical” involved, only someone being super smart. I tried to provide a description that a member of the group would have a chance to recognize.
In the original it’s “doctrine over person”, but the meaning is similar; or maybe “group map over individual map” would be better in some situations.
Viliam is a long-time member, which makes it odd to me that he would bring up such a topic.
This is a well-presented article, and even though most (or maybe all) of the information is easily available elsewhere, it is a well-written summary. It also includes aspects which are not talked about much, or which are often misunderstood. Especially the following one:
Indeed, the beliefs are not even close to being among the most important aspects of a cult. A cult is not merely a group which believes in something you personally find ridiculous. A cult can even have a stated core belief which is objectively true, or is a universally accepted good thing, like protecting the environment or world peace.
I think a more interesting question that comes from this is when we take it to a general level. How do we treat situations where we are certain that individuals are engaging in practices that are harmful to themselves? When do we think it’s okay to use whatever individual power we have to make them change their behavior or beliefs? When do we think it’s okay for the law to force people to do things “for their own good”?
Some examples:
-A friend is joining Scientology and is about to give them all their savings
-A 14-year-old daughter is dating a 45-year-old man, and is certain they are “in love”.
-An acquaintance doesn’t want to use modern medicine. They also won’t let their kids use modern medicine.
-An adult relative who is somehow impaired (dementia, mental retardation, drug addiction) is being taken advantage of with their own consent, such that it is
Current laws are set up such that things like “engaging in sex work” or “smoking pot” are considered so harmful to the self that the police are allowed to arrest you for them. But denying your children modern medicine is fine, and giving all your money to a sex cult leader in the woods is also fine.
Meta: I feel like I am not framing this question well at all, and if anyone wants to reframe in a more elucidating manner that would be helpful.
Is the “denying medicine” thing actually legal? I thought it was a big deal that it wasn’t and Christian Scientists complained about it? (don’t have strong memories either way)
I have a fairly compact handle that I like which is ‘does the group encourage you to replace your internal compass with theirs or to sharpen your own?’
Caveat: in some cases you may not be able to tell the difference, due to more subtle considerations, but this screens off most of the harm.
At first I wanted to say that cults consider everything originating outside the group to be wrong (your original ideas, but also ideas you simply learned elsewhere), but then I remembered there are a few exceptions: popular things that feel vaguely pro-group. For example, many financial cults recommend that their members read Kiyosaki, not because he is connected to them in any way, but because he provides motivation without any specific advice, so the group can interpret him as recommending whatever the group is trying to sell you. Similarly: Who Moved My Cheese?
But yes, a healthy group doesn’t treat knowledge coming from outside as a threat.
I agree with chaosmage: you repeat an existing narrative about cults. I don’t think asking the binary question “Is this a cult? Yes or no” is a rational way to talk about cults. Your article is also ironic in how it advocates black-and-white criteria for cults and then accuses cults of using black-and-white criteria of good and evil.
If I look at an organisation like Leverage research I don’t think your list helps me to have an intelligent discussion about the way Leverage research works. I also think that it doesn’t help me with evaluating the group dynamics of a particular startup.
How do you know that this is common in organisations you would classify as a cult?
Which cult currently does this? Do you know of any? Have you searched for any empiric evidence before making a claim like this?
It’s also worth noting the context in which the word “cult” is often used. There are political reasons to judge people negatively who don’t give their prime loyalty to the nation state and its institutions.
People who live in an Ashram or in the Leverage group house are likely to feel different loyalties than the average citizen of the nation. As such it’s useful to claim that the inhabitants of either group aren’t allowed to think freely and have their freedoms curtailed by being forced to volunteer private information.
Just like the term terrorist can be used to label a large amount of people negatively, the term cult is also used as a political weapon. It’s the way people get punished in France for having heretical beliefs.
You both say this like it is a bad thing. Someone else said this before, I repeated it… therefore it is wrong.
If it is wrong, it is wrong regardless of whether it is an “existing narrative”. If it is right, it is right regardless of whether it is an “existing narrative”. So please let’s focus on whether this is right or wrong.
If it helps, I totally plead guilty to repeating things someone else already said. I completely admit that I didn’t invent a new definition of a cult. For a moment, I was deluded into thinking that “scholarship” was a virtue. Now that I have admitted that I am repeating existing information, could we please stop talking about whether I am repeating existing information, and focus on the information itself?
Seems like you might prefer an article that would taboo the word “cult”, and replace it with multiple criteria that can be each independently evaluated on a scale. That way we could conclude that e.g. some organizations only have a subset of the traits typically associated with cults.
What prevents you from asking questions like: “Are people in startup X discouraged from interacting with people outside the startup in their free time?” “Are people in startup X criticized daily in front of their colleagues for mistakes that are more or less unavoidable (which means that everyone gets criticized every day)?” “Are people in startup X told that their lives before they joined the startup were completely worthless, and that outside the startup there is no hope for them?” etc. A rough sketch of this kind of per-criterion evaluation follows below.
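To make the “taboo the word and evaluate the criteria independently” idea concrete, here is a minimal sketch in Python. The criterion names are paraphrased from the section headings above; the 0–10 scale, the describe function, and the example scores are my own illustrative assumptions, not part of Lifton’s model.

# Evaluate a group on independent scales instead of asking the
# binary question "is this a cult?". Everything here is a sketch;
# the 0-10 scale and the example scores are illustrative assumptions.

CRITERIA = [
    "controlled environment",
    "guided by a higher intelligence",
    "demand for perfect purity",
    "group confession",
    "sacred science",
    "redefined language",
    "doctrine over person",
    "outside world not real",
]

def describe(scores):
    # Report each criterion separately; deliberately no total score
    # and no binary "cult / not cult" verdict.
    for criterion in CRITERIA:
        value = scores.get(criterion, 0)  # 0 = absent, 10 = typical cult
        print(f"{criterion}: {value}/10")

# Example: a hypothetical startup that scores high on a few criteria
# but low on the rest. The useful output is the profile, not a label.
describe({
    "controlled environment": 6,
    "demand for perfect purity": 7,
    "redefined language": 3,
})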
I don’t know if it is used currently, but Scieno Sitter has a Wikipedia page.
I don’t give a fuck about nation states, so I don’t see how this is relevant. My issue with cults is that they disconnect people from their families and former friends.
No, the problem is that you repeat something that’s wrong (the old, outdated narratives of the 1980s). And something that gets used for persecution.
It’s quite easy to say that having group housing is culty, but that doesn’t matter. What matters is the effects. If you move out of your home town to join the soon-to-be-founded Accelerator Project, then you reduce personal bonds and form new bonds with like-minded people. I don’t think that’s a bad thing, and I think it’s worthwhile if that project goes forward.
I also think that it’s worthwhile for it to have a culture where many private thoughts are spoken.
A discussion that’s productive for designing a project like Eric’s is much more nuanced about the effects than “cults=bad because Boogeyman”.
You didn’t do scholarship.
You didn’t research whether what you are saying is true. This is like someone claiming they did scholarship because they read somewhere that the Jews are evil and then brought that claim forward. Maybe they even read a few books about how Jews are evil.
It’s worth noting that Chaosmage (who first voiced the criticism) has an academic background in this area.
Gwern did research in this area and I linked his post here. It suggests that the core narrative is based on many claims that have been shown to be incorrect.
The fact that you uncritically pass on a narrative that’s used for persecution.
How do you know that’s what cults do? Do you have any more evidence than Trump has when he says that rapists come from Mexico?
From a political perspective it’s possible to call any development of a strong loyalty to something besides the existing family and the existing institutions “disconnecting with friends and family”.
Take a person who went to the Landmark Forum: they leave it with the task of apologizing to their family members in order to develop good relationships with them. On the other hand, they also might alienate their family members by speaking in Landmark vocabulary.
JP Sears has a good discussion of a question from a viewer who complains that his family was brainwashed by Landmark (https://www.youtube.com/watch?v=1OCxGlJ5mzE).
Landmark is a good example, because it’s relatively well known. I have no personal stakes in the particular group.
Any person who decides to leave their hometown and go to Silicon Valley is likely to reduce their existing bonds to family and friends. Simply by virtue of not being present the person is going to have less loyalty to the existing circle of people.
A Wikipedia page that suggests this was tried in the days of Windows 95.
If that were true, I don’t think you would be living where you are living. You have enough loyalty to the nation where you live not to leave it for a city where you would be paid a much better wage as a programmer.
You might not have an explicit belief of “my nation is important” but when I look at your revealed preferences, they suggest valuing it.
I thought that replacing the black-box word “cult” with eight specific behaviors counts as “more nuanced”. Maybe you missed that part of the article. I even used bold letters for it. Summarizing this article as “bad because Boogeyman” feels like you are reacting to… something completely different.
While I don’t have a background specifically in Cult Studies (and I don’t even know if something like that exists), I studied psychology and wrote a bachelor’s thesis on the topic of manipulation and cults. It used to be my hobby back then. In addition to reading the available literature, talking with former cult members, and talking with local experts on this topic, I also briefly participated in a few shady organizations, got “slain in the Spirit”, was trained in how to sell expensive life insurance to naive people (but I never actually sold any), people tried to recruit me into a few MLM schemes (and I cooperated willingly until the moment when I was finally supposed to give them a ton of money), and there is some more stuff I don’t feel comfortable disclosing even now (let’s just say that some people got seriously hurt by some cults for doing similar stuff).
But this all happened more than 10 years ago, and I don’t have the time or courage for similar adventures now. So I guess my knowledge is quite rusty, but it still felt not completely worthless. Some details change; some basic facts about human behavior remain.
You are suggesting that in this debate I am the person repeating hoaxes from the internet, with zero education or personal experience. Funny that from my point of view, it feels like I studied the existing information and did experiments that confirmed it (which is the reason why I take it seriously), only to be dismissed by armchair reasoning that people only talk about cults because that’s how the Government tries to suppress everyone insufficiently loyal to the State.
But I didn’t want to make this about academic background; I just mentioned it because you started.
You insist on not seeing the difference between “a person decides to do X, which has a consequence Y” and “a group strategically pushes their members into doing Y, which makes the members more dependent on the group, by exploiting some known facts about human behavior”. I guess if you insist that there is no difference, I can’t make you see one.
By similar logic, we could also say that the mass suicide in Jonestown had nothing to do with cults, because sometimes people commit suicide even without being in a cult. Or that higher-level members of Amway being told to divorce their partners if the partners refuse to buy Amway products is also perfectly normal, because people divorce their partners without a cult, too, and job-related problems often play a role in that.
I never said that cults use special supernatural methods to manipulate people. Actually, my point is that they just strategically exploit existing flaws in human psychology. Which means that the same things are also seen in action outside cults. What cults do is “merely” use these things strategically: not as random stuff that sometimes happens, but as a group norm. (If I may use an analogy, it’s like the difference between an optical illusion occurring randomly, and someone filling a whole house with optical illusions with the goal of making people lose their balance and fall off the stairs.)
If this set of criteria classifies Leverage as a cult, it is probably correct to do so; they are seen as cultish already, and I don’t think anyone outside Leverage would be too surprised. There are startups that would be classified as such as well; for many, that is accurate.
Scientology did this … about two decades ago.
https://en.wikipedia.org/wiki/Scieno_Sitter
Edited to add: This is presented as an example of how someone might have heard of “cults doing web censorship” as a story, without it being current.