Dark Tactic:
This one makes me sick to my stomach.
Imagine some horrible person wants to start a cult. So they get a bunch of people together and survey them with statements like:
“I don’t think that cults are a good thing.” “I’m not completely sure that (horrible person) would be a good cult leader.”
and then covertly switches those statements to:
“I think that cults are a good thing.” “I’m completely sure that (horrible person) would be a good cult leader.”
And the horrible person shows the whole room the results for the switched statements, presenting them as a consensus that cults are a good thing and that most people are completely sure that (horrible person) would be a good cult leader.
Then the horrible person asks individuals to defend “their” conclusions: why cults are a good thing, and why (horrible person) would be a good leader.
Then the horrible person starts asking for donations and commitments, etc.
Who do we tell about these things? There are organizations for reporting security vulnerabilities in computer systems so that the professionals get them… where do you report security vulnerabilities for the human mind?
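A minimal sketch of the mechanics, purely for illustration; the statements, participants, and tallying below are my own invention, not anyone’s actual survey:

```python
# Illustrative sketch of the answer-swapping ("choice blindness") tactic
# described above. All statements, names, and data here are invented.

# Each cautious statement participants actually endorsed, paired with the
# flipped statement it is covertly swapped for.
SWAPS = {
    "I don't think that cults are a good thing.":
        "I think that cults are a good thing.",
    "I'm not completely sure X would be a good cult leader.":
        "I'm completely sure X would be a good cult leader.",
}

def swap_responses(responses):
    """responses: {participant: {statement: agrees (bool)}}.
    Keeps each participant's agreement but reattaches it to the
    flipped statement, inverting its apparent meaning."""
    return {
        person: {SWAPS[stmt]: agrees for stmt, agrees in answers.items()}
        for person, answers in responses.items()
    }

def tally(swapped):
    """Count apparent agreement with each flipped statement."""
    counts = {}
    for answers in swapped.values():
        for stmt, agrees in answers.items():
            counts[stmt] = counts.get(stmt, 0) + int(agrees)
    return counts

honest = {  # everyone endorsed only the cautious originals
    "participant_1": {stmt: True for stmt in SWAPS},
    "participant_2": {stmt: True for stmt in SWAPS},
}
print(tally(swap_responses(honest)))
# -> unanimous "agreement" with the pro-cult statements
```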
If you start a cult you don’t tell people that you’re starting a cult. You tell them: look, there’s this nice meetup. All the people in that meetup are cool. The people in that group think differently than the rest of the world. They are better. Then there are those retreats where people spend a lot of time together and become even better and more different than the average person on the street.
Most people in the LessWrong community don’t see it as a cult, and the same is true of the members of most organisations that outsiders see as cults.
That’s not too different from the description of a university though.
Do you? Really? That works? When creating an actual literal cult? This is counter-intuitive.
The trick: you need to spin it as something they’d like to do anyway… you can’t just present it as a way to be cool and different, you need to tie it into an existing motivation. Making money is an easy one, because then you can come in with an MLM structure and get your cultists to go recruiting for you. You don’t even need to do much in the way of developing cultic materials; there’s plenty of material designed to indoctrinate people into anti-rational, pro-cult philosophies like “the law of attraction”, written to look like guides for salespeople, so your prospective cultists will pay for and perform their own indoctrination voluntarily.
I was in such a cult myself; it’s tremendously effective.
If you want to reach a person who feels lonely, a community of like-minded people who accept them can be enough. You don’t necessarily need hooks like money.
Agreed. Emotional motivations make just as good a target as intellectual ones. If someone already feels lonely and isolated, then they have a generally exploitable motivation, making them a prime candidate for any sort of cult recruitment. That kind of isolation is just what cults look for in a recruit, and most try to create it intentionally, using whatever they can to cut their cultists off from any anti-cult influences in their lives.
Agree, except I’d strengthen this to “a much better”.
It works. Especially if you can get people away from their other social contacts. Mix in insufficient sleep and a low protein diet, and it works really well. (Second-hand information, but there’s pretty good consensus on how cults work.)
How do you think cults work?
I’d question “really well”. Cult retention rates tend to be really low—about 2% for Sun Myung Moon’s Unification Church (“Moonies”) over three to five years, for example, or somewhere in the neighborhood of 10% for Scientology. The cult methodology seems to work well in the short term and on vulnerable people, but it seriously lacks staying power: one reason why many cults focus so heavily on recruiting, as they need to recruit massively just to keep up their numbers.
Judging from the statistics here, retention rates for conventional religious conversions are much higher than this (albeit lower than retention rates for those raised in the church).
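For a sense of the recruiting treadmill those numbers imply, here’s a back-of-the-envelope sketch. The geometric-attrition model, the 1000-member target, and the 4-year horizon are my own assumptions, not figures from the thread:

```python
# Back-of-the-envelope: recruiting needed just to hold membership steady,
# given the ~2% retention over 3-5 years quoted above.
# Assumptions (mine): geometric year-over-year attrition, a hypothetical
# target size of 1000 members, and a 4-year horizon.

TARGET_MEMBERS = 1000
RETAINED_FRACTION = 0.02   # fraction of recruits still present after HORIZON
HORIZON_YEARS = 4          # midpoint of the quoted 3-5 year range

annual_retention = RETAINED_FRACTION ** (1 / HORIZON_YEARS)  # ~0.38

# Steady state of N_next = retention * N + recruits
# implies N = recruits / (1 - retention).
recruits_per_year = TARGET_MEMBERS * (1 - annual_retention)

print(f"implied annual retention: {annual_retention:.1%}")
print(f"recruits per year to hold {TARGET_MEMBERS} members: {recruits_per_year:.0f}")
```

Under those assumptions, a group must recruit well over half its own size every year just to stand still, which fits the observation above about cults focusing so heavily on recruiting.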
I guess “really well” is ill-defined, but I do think that both Sun Myung Moon and L. Ron Hubbard could say “It’s a living”.
You can get a lot out of people in the three to five years before they leave.
Note that the term cult is an instance of the worst argument in the world (guilt by association). The neutral term is NRM (new religious movement). Thus, to classify something as a cult one should first tick off the “religious” check mark, which requires spirituality, a rather nebulous concept:
Spirituality is the concept of an ultimate or an alleged immaterial reality; an inner path enabling a person to discover the essence of his/her being; or the “deepest values and meanings by which people live.”
If you define cult as an NRM with negative connotations, then you have to agree on what those negatives are, not an easy task.
“NRM” is a term in the sociology of religion. There are many groups that are often thought of as “cultish” in the ordinary-language sense that are not particularly spiritual. Multi-level marketing groups and large group awareness training come to mind.
This is basically true, although I had a dickens of a time finding specifics in the religious/psychology/sociological research—everyone is happy to claim that cults have horrible retention rates, but none of them seem to present much beyond anecdotes.
I’ll confess I was using remembered statistics for the Moonies, not fresh ones. The data I remember from a couple of years ago seems to have been rendered unGooglable by the news of Sun Myung Moon’s death.
Scientology is easier to find fresh statistics for, but harder to find consistent statistics for. I personally suspect the correct value is lower, but 10% is about the median in easily accessible sources.
Click on “Search tools” at the bottom of the menu on the left side of Google’s search results page, then on “Custom range”.
I like what you say, but not so much what ChristianKl said. I think he was exaggerating rather a lot to try to make something fit when it doesn’t particularly.
What’s an actual literal cult?
When I went to the Quantified Self conference in Amsterdam last year, I heard the allegation that Quantified Self is a cult after I explained it to someone who lived at the place I stayed for the weekend. I also had to defend against the cult allegation when explaining the Quantified Self community to journalists. Which groups are cults depends a lot on the person who’s making the judgement.
There are, however, also groups that we can agree are cults. I would say that the label applies to an organisation like the Church of Scientology.
I think that’s known as voter fraud. A lot of people believe (and tell others to believe) that certain candidates were legally and fairly elected even when exit polls show dramatically different results. Although of course this could work the same way if exit polls were changed to reflect the opposite outcome of an actually fair election and people believed the false exit polls and demanded a recount or re-election. It just depends on which side can effectively collude to cheat.
No. What I’m saying here is that, using this technique, it might not be seen as fraud.
If the view on “choice blindness” is that people are actually changing their opinions, then it would not technically be false to claim that those are their opinions. Committing fraud would require you to lie. This may be a form of brainwashing, but it’s not a new way to lie.
That’s why this is so creepy.
We need a worldwide Mindhacker Convention/Summit/Place-where-people-go.
Unfortunately, the cult leaders you’ve just described will not permit this, because they’ve already brainwashed their minions (and those minions’ children, and those children’s children, for thousands of years) into accepting that the human mind is supreme and sacred and must not be toyed with at any cost.