Yes but LessWrong is a lot like this—witness all the discussions in thrilled detail of drugs that put your brain into a more effective/enjoyable state. It’s assumed that the readership is intelligent/responsible enough to handle this sort of thing.
The outside culture has enough warnings about dangers of using drugs that we don’t have to repeat them here. Everybody knows that playing with them can fry your brain, and you should take proper precautions. I don’t think the outside culture has enough warnings about psychological manipulation techniques in general, nor this particular sect. People routinely think they’ll be less influenced than they are.
And there’s also the thing that while the people who hang around at LW probably have more ammo than usual against the overt bullshit of cults, they also might have some traits that make them more susceptible to cult recruitment. Namely, sparse social networks, which makes you vulnerable to a bunch of techniques that create the feeling of belonging and acceptance of the new community, and tolerance of practices and ideas outside the social mainstream, which gets cult belief systems that don’t immediately trigger bullshit warnings inside your head.
The Aum Shinrikyo cult in Japan that did the subway sarin gas thing reportedly recruited lots of science and engineering students. An engineering mindset may also keep your internalized bullshit detector from working against social proof, since science and engineering are a lot about how weird stuff, extrapolated beyond conventional norms, works and gives results.
tl;dr: You’re not as smart as you think, probably have a mild mood disorder from lack of satisfactory social interaction, and have no idea how you’ll subconsciously react to direct cult brainwashing techniques. Don’t mess with cults.
How about a word on the major religions? The most obvious difference between a cult and a religion is that the religion is many orders of magnitude more successful at recruitment—which is the very thing that we are being warned about with respect to cults.
Parasite species that have been around a long time have mostly evolved not to kill their host very fast. With new species, all bets are off.
The Mormons are a good comparison. They were dangerous lunatics in the mid-1800s—and Brigham Young was a murderous nutter on a par with David Miscavige. These days, they’re slightly weird but very nice (if very, very conservative) people; good neighbours.
You must mean “kill off” metaphorically, since I don’t recall any incidents in which Scientology has killed off Scientologists. In contrast I can recall many very recent incidents in which one old religion—Islam—has killed off adherents. But if “kill off” is a metaphor, then what is the literal danger from Scientology which is being referred to metaphorically as “kill off the host”?
http://en.wikipedia.org/wiki/Death_of_Lisa_McPherson—and she was hardly the first.
I would caution against using “I don’t recall” to mean “I haven’t researched even slightly”.
I used “I don’t recall” to mean “I don’t recall”. Go ahead and bash me for failing to research the question but please don’t put your words and ideas in my writing.
I think David’s point is that when you say “I don’t recall X”, it matters very much whether you would recall an X to begin with, i.e., whether P(“I recall X” | X has happened) is significantly larger than P(“I don’t recall X” | X has happened). So when you offer up “I don’t recall X”, people assume you’re doing it because the former is larger than the latter.
But if that’s not the case, then you are, in effect, using “I don’t recall” to mean “I haven’t researched”, and this is why David was accusing you of blurring the distinction.
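The point about P(“I recall X” | X) can be illustrated with a toy Bayesian update. All numbers below are hypothetical, chosen only to show how the informativeness of “I don’t recall” depends on how likely recall would be if X had happened:

```python
# Toy Bayesian update illustrating the point above: "I don't recall X" is
# only evidence against X to the extent that you WOULD have recalled X had
# it happened. All numbers here are hypothetical.

def posterior_x(prior_x, p_recall_given_x, recalled=False):
    """P(X | recall observation), assuming no false memories:
    we never 'recall' X when X did not actually happen."""
    if recalled:
        return 1.0  # under the no-false-memory assumption
    # Bayes: P(X | no recall) = P(no recall | X) P(X) / P(no recall)
    num = (1 - p_recall_given_x) * prior_x
    den = num + 1.0 * (1 - prior_x)  # P(no recall | not X) = 1
    return num / den

# A well-read observer who would almost surely have heard of X:
# not recalling X pushes P(X) way down.
print(posterior_x(prior_x=0.5, p_recall_given_x=0.95))  # ~0.05

# Someone who never looked into the topic: not recalling X is
# nearly uninformative.
print(posterior_x(prior_x=0.5, p_recall_given_x=0.05))  # ~0.49
```

The asymmetry between the two calls is the whole disagreement: the same words, “I don’t recall,” carry very different evidential weight depending on the speaker’s prior exposure to the topic.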
No, you’re inventing my meaning on the basis of a convoluted reading, and you’re neglecting the context. What I said was that I do not recall. And that is true. In context, the issue is whether Scientology kills off its host quickly. I pointed out that Islam, which kills many of its own adherents, is classified (by the preceding comment, implicitly) as not killing off its host quickly. Therefore for Scientology to be classified as killing off its host quickly it must kill more of its own adherents than Islam does. So that is the relevant question.
So: how does David’s evidence address this question? Not very well. A woman died from negligence while in the care of co-religionists. This can barely be connected to the religion itself. When I said that Islam kills off many of its own adherents, I did not have in mind adherents dying from negligence while in the care of co-religionists. I had in mind jihad. But if we want to expand the definition of killing one’s own host, let us do so: let us take into account the economic backwardness caused by Islam in the Middle East. That should greatly increase the death toll of Islam. Which does not, by assumption, kill off its host.
So, David’s evidence is hardly pertinent to the question. If we expand the definition of killing off one’s host to accommodate it, then we must do the same for Islam, which makes Islam look very bad indeed.
Now let’s turn to my own evidence. I am an imperfect observer, who is not aware of everything that goes on in the world. But it doesn’t matter whether I am perfect. What matters is whether I’m biased. David says that I did not specially investigate Scientology. No, I didn’t. And also, I didn’t specially investigate Islam. So as an instrument, I am balanced in that respect. And my readout says: I am aware of many dead from Islam, none dead from Scientology. David says I missed one. Oh? And so what? I missed many on the Muslim side too.
The probability of being Muslim is a lot higher (about 1000 times more?) than of being Scientologist, so I presume you’re talking about how many incidents you’d expect to have heard about per capita.
I wish that you were either a more concise or less interesting writer, so that I wouldn’t waste time reading a detailed argument about what’s-been-said.
The probability of being Muslim is a lot higher (about 1000 times more?) than of being Scientologist, so I presume you’re talking about how many incidents you’d expect to have heard about per capita.
That’s one adjustment that needs to be made, though not the only one. The other major adjustment that needs to be made is for proximity. That goes in the opposite direction. But it’s not worthwhile thinking about it with current data—the energy should be spent on getting better data. I just did that for Afghanistan. 38,000 is the most recent figure I found for dead Taliban, who I interpret as seeing themselves as fighting Islamic jihad, seeing as the Taliban is an Islamic theocracy. Divide that by 1000 and you have 38 Scientologists who, going by your figure, need to have died in armed struggle with—I don’t know—the police, maybe, in order for Scientology to match the proportional death toll in religious violence. I’m pretty sure that if 38 Scientologists had died fighting the police, I would have heard about it, even though I didn’t specifically research the question.
And the Taliban is I think just a small part of everyone who died in the last decade in what they considered to be Islamic jihad.
Update—the factor of 1000 is way off. It’s 25,000 Scientologists versus about 1.5 billion Muslims. If, say, 150,000 Muslims have died in armed jihad in the past ten years, then that’s one in 10,000, which comes to two Scientologists who need to die in armed struggle to match proportions. So being a Muslim is probably not significantly more dangerous than being a Scientologist. Further information could reveal that it is less dangerous.
I also assumed there were far more than 25k Scientologists.

Yeah, I’m surprised too. I’m basing 25k on this.
The membership statistics have been lies for decades. alt.religion.scientology worked out it was around 50k in the late 1990s; I’m surprised it’s as high as 25k now.
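The per-capita arithmetic in this subthread can be written out explicitly. All figures are the commenters’ own rough estimates (the membership number is itself disputed just above), not vetted data:

```python
# Reproducing the thread's rough per-capita comparison. Every number here
# is a commenter's estimate, not a vetted statistic.

muslims = 1_500_000_000
scientologists = 25_000
jihad_deaths_10yr = 150_000  # the hypothetical figure from the comment above

death_rate = jihad_deaths_10yr / muslims        # one in 10,000
equivalent_deaths = death_rate * scientologists  # deaths needed to match

print(f"rate: 1 in {round(1 / death_rate):,}")                       # rate: 1 in 10,000
print(f"equivalent Scientologist deaths: {equivalent_deaths:.1f}")   # 2.5
```

Note how sensitive the conclusion is to the membership figure: if the true count were the 50k mentioned above, the break-even number of deaths would simply double.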
I agree that David’s point about Lisa McPherson isn’t counterevidence to the claim you made (or rather, were implying based on not recalling). I was replying only to your statement
I used “I don’t recall” to mean “I don’t recall” … please don’t put your words and ideas in my writing
which was ridiculing the very idea that someone would read your “I don’t recall” to mean “I don’t recall and that is informative in this case”, when people have good reason to do so, as I explained.
If your objection to David’s point was that the McPherson case is not evidence of Scientology “killing off its host”, then you should have said so in your reply at that point (and I would have agreed) rather than merely flaunt your non-standard usage of “I don’t recall” and insult the people who thought you were trying to say something relevant.
rather than merely flaunt your non-standard usage of “I don’t recall” and insult the people who thought you were trying to say something relevant.
You have it backwards. I asked someone not to put words into my mouth. There were much better and less rude ways for him to make the same point. I am not going to continue arguing this at length because Jonathan Graehl just said he doesn’t like being forced to read who-said-what.
Ruin their life or mess them up mentally.

Check out Auditing Procedure R2-45. There are also a number of less formal murders attributed to them. Ask Google for “Scientology Murder”.
Please do not use value-laden and unsupported terms such as “murder” here. Yes, there are some cases of controversial deaths involving Scientology, but none of these could be described as murder of either the formal or less-formal sort.
The existence of R2-45 is rather unsettling, but apparently this ‘auditing procedure’ has never been enacted.
Okay, edited to use the less value-laden term “exteriorization”.
Growth/attrition rates are actually the thing to look at here. Scientology is faster-growing than just about any other modern religion, though the attrition rate is also very high. In order to figure out virulence, figure out what population the S-curve of members of that religion will top out at. If growth is slowing, you’re almost there. If growth is steady, you’re about halfway there. If growth is exponential or approximately so, you’re looking at a religion in its infancy.

This has of course been covered here before (with reference to this and this).
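A minimal sketch of that S-curve reasoning, using a logistic growth model with purely illustrative numbers (the ceiling, rate, and midpoint are made up, not estimates for any real movement):

```python
# Sketch of reading a movement's position on an S-curve (logistic) from
# its current growth. All parameters are illustrative only.

import math

def logistic(t, ceiling, rate, t_mid):
    """Membership at time t under logistic growth toward `ceiling`."""
    return ceiling / (1 + math.exp(-rate * (t - t_mid)))

def stage(members, ceiling):
    """Crude reading of position on the S-curve, per the comment above."""
    frac = members / ceiling
    if frac < 0.25:
        return "infancy (roughly exponential growth)"
    if frac < 0.75:
        return "midway (roughly steady growth)"
    return "near the top (growth slowing)"

ceiling = 100_000
for t in (0, 10, 20):
    m = logistic(t, ceiling, rate=0.5, t_mid=10)
    print(t, round(m), stage(m, ceiling))
```

The catch, of course, is that the ceiling is exactly the unknown: from inside an exponential-looking phase, curves with wildly different eventual tops look identical.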
Umm. Not all of us. I may be vulnerable to cults for other reasons, namely my conformist personality, but not lack of people to talk to.
Oddly, a “sense of belonging” usually makes me feel alienated and uncomfortable. It’s the rare exceptions like LessWrong, where it actually feels like I do fit, and am being challenged and growing and free to express myself, that avoid that.
This sounds very odd. In fact, it sounds oxymoronic. Can you explain?
I can take a shot at it, having experienced something similar.
The general situation usually follows the pattern of “There is a group with easily-noticeable standards A, B, and C and less-easily-noticeable standards X, Y, and Z. I conform to A, B, and C (though probably for different reasons than they do), but not to (some subset of) X, Y, and Z, but since X and Y and Z don’t come up very often, 1) they haven’t figured out that I don’t fit them, and 2) I didn’t realize that those standards were significant until after I’d been accepted as a member of the group (which is where the ‘sense of belonging’ comes in). At no point did I actually mislead the group with regards to X, Y, or Z, but it’s very likely that if they find out that I don’t conform to them, they will assume that I did and there will be large amounts of drama.”
This usually leads to an inclination to hide facts relating to X, Y, and Z, which feels from the inside like being alienated and uncomfortable.
ETA: This isn’t necessarily something that a person would have to be consciously aware of in order for it to happen, and it can also be based on a person’s assumptions about X/Y/Z-like standards if a given group doesn’t make them explicit.
Adelene’s response strikes me as a similar experience. I should also admit that I’m having a lot of trouble actually getting a concrete description of the experience, as it’s primarily emotional/subconscious, but here’s my own go at it:
I suppose the short version is that while I have the social/emotional response of “belonging and acceptance”, I don’t actually feel safe relaxing and letting down my guard around those groups, which produces a secondary emotional response of feeling alienated and uncomfortable that I have to keep those defenses up.
There are various social behaviors that groups will exhibit to build a very strong “sense of belonging”, and it’s more an emotional evaluation than an intellectual one—although the other part is that I often 99% fit with a group, am clearly a valuable member of the group, and risk getting expelled if I reveal that other 1% of myself.
More specifically, I belong to a few groups where revealing one’s status can still result in fairly sharp social ostracization. Thus, once I’ve found a group where I “belong”, I run into the choice of risking all of that to be accepted “for who I really am”, or just shutting up and keeping quiet about things that almost never come up anyways.
In the case of LessWrong, I feel safe because the community strikes me as much more likely to be tolerant of these things, because an online community has much less power to hurt me, and because these things are extremely unlikely to come up here to begin with (and, being an online forum, I can devote time to carefully crafting posts not to reveal anything; that’s still annoying, but gets written off as “I don’t want to post publicly about this” rather than “LessWrong is unsafe”).
The other aspect is simply that a lot of standard recruitment/retention techniques trigger a visceral aversion in me, even if I don’t view the group as a threat and genuinely do want to be a member.
I’ve got a streak of that, though of a different flavor. Some types of ceremonial efforts to solidify group cohesion don’t work for me, so I feel alienated from any group where there’s an assumption that I’ll feel good and devoted because of enforced symbolism.
To be less abstract about it all, I’m American, whatever that means. I can be defensive and even mildly jingoistic about America (though I consider the latter a failing), but I’d be a lot more comfortable with the place and the identity if it weren’t for all the damned flags.
In other news, I’ve been wondering lately whether it would be closer to the truth if, instead of thinking of myself as Jewish (ethnically), it would be better to frame it as “People kept telling me I was Jewish until I started believing it”.
I can be defensive and even mildly jingoistic about America
The US has one of the most effective brainwashing systems in the (first) world, patriotism-wise. I suspect that a part of it is the historical narrative of a real or imagined success against formidable odds, all in the last 200 years or so. The message “America is great” is also constantly all over the school system and the media. This is really hard to resist, no matter how often you repeat to yourself “I ought to keep my identity small”.
I’ve been wondering lately whether it would be closer to the truth if, instead of thinking of myself as Jewish (ethnically), it would be better to frame it as “People kept telling me I was Jewish until I started believing it”.
I’ve heard that sentiment many times, not necessarily from people of Jewish descent, although the latter are an easy example. Jews in early-20th-century Germany thought of themselves as Germans, until “real Germans” disabused them of that notion in the 1930s. The same happened in Russia in the 1950s. Various Yugoslavian ethnicities suddenly realized in the 1990s that they were not just Yugoslavians, but Serbs, Albanians, Croatians etc., and those who did not were quickly and forcefully reminded of it by their neighbors.
I somewhat relate to his comment, and for me it’s because of how much persona-maintenance, holding myself back, and not letting myself go it takes to be accepted by others. When, and if, it actually does work, it feels like all I was trying to do was be a nice guy, and now the ruse has worked? Now it’s like you’ve committed yourself to it.
“You probably have a minor mood disorder from lack of satisfactory social interaction” seems like a rather harsh description of the members of this community. What data generated that thought?
I agree with the description. Why? Because the joy people describe at going to the meetups seems out of proportion to what goes on in the meetups—unless, as the old saying goes, hunger is the best spice.
I started with the assumption that most people posting here live alone or with a small immediate family, with occasional interaction with acquaintances, rather than as part of a tightly knit tribe of some dozens of people who share their values and with whom they have constant social interaction. Then I thought about how likely site members are to belong to mainstream tribe-equivalents such as churches, sports fan groups, gangs or political organizations.
The “mood disorder” thing is hyperbole for “your brain would like to be in a more tribe-like social environment than it is in now”, not an attempt at a clinical diagnosis.
You’re not as smart as you think, probably have a mild mood disorder from lack of satisfactory social interaction, and have no idea how you’ll subconsciously react to direct cult brainwashing techniques. Don’t mess with cults.
This is an important point. If you do mess with cults, start with the more innocuous ones before you face the heavy guns. Make sure you can resist the community in an average church before you test yourself against Scientology.
One of the impressive things about Sufism (at least as described by Idries Shah) is that they wouldn’t take people as students who didn’t already have social lives.
He might not. But things will be in his favor if you go in thinking knowing physics and science will make you impervious to the dark arts, without knowing a lot about psychology, cult and influence techniques and the messier stuff inside your own head.
(I’m not sure if you want to say something extra here by quoting a thing that was described as the “second most dangerous dark side meme” in the linked comment.)
But things will be in his favor if you go in thinking knowing physics and science will make you impervious to the dark arts, without knowing a lot about psychology, cult and influence techniques and the messier stuff inside your own head.
I wonder about this idea that knowing how someone will be manipulating you is any defense at all against being manipulated by that person. It sounds plausible, but is there any evidence at all that knowledge can have this effect?
Or is knowledge not wholly intellectual, so that it can itself be considered a species of manipulation, though not manipulation of the dark arts variety? Maybe even “light arts manipulation”? Sorry, had to throw this last paragraph in there because I thought it was interesting.
Compare “I’ve only known this guy for half an hour, but he seems really likable” and “I’ve only known this guy for half an hour, he’s been running through the tricks from the cult salesman playbook and is giving off a real likable impression at this point”.
You still need to have your own head game in order to actually counteract the subconscious impressions you are getting, but it will probably help to know that a contest is even happening.
I think what you say is plausible. But I also think that it is also plausible that a “likable impression” isn’t just an appearance, but the effect of you actually starting to like the guy. I think that’s the sort of thing that concerns me, that at a certain point our social instincts take over and we lose the ability to detach ourselves from the situation.
That’s a valid point. Women who have read about the pickup artist techniques report that the techniques still work on them even when they’re aware the person is using them. On the other hand, SWIM says that being aware of various techniques has helped him guard against HR methods on the basis of “Oh, now he’s moving into stage x, next he’s going to...”. SWIM would say that it depends to what degree you’re predisposed against the person using them.
Be aware that some techniques are more obvious than others. Some are really obvious when you know they exist, but also really obscure, so you won’t know they’re being used unless you’ve read about them before.
Interesting. My intuition and experience say this is screamingly, overtly incorrect. The fact that yours do not means I’m probably wrong—either about the ‘overtly’ or the ‘incorrect’!
Arguably, Internet culture has a tremendous amount of information on the dangers of Scientology in particular. (And I’m one of the people who put it there personally.) But you are entirely correct: people are convinced they’re much less manipulable than they are. I need to write something for LW on the subject (as I’ve been idly contemplating doing for about 6 months).
I would think the easiest method, albeit not terribly objective, would simply be to get someone who is fairly good at manipulation and play out scenarios with them. I’ve done this a few times as the manipulator, and it’s sort of scary how easily I can manipulate people in specific games, even when they know the rules and have witnessed some of my techniques.
If you do try it, I’ll comment that time and social pressure help me a lot in making people more pliable, too. I do these as a group exercise, so there’s a lot of peer pressure both to perform well, and not to use exactly the sort of “cheats” you should be using to resist manipulation. It’s also helped that I’ve always known the group and thus known how to tweak myself to hit specific weaknesses.
If you find something more useful than this, I’d love to hear it. I’ve merely learned I’m fairly good at manipulating—I have no clue how good I am at resisting :)
Having not tested them, I wouldn’t be sure. I tend to do best with people who are either following an easily inferred pattern (office workers, security, etc.) or people who I know personally, which would make it harder to do with someone I don’t know. You also are neither “disposable” (someone I’ll never deal with again) nor a friend, which adds a bit of social awkwardness.
Given that’s an entire paragraph of excuses, I suppose I should offer to try it anyway, if you want :)
For anyone wondering how this went, handoflixue failed to manipulate me into anything, in fact most of the successful manipulating was the other way around :-)
I have occasionally seen quizzes that purport to tell you how biased you are in purportedly relevant ways to cult susceptibility. I can’t say I found any of them revelatory, as, since you know what the test is testing, it’s way too easy to answer with the right answer rather than the true readout, even when you want the latter. I suppose proper testing would have to be similar to psychological measures of cognitive biases.
The outside culture has enough warnings about dangers of using drugs that we don’t have to repeat them here. Everybody knows that playing with them can fry your brain, and you should take proper precautions. I don’t think the outside culture has enough warnings about psychological manipulation techniques in general, nor this particular sect. People routinely think they’ll be less influenced than they are.
And there’s also the thing that while the people who hang around at LW probably have more ammo than usual against the overt bullshit of cults, they also might have some traits that make them more susceptible to cult recruitment. Namely, sparse social networks, which makes you vulnerable to a bunch of techniques that create the feeling of belonging and acceptance of the new community, and tolerance of practices and ideas outside the social mainstream, which gets cult belief systems that don’t immediately trigger bullshit warnings inside your head.
The Aum Shinrikyo cult in Japan that did the subway sarin gas thing reportedly recruited lots of science and engineering students. An engineering mindset will also keep you working from the internalized bullshit against social proof, since science and engineering is a lot about about how weird stuff extrapolated beyond conventional norms works and gives results.
tl;dr: You’re not as smart as you think, probably have a mild mood disorder from lack of satisfactory social interaction, and have no idea how you’ll subconsciously react to direct cult brainwashing techniques. Don’t mess with cults.
How about a word on the major religions? The most obvious difference between a cult and a religion is that the religion is many orders of magnitude more successful at recruitment—which is the very thing that we are being warned about with respect to cults.
Parasite species that have been around a long time have mostly evolved not to kill their host very fast. With new species, all bets are off.
The Mormons are a good comparison. They were dangerous lunatics in the mid-1800s—and Brigham Young was a murderous nutter on a par with David Miscavige. These days, they’re slightly weirdy but very nice (if very, very conservative) people; good neighbours.
You must mean “kill off” metaphorically, since I don’t recall any incidents in which Scientology has killed off Scientologitsts. In contrast I can recall many very recent incidents in which one old religion—Islam—has killed off adherents. But if “kill off” is a metaphor, then what is the literal danger from Scientology which is being referred to metaphorically as “kill off the host”?
http://en.wikipedia.org/wiki/Death_of_Lisa_McPherson—and she was hardly the first.
I would caution against using “I don’t recall” to mean “I haven’t researched even slightly”.
I used “I don’t recall” to mean “I don’t recall”. Go ahead and bash me for failing to research the question but please don’t put your words and ideas in my writing.
I think David’s point is that when you say “I don’t recall X”, it matters very much whether you would recall an X to begin with, i.e., whether P(“I recall X” | X has happened) is significantly larger than P(“I don’t recall X” | X has happened). So when you offer up “I don’t recall X”, people assume you’re doing it because the former is larger than the latter.
But if that’s not the case, then you are, in effect, using “I don’t recall” to mean “I haven’t researched”, and this is why David was accusing you of blurring the distinction.
No, you’re inventing my meaning on the basis of a convoluted reading, and you’re neglecting the context. What I said was that I do not recall. And that is true. In context, the issue is whether Scientology kills off its host quickly. I pointed out that Islam, which kills many of its own adherents, is classified (by the preceding comment, implicitly) as not killing off its host quickly. Therefore for Scientology to be classified as killing off its host quickly it must kill more of its own adherents than Islam does. So that is the relevant question.
So: how does David’s evidence address this question? Not very well. A woman died from negligence while in the care of co-religionists. This can barely be connected to the religion itself. When I said that Islam kills off many of its own adherents, I did not have in mind adherents dying from negligence while in the care of co-religionists. I had in mind jihad. But if we want to expand the definition of killing one’s own host, let us do so: let us take into account the economic backwardness caused by Islam in the Middle East. That should greatly increase the death toll of Islam. Which does not, by assumption, kill of its host.
So, David’s evidence is hardly pertinent to the question. If we expand the definition of killing of one’s host to accommodate it, then we must do the same for Islam, which makes Islam look very bad indeed.
Now let’s turn to my own evidence. I am an imperfect observer, who is not aware of everything that goes on in the world. But it doesn’t matter whether I am perfect. What matters is whether I’m biased. David says that I did not specially investigate Scientology. No, I didn’t. And also, I didn’t specially investigate Islam. So as an instrument, I am balanced in that respect. And my readout says: I am aware of many dead from Islam, none dead from Scientology. David says I missed one. Oh? And so what? I missed many on the Muslim side too.
The probability of being Muslim is a lot higher (about 1000 times more?) than of being Scientologist, so I presume you’re talking about how many incidents you’d expect to have heard about per capita.
I wish that you were either a more concise or less interesting writer, so that I wouldn’t waste time reading a detailed argument about what’s-been-said.
That’s one adjustment that needs to be made, though not the only one. The other major adjustment that needs to be made is for proximity. That goes in the opposite direction. But it’s not worthwhile thinking about it with current data—the energy should be spent on getting better data. I just did that for Afghanistan. 38,000 is the most recent figure I found for dead Taliban, who I interpret as seeing themselves as fighting Islamic jihad, seeing as the Taliban is an Islamic theocracy. Divide that by 1000 and you have 38 Scientologists who, going by your figure, need to have died in armed struggle with—I don’t know—the police, maybe, in order for Scientology to match the proportional death toll in religious violence. I’m pretty sure that if 38 Scientologists had died fighting the police, I would have heard about it, even though I didn’t specifically research the question.
And the Taliban is I think just a small part of everyone who died in the last decade in what they considered to be Islamic jihad.
Update—the factor of 1000 is way off. It’s 25,000 Scientologists versus about 1.5 billion Muslims. If, say, 150,000 Muslims have died in armed jihad in the past ten years, then that’s one in 10,000, which comes to two Scientologists who need to die in armed struggle to match proportions. So being a Muslim is probably not significantly more dangerous than being a Scientologist. Further information could reveal that it is less dangerous.
I also assumed there were far more than 25k Scientologists.
Yeah, I’m surprised too. I’m basing 25k on this.
The membership statistics have been lies for decades. alt.religion.scientology worked out it was around 50k in the late 1990s; I’m surprised it’s as high as 25k now.
I agree that David’s point about Lisa McPherson isn’t counterevidence to the claim you made (or rather, were implying based on not recalling). I was replying only to your statement
which was ridiculing the very idea that someone would read your “I don’t recall” to mean “I don’t recall and that is informative in this case”, when people have good reason to do so, as I explained.
If your objection to David’s point was that the McPherson case is not evidence of Scientology “killing off its host”, then you should have said so in your reply at that point (and I would have agreed) rather than merely flaunt your non-standard usage of “I don’t recall” and insult the people who thought you were trying to say something relevant.
You have it backwards. I asked someone not to put words into my mouth. There were much better and less rude ways for him to make the same point. I am not going to continue arguing this at length because Jonathan Graehl just said he doesn’t like being forced to read who-said-what.
Ruin their life or mess them up mentally.
Check out Auditing Procedure R2-45. There are also a number of less formal murders attributed to them. Ask Google for “Scientology Murder”.
Please do not use value-laden and unsupported terms such as “murder” here. Yes, there are some cases of controversial deaths involving Scienology, but none of these could be described as murder of either the formal or less-formal sort.
The existence of R2-45 is rather unsettling, but apparently this ‘auditing procedure’ has never been enacted.
Okay, edited to use the less value-laden term “exteriorization”.
Growth/attrition rates are actually the thing to look at here. Scientology is faster-growing than just about any other modern religion, though the attrition rate is also very high. In order to figure out virulency, figure out what population the S-curve of members of that religion will top out at. If growth is slowing, you’re almost there. If growth is steady, you’re about halfway there. If growth is exponential or approximately so, you’re looking at a religion in its infancy.
This has of course been covered here before (with reference to this and this).
Umm. Not all of us. I may be vulnerable to cults for other reasons, namely my conformist personality, but not lack of people to talk to.
Oddly, a “sense of belonging” usually makes me feel alienated and uncomfortable. It’s the rare exceptions like LessWrong, where it actually feels like I do fit, and am being challenged and growing and free to express myself, that avoid that.
This sounds very odd. In fact, it sounds oxymoronic. Can you explain?
I can take a shot at it, having experienced something similar.
The general situation usually follows the pattern of “There is a group with easily-noticeable standards A, B, and C and less-easily-noticeable standards X, Y, and Z. I conform to A, B, and C (though probably for different reasons than they do), but not to (some subset of) X, Y, and Z, but since X and Y and Z don’t come up very often, 1) they haven’t figured out that I don’t fit them, and 2) I didn’t realize that those standards were significant until after I’d been accepted as a member of the group (which is where the ‘sense of belonging’ comes in). At no point did I actually mislead the group with regards to X, Y, or Z, but it’s very likely that if they find out that I don’t conform to them, they will assume that I did and there will be large amounts of drama.”
This usually leads to an inclination to hide facts relating to X, Y, and Z, which feels from the inside like being alienated and uncomfortable.
ETA: This isn’t necessarily something that a person would have to be consciously aware of in order for it to happen, and it can also be based on a person’s assumptions about X/Y/Z-like standards if a given group doesn’t make them explicit.
Adelene’s response strikes me as a similar experience. I should also admit that I’m having a lot of trouble actually getting a concrete description of the experience, as it’s primarily emotional/subconscious, but here’s my own go at it:
I suppose the short version is that while I have the social/emotional response of “belonging and acceptance”, I don’t actually feel safe relaxing and letting down my guard around those groups, which produces a secondary emotional response of feeling alienated and uncomfortable that I have to keep those defenses up.
There are various social behaviors that groups will exhibit to build a very strong “sense of belonging”, and it’s more an emotional evaluation than an intellectual one—although the other part is that I often 99% fit with a group, am clearly a valuable member of the group, and risk getting expelled if I reveal that other 1% of myself.
More specifically, I belong to a few groups where revealing one’s status can still result in fairly sharp social ostracism. Thus, once I’ve found a group where I “belong”, I run into the choice of risking all of that to be accepted “for who I really am”, or just shutting up and keeping quiet about things that almost never come up anyway.
In the case of LessWrong, I feel safe because the community strikes me as much more likely to be tolerant of these things, because an online community has much less power to hurt me, and because these things are extremely unlikely to come up here to begin with (and, being an online forum, I can devote time to carefully crafting posts not to reveal anything; that’s still annoying, but gets written off as “I don’t want to post publicly about this” rather than “LessWrong is unsafe”).
The other aspect is simply that a lot of standard recruitment/retention techniques trigger a visceral aversion in me, even if I don’t view the group as a threat and genuinely do want to be a member.
I’ve got a streak of that, though of a different flavor. Some types of ceremonial efforts to solidify group cohesion don’t work for me, so I feel alienated from any group where there’s an assumption that I’ll feel good and devoted because of enforced symbolism.
To be less abstract about it all, I’m American, whatever that means. I can be defensive and even mildly jingoistic about America (though I consider the latter a failing), but I’d be a lot more comfortable with the place and the identity if it weren’t for all the damned flags.
In other news, I’ve been wondering lately whether it would be closer to the truth if, instead of thinking of myself as Jewish (ethnically), it would be better to frame it as “People kept telling me I was Jewish until I started believing it”.
The US has one of the most effective brainwashing systems in the (first) world, patriotism-wise. I suspect that a part of it is the historical narrative of a real or imagined success against formidable odds, all in the last 200 years or so. The message “America is great” is also constantly all over the school system and the media. This is really hard to resist, no matter how often you repeat to yourself “I ought to keep my identity small”.
I heard that sentiment many times, not necessarily from people of Jewish descent, although the latter are an easy example. Jews in early 20th-century Germany thought of themselves as Germans, until “real Germans” disabused them of that notion in the 1930s. The same happened in Russia in the 1950s. Various Yugoslavian ethnicities suddenly realized in the 1990s that they were not just Yugoslavians, but Serbs, Albanians, Croatians etc., and those who did not were quickly and forcefully reminded of it by their neighbors.
I somewhat relate to his comment, and for me it’s because of how much persona-keeping, holding myself back, and not letting myself go it requires to be accepted by others. When, and if, it actually does work, it feels like all I was trying to do was be a nice guy, and now the ruse has worked? Now it’s like you’ve committed yourself to it.
“You probably have a minor mood disorder from lack of satisfactory social interaction” seems like a rather harsh description of the members of this community. What data generated that thought?
I agree with the description. Why? Because the joy people describe at going to the meetups seems out of proportion to what goes on in the meetups—unless, as the old saying goes, hunger is the best spice.
I started with the assumption that most people posting here live alone or with a small immediate family, with occasional interaction with acquaintances, rather than as part of a tightly knit tribe of some dozens of people who share their values and with whom they have constant social interaction. Then I considered how likely site members are to belong to mainstream tribe-equivalents such as churches, sports fan groups, gangs, or political organizations.
The “mood disorder” thing is hyperbole for “your brain would like to be in a more tribe-like social environment than it is in now”, not an attempt at a clinical diagnosis.
This is an important point. If you do mess with cults, start with the more innocuous ones before you face the heavy guns. Make sure you can resist the community in an average church before you test yourself against Scientology.
One of the impressive things about Sufism (at least as described by Idris Shah) is that they wouldn’t take people as students who didn’t already have social lives.
In other words “don’t try to argue with the devil^H^H Scientologist—he has more experience at it than you”.
He might not. But things will be in his favor if you go in thinking that knowing physics and science will make you impervious to the dark arts, without knowing much about psychology, cult and influence techniques, and the messier stuff inside your own head.
(I’m not sure if you want to say something extra here by quoting a thing that was described as the “second most dangerous dark side meme” in the linked comment.)
I do believe you’ve nailed it. Well done, sir.
I wonder about this idea that knowing how someone will be manipulating you is any defense at all from being manipulated by that person. It sounds plausible, but is there any evidence at all that knowledge can have this effect?
Or is knowledge not wholly intellectual? Could it itself be considered a species of manipulation, just not of the dark arts variety? Maybe even “light arts manipulation”? Sorry, had to throw this last paragraph in there because I thought it was interesting.
Compare “I’ve only known this guy for half an hour, but he seems really likable” and “I’ve only known this guy for half an hour, he’s been running through the tricks from the cult salesman playbook and is giving off a real likable impression at this point”.
You still need to have your own head game in order to actually counteract the subconscious impressions you are getting, but it will probably help to know that a contest is even happening.
I think what you say is plausible. But I also think that it is also plausible that a “likable impression” isn’t just an appearance, but the effect of you actually starting to like the guy. I think that’s the sort of thing that concerns me, that at a certain point our social instincts take over and we lose the ability to detach ourselves from the situation.
That’s a valid point. Women who have read about the pickup artist techniques report that the techniques still work on them even when they’re aware the person is using them. On the other hand, SWIM says that being aware of various techniques has helped him guard against HR methods on the basis of “Oh, now he’s moving into stage x, next he’s going to...”. SWIM would say that it depends to what degree you’re predisposed against the person using them.
Be aware that some techniques are more obvious than others. Some are really obvious when you know they exist, but others are really obscure, so you won’t know they’re being used unless you’ve read about them before.
Interesting. My intuition and experience say this is screamingly, overtly incorrect. The fact that yours do not means I’m probably wrong—either about the ‘overtly’ or the ‘incorrect’!
Arguably, Internet culture has a tremendous amount of information on the dangers of Scientology in particular. (And I’m one of the people who put it there personally.) But you are entirely correct: people are convinced they’re much less manipulable than they are. I need to write something for LW on the subject (as I’ve been idly contemplating doing for about 6 months).
Do you know of any techniques to measure your own manipulability somewhat objectively?
I would think the easiest method, albeit not terribly objective, would simply be to get someone who is fairly good at manipulation and play out scenarios with them. I’ve done this a few times as the manipulator, and it’s sort of scary how easily I can manipulate people in specific games, even when they know the rules and have witnessed some of my techniques.
If you do try it, I’ll comment that time and social pressure help me a lot in making people more pliable, too. I do these as a group exercise, so there’s a lot of peer pressure both to perform well, and not to use exactly the sort of “cheats” you should be using to resist manipulation. It’s also helped that I’ve always known the group and thus known how to tweak myself to hit specific weaknesses.
If you find something more useful than this, I’d love to hear it. I’ve merely learned I’m fairly good at manipulating—I have no clue how good I am at resisting :)
That reminds me of a bit from a book about art forgery—that need, greed, and speed make people more gullible.
I’d love to try this (being the manipulatee). Do your mind tricks work over Skype?
Having not tested them, I wouldn’t be sure. I tend to do best with people who are either following an easily inferred pattern (office workers, security, etc.) or people who I know personally, which would make it harder to do with someone I don’t know. You also are neither “disposable” (someone I’ll never deal with again) nor a friend, which adds a bit of social awkwardness.
Given that’s an entire paragraph of excuses, I suppose I should offer to try it anyway, if you want :)
Good! How about Sunday evening (CEST)?
CEST = UTC+2, correct? I’m PDT (UTC-7), so that’d put you 9 hours ahead of me.
My 10 AM would be your 7 PM—would that work for you? I can do a couple hours later if not.
EDIT: I’m assuming Sunday, May 1st, 2011. If you could send me your Skype name that will probably also make this much easier :)
For anyone wondering how this went, handoflixue failed to manipulate me into anything, in fact most of the successful manipulating was the other way around :-)
I have occasionally seen quizzes that purport to tell you how biased you are in ways supposedly relevant to cult susceptibility. I can’t say I found any of them revelatory: since you know what the test is testing, it’s far too easy to give the right answer rather than the true readout, even when you want the latter. I suppose proper testing would have to resemble psychological measures of cognitive biases.