I wish I understood why some rationalists find so terrifying the prospect that they might be part of a cult. Or why they so desperately (and publicly!) reject the prospect that anyone might see them as having a leader. I mean, I understand that there’s a normative component to both, but I don’t get where the sheer power of this fear comes from. It’s probably an important aspect of our lack of coordination, but I don’t understand it.
There are some good reasons for being terrified. We are tribal animals. We don’t really care about the truth as such, but we care a lot about tribal politics. We can pursue truth when we have a very high degree of disinterest in what the truth will turn out to be, but that’s a really exceptional situation. When we care about the shape of the truth, we lose a lot of rationality points, and the forces of tribal politics make us care a lot about things other than truth. Tribalism might be our strongest instinct, even stronger than individual survival or the sex drive.
I agree with you that it has its downsides, but I really don’t see how you can accept the politics and stay rational. I cannot think of many examples of that.
I’m also really disappointed by all the status indicators plastered over Less Wrong: top contributors on every page, your social status points (karma) on every page, user names and points on absolutely everything, vote up / vote down. You might think we’re doing fine, but so was reddit when it was tiny; let’s see how it scales up. I think we should get rid of as much of that as we can. reddit’s quality of discussion is a lot lower than 4chan’s, even though reddit is much smaller.
And this is a great example of something you once posted about: different people are annoyed by different biases. You seem to think social status and politics are mostly harmless and may even be useful; I think they’re the worst poison for clear rational thinking, and I haven’t seen many convincing examples of them being useful.
Well I don’t know that I’ve got any “rationalist” cred, but as someone who at least attempts to approach life rationally, I am personally terrified by the prospect of being part of a cult because of the way cults seem to warp people’s capacity for thinking straightforwardly about reality. (And I could easily lump “religion” in with “cult” here in that regard).
Basically, I don’t like the way things I’d call “cultish” seem to disconnect people from concrete reality in favor of abstractions. I’ve seen some truly awful things happen as a result of that sort of mindset, and have also myself experienced an attempt at “indoctrination” into a sort of cult, and it was one of the worst experiences of my life. A person I knew and thought I could trust, and who seemed smart and reasonable enough, one day managed to trap me in an office under false pretenses and basically sat there berating me and telling me all kinds of horrible things about my character for two hours. And by the end of it, I was halfway ready to believe it, and my confidence and ability to do my schoolwork (this was in college) suffered for months afterward.
So I’m terrified of cults because I know how normal and reasonable their agents can seem at first, and how perfectly horrendous it is to find out what’s actually going on, and how difficult it can be afterward to pick up the pieces of your brain and go forward with your life. I don’t give a crap about the social-status stuff (well, beyond not wanting to be harassed, if that counts), I just don’t want anyone messing with my mind.
“A sort of cult,” but not a cult, full stop. Multi-level marketers? I have seen some hideous zombification in that context.
It was some kind of “neurolinguistic programming” thing. This particular incarnation of it entailed my first being yelled at until “[my] defenses were stripped away”, at which point I was supposed to accept this guy as a “master”. Later on it supposedly involved attending weird summer-camp type sessions where I was told people would undergo things that “felt like torture” but which they’d “come to appreciate”.
I didn’t go to any camp sessions and probably wouldn’t have attended them anyway for sheer lack of logistical finesse, but I am glad I had a co-worker point out to me that what was happening to me was emotional abuse at the very least.
That sounds more like est or Landmark/Forum or even Scientology… but it’s nonetheless an LGAT (large-group awareness training, basically a synonym for cult indoctrination).
Legitimate NLP training doesn’t involve students getting yelled at, even offhandedly, let alone in any sort of systematic way. Anybody who claims to be teaching NLP in such a fashion needs to be reported to the organization that issued their certification, and then to the Society of NLP (so the organization’s trainer-training certification can be revoked, if they don’t revoke the trainer’s cert).
(That link goes to a particular training organization, but I don’t have any connection to them or offer any particular endorsement; it’s just a page with good buyers’ guidelines for ANY sort of training, not just NLP. I’d also add that a legitimate NLP trainer will generally have enough work teaching paying customers to have neither time nor reason to subject people to unsolicited “training”.)
I don’t get where the sheer power of this fear comes from.
Status/self-image fears are among the most powerful human fears… and the status-behavior link is learned. (In my work, I routinely help people shed these sorts of fears, as they’re a prominent source of irrationality, stress, procrastination… you name it.)
Basically, you experience one or more situations (most often just one) where a particular behavior pattern is linked to shaming, ridicule, rejection, or some other basic social negative reinforcer. It doesn’t even have to happen to you directly; just observing the response to someone else’s behavior can be enough. Under stress, you then make a snap judgment as to what caused the situation, and learn to do TWO things:
1. To internalize the same response toward yourself if you express that behavior, and
2. To have the same response to others who exhibit that behavior.
It also works in reverse: if somebody does something bad to you, you learn to direct anger or attempts at ridicule towards that behavior, and also against yourself, as a result of “judging” the behavior itself to be bad, and a marker of a specific social group or class of people.
This can then manifest in odd ways, like not wanting to exhibit behaviors that would mark you as a member of the group you dislike.
One of the prime issues for me as a rationalist trying to learn about marketing (especially direct/internet marketing) was having to get over the fear of being a “dupe” pulled into a “scam” and “cult” situation. Essentially, if you have learned that some group you scorn (e.g. “suckers” or “fools” or whatever you call them) exhibit joining behavior, then you will compulsively avoid that behavior yourself.
I got over it, of course, but you have to actually be self-aware enough to realize that you chose this attitude/behavior for yourself… although it usually happens at a young enough age and under stressful enough conditions that you weren’t thinking very clearly at the time.
But once you’ve examined the actual evidence the original judgment was based on, it’s possible to let go of the judgments involved, and then the feelings go away.
I wish I understood why some rationalists find so terrifying the prospect that they might be part of a cult.
For one thing, it would mean that they’ve been wearing a clown suit for years – and a sort of clown suit that a large part of their identity is defined in opposition to. How humiliating is that?
Ditto fear of being scammed by cryonics, which people seem to regularly treat as the worst thing that could possibly happen. Bad not to conform in belief, worse to be (exposed as) a nonconforming exploitable moron.
Note that hindsight bias can be expected to make being scammed/joining a cult look more moronic than it actually was, and the fundamental attribution error can be expected to make this reflect more badly on the actor than it should.
This still leaves your point that “the possibility of humanity being wiped out seems to have less psychological force than the opportunity to lose five pounds”, but near/far probably accounts sufficiently for that.
For Americans (and the cryonics organizations are American) some special factors apply. David Brin has some nice discussion of the ubiquitous pro-individualism propaganda permeating American print and electronic media. Religion is unusually common and powerful in the U.S., so rationalists there have more negative affect towards it, and towards anything that resembles it even slightly.
Presumably the minority of people who for whatever reason strongly feel this way (whether rightly or wrongly), are the most likely to self-identify as rationalists.