I blame social status. Well, I blame social status and other primate tribal psychology for most biases people have. You’re basically accepting Eliezer as your personal guru and tribal leader, and following him mindlessly, especially when others seem to be doing so too. This worked great when you were trying to get your group into power in a tribe; it’s a pretty stupid thing to do these days.
I wish I understood why some rationalists find so terrifying the prospect that they might be part of a cult. Or why they so desperately (and publicly!) reject the prospect that anyone might see them as having a leader. I mean, I understand that there’s a normative component to both, but I don’t get where the sheer power of this fear comes from. It’s probably an important aspect of our lack of coordination, but I don’t understand it.
There are some good reasons for being terrified. We are tribal animals. We don’t really care about the truth as such, but we care a lot about tribal politics. We can pursue truth when we have a very high degree of disinterest in what the truth will turn out to be, but that’s a really exceptional situation. When we care about the shape of the truth, we lose a lot of rationality points, and the forces of tribal politics make us care a lot about things other than truth. It might be our strongest instinct, even stronger than individual survival or the sex drive.
I agree with you that it has its downsides, but I really don’t see how you can accept the politics and stay rational. I cannot think of many examples of that.
I’m also really disappointed by the sheer number of status indicators all over Less Wrong: top contributors on every page, your social status points (karma) on every page, user names and points on absolutely everything, vote up / vote down. You might think we’re doing fine, but so was reddit when it was tiny; let’s see how it scales up. I think we should get rid of as much of that as we can. Reddit’s quality of discussion is a lot lower than 4chan’s, even though it’s much smaller.
And this is a great example of something you once posted about: different people are annoyed by different biases. You seem to think social status and politics are mostly harmless and may even be useful; I think they’re the worst poison for clear rational thinking, and I haven’t seen many convincing examples of them being useful.
Well I don’t know that I’ve got any “rationalist” cred, but as someone who at least attempts to approach life rationally, I am personally terrified by the prospect of being part of a cult because of the way cults seem to warp people’s capacity for thinking straightforwardly about reality. (And I could easily lump “religion” in with “cult” here in that regard).
Basically, I don’t like the way things I’d call “cultish” seem to disconnect people from concrete reality in favor of abstractions. I’ve seen some truly awful things happen as a result of that sort of mindset, and have also myself experienced an attempt at “indoctrination” into a sort of cult, and it was one of the worst experiences of my life. A person I knew and thought I could trust, and who seemed smart and reasonable enough, one day managed to trap me in an office under false pretenses and basically sat there berating me and telling me all kinds of horrible things about my character for two hours. And by the end of it, I was halfway ready to believe it, and my confidence and ability to do my schoolwork (this was in college) suffered for months afterward.
So I’m terrified of cults because I know how normal and reasonable their agents can seem at first, and how perfectly horrendous it is to find out what’s actually going on, and how difficult it can be afterward to pick up the pieces of your brain and go forward with your life. I don’t give a crap about the social-status stuff (well, beyond not wanting to be harassed, if that counts), I just don’t want anyone messing with my mind.
It was some kind of “neurolinguistic programming” thing. This particular incarnation of it entailed my first being yelled at until “[my] defenses were stripped away”, at which point I was supposed to accept this guy as a “master”. Later on it supposedly involved attending weird summer-camp type sessions where I was told people would undergo things that “felt like torture” but which they’d “come to appreciate”.
I didn’t go to any camp sessions and probably wouldn’t have attended them anyway for sheer lack of logistical finesse, but I am glad I had a co-worker point out to me that what was happening to me was emotional abuse at the very least.
That sounds more like est or Landmark/Forum or even Scientology… but it’s nonetheless an LGAT (large-group awareness training—basically a synonym for cult indoctrination).
Legitimate NLP training doesn’t involve students getting yelled at, even offhandedly, let alone in any sort of systematic way. Anybody who claims to be teaching NLP in such a fashion needs to be reported to the organization that issued their certification, and then to the Society of NLP (so the organization’s trainer-training certification can be revoked, if they don’t revoke the trainer’s cert).
(That link goes to a particular training organization, but I don’t have any connection to them or offer any particular endorsement; it’s just a page with good buyers’ guidelines for ANY sort of training, let alone NLP. I’d also add that a legitimate NLP trainer will generally have enough work teaching paying customers, to have neither time nor reason to subject people to unsolicited “training”.)
I don’t get where the sheer power of this fear comes from.
Status/self-image fears are among the most powerful human fears… and the status-behavior link is learned. (In my work, I routinely help people shed these sorts of fears, as they’re a prominent source of irrationality, stress, procrastination… you name it.)
Basically, you experience one or more situations (most often just one) where a particular behavior pattern is linked to shaming, ridicule, rejection, or some other basic social negative reinforcer. It doesn’t even have to happen to the person directly; it can just be an observation of the response to someone else’s behavior. Under stress, the person then makes a snap judgment as to what the causes of the situation were, and learns to do TWO things:
1. To internalize the same response toward themselves if they express that behavior, and
2. To have the same response to others exhibiting that behavior.
It also works in reverse—if somebody does something bad to you, you learn to direct anger or attempts at ridicule towards that behavior, and also against yourself, as a result of “judging” the behavior itself to be bad and a marker of a specific social group or class of people.
This can then manifest in odd ways, like not wanting to exhibit behaviors that would mark you as a member of the group you dislike.
One of the prime issues for me as a rationalist trying to learn about marketing (especially direct/internet marketing) was having to get over the fear of being a “dupe” pulled into a “scam” and “cult” situation. Essentially, if you have learned that some group you scorn (e.g. “suckers” or “fools” or whatever you call them) exhibit joining behavior, then you will compulsively avoid that behavior yourself.
I got over it, of course, but you have to actually be self-aware enough to realize that you chose this attitude/behavior for yourself… although it usually happens at a young enough age and under stressful enough conditions that you weren’t thinking very clearly at the time.
But once you’ve examined the actual evidence used, it’s possible to let go of the judgments involved, and then the feelings go away.
I wish I understood why some rationalists find so terrifying the prospect that they might be part of a cult.
For one thing, it would mean that they’ve been wearing a clown suit for years – and a sort of clown suit that a large part of their identity is defined in opposition to. How humiliating is that?
Ditto fear of being scammed by cryonics, which people seem to regularly treat as the worst thing that could possibly happen. Bad not to conform in belief, worse to be (exposed as) a nonconforming exploitable moron.
Note that hindsight bias can be expected to make being scammed/joining a cult look more moronic than it actually was, and the fundamental attribution error can be expected to make this reflect more badly on the actor than it should.
For Americans (and the cryonics organizations are American) some special factors apply. David Brin has some nice discussion of the ubiquitous pro-individualism propaganda permeating American print and electronic media. Religion is unusually common and powerful in the U.S. so rationalists have more negative affect towards it and anything that resembles it even slightly.
Presumably the minority of people who for whatever reason strongly feel this way (whether rightly or wrongly), are the most likely to self-identify as rationalists.
Meh. Paul Graham’s blog doesn’t allow comments. Neither does Stallman’s. And if you read OB via an RSS feed, there is no indication anywhere that other people are following along. And believing Eliezer is smart and right about a bunch of things doesn’t mean mindlessly following him on everything.
It doesn’t matter that Paul Graham and Stallman don’t allow comments. People know them, they have very high reputations and plenty of fanboys, and all of that makes them high-social-status individuals. Mindlessly following the leader is not the same as mindlessly following the group; both are real and distinct behaviours.
People feel differently reading something by Paul Graham and something by a blogger they’ve never heard of. You might have gotten so used to social status indicators that you don’t consciously see them. Go to 4chan (not /b/) and see what discussion is like without them. It is actually surprisingly good.
People feel differently reading something by Paul Graham and something by a blogger they’ve never heard of.
Which people, and how do you know? The first time I read a PG essay, I’d never heard of him. I think you’re confusing cause and effect about people following—at least where some people are concerned. PG, RMS, and EY aren’t convincing because they have followers; they have followers because they’re convincing.
Now, if you’re saying the status indicators are in their writing, then that’s another story. It’s arguably a status symbol merely to speak possibly-unpopular and/or weird opinions in an authoritative voice, without weaseling, implying that you’re a persecuted minority, or even so much as dignifying the possibility that people might disagree with you.
This is mostly agreeing with the same point, but I’m going to say it anyway because I think it’s important.
I stumbled on Eliezer’s writing fairly randomly (a link to OB as an interesting blog). I was immediately sucked in. In fact, I was discussing the subject of modern-day genius with a friend, and after having read two or three of his posts, I sent my friend a link saying something like “this Eliezer guy seems like a pretty legit modern genius.” [He replied with “psshhh… he’s just working in a hyped-up field.” (I don’t think he really read the posts.)] I had absolutely no idea of the depth of his ideas, nor any of the broader social context, at the time. I just knew it was making sense.
Same with Paul Graham. I stumbled on his website even more randomly: I did a Google search for “procrastination” while procrastinating one night. And I was hooked. Again, I had no idea about his accomplishments or social status or associations; I just knew that his writing resonated with me.
What it is for me is a deep connection with the ideas in the writing. It’s not just a matter of “hmm… interesting idea,” but rather “WOW. That’s EXACTLY how I feel. But explained so much more clearly.”
I could lump Ayn Rand into the same group to an extent.
I agree that the “cultishness” is somewhat disconcerting. But I think there’s much more to it than that. I think it’s very telling that three of the writers whose work has deeply resonated with me philosophically, writers I came across through completely different means, have all been mentioned in the comments on this post. I suspect that people are predisposed to a certain way of understanding the world, and when they find ideas that resonate with that understanding, they latch on. It’s just that some people are much better at communicating these ideas, or make the effort to communicate them.
(This comment opens a can of worms, as it could imply that there are various correct ways of understanding the world, and that rationalism is not necessarily THE way. But perhaps certain people are more predisposed to the idea of rationalism? And perhaps it is THE way, but certain people can just never come close enough to overcoming the views of the world imposed by their upbringing for the ideas to resonate with them?)
Either way, my main point is that it’s not just a matter of blind worship.
“A sort of cult,” but not a cult, full stop? Multi-level marketers? I have seen some hideous zombification in that context.
This still leaves your point that “the possibility of humanity being wiped out seems to have less psychological force than the opportunity to lose five pounds”, but near/far probably accounts sufficiently for that.