I think we should be careful about using the word “cult” as a mental stop sign, since that does seem to be what’s happening here. We shouldn’t label something with all the bad connotations of a cult just because it has some of the properties of a cult, especially if it only seems to have the good ones. But that doesn’t mean those good cult properties won’t lead to the bad ones we don’t want. You should be more explicit about what and how, because I’m wavering back and forth between this article being a really, really good idea (the benefits of this plan are obvious!) and a really, really scary and bad idea (if I do it, it’ll make me part of a groupthinky monster!).
The problem I have is that both sides in my own head seem to be driven by their own cognitive biases: the cult attractor on one hand, and the accidental negative connotations and stop-sign nature of the word “cult” on the other. So if you could show, even semi-explicitly, why adopting the idea this article puts forth would lead to specific serious negative consequences, that would clear up my indecision and confusion.
I’m afraid my comments were mostly driven by an inarticulate fear of cults and of association with a group as cultish as the Mormons. But one specific thing I already said I’m afraid of is LW becoming a “rational” community instead of a rational community, differing from other communities only in the flag it rallies around.
You know what… I was missing the “look for a third option” bit. There are more options than the two obvious ones—doing this, and not doing this.
I’ve been having trouble making myself do the rationalist-y projects I came up with for myself, and this article suggested a way to use group pressures to make me do them. Since I really wanted a way to make myself do these projects, the article seemed like a really, really good idea. But instead of having the whole community do this, I can just ask some fellow rationalists at my meetup to apply it to me alone. That way I can use group pressures to help impose my own rationally determined second-order desires on myself. The only thing I think I lose this way is the motivation that comes from a sense of community, where everyone else is doing it too...
Of course, this still doesn’t resolve the problem of whether or not the community at large should adopt the ideas put forth in this article. I still can’t seem to think rationally about it. But at least this is a way for me to get what I want without having to worry about the negative side effects of the whole community adopting this policy.
But one specific thing I already said I’m afraid of is LW becoming a “rational” community instead of a rational community, differing from other communities only in the flag it rallies around.
If you took a typical community and replaced its flag with one that said “be rational”, would you expect the effect to be positive, negative, or neutral?
You might think that a belief system which praised “reason” and “rationality” and “individualism” would have gained some kind of special immunity, somehow...?
Well, it didn’t.
It worked around as well as putting a sign saying “Cold” on a refrigerator that wasn’t plugged in.
Conjecture: Sufficiently dedicated groups that do not take measures against “bad cult properties” will fall down the cult attractor. So if you want a group to not fall down the attractor, you have to think about bad cult properties and how to avoid them.
Various folks have come up with lists of just what “bad cult properties” are; one of my favorites is Isaac Bonewits’ “ABCDEF”. Bonewits’ motivation appears to have been to help people be more comfortable involving themselves in unusual groups (he was a neopagan leader) by spelling out what sorts of group behavior were actually worth worrying about.
I won’t repeat Bonewits’ list here. I think it’s worth noting, though, that several of the properties he outlines could be described as anti-epistemology in practice.
Having read the list, LW-as-it-was-last-week-and-presumably-is-now seems to be unsurprisingly good at not being a cult. It does occur to me that we might want to take a close look at how the incentives offered by the group to its members will change if we switch to a more recruitment-oriented mode, though.
Conjecture: Sufficiently dedicated groups that do not take measures against “bad cult properties” will fall down the cult attractor. So if you want a group to not fall down the attractor, you have to think about bad cult properties and how to avoid them.
Yeah, I figured I wasn’t going to be too worried about LW’s cultishness unless/until rules for sexual behavior got handed down, from which Eliezer was exempt.
I would have thought so too, but a lot of people here are obviously loving this stuff.
And that’s kind of frightening too. I don’t think it’s too much of an exaggeration to say that this stuff is basically a cult roadmap.
If you took a typical community and replaced its flag with one that said “be rational”, would you expect the effect to be positive, negative, or neutral?
I don’t really know, but I’ll note that Scientologists are known to laud “sanity”, and Objectivists were all about “reason”.
Rationality flags don’t seem to help that much.
No, I wouldn’t.
Yeah, I figured I wasn’t going to be too worried about LW’s cultishness unless/until rules for sexual behavior got handed down, from which Eliezer was exempt.
Could you deconstruct this for me? I kind of understand where you’re coming from, but I’d like to see your exact reasoning.