This might be easier to see when you consider how, from an outside perspective, many behaviors of the Rationality community that are, in fact, fine might seem cultish. Consider, for example, the numerous group houses, the hero-worship of Eliezer, the tendency among Rationalists to hang out only with other Rationalists, the literal take-over-the-world plan (AI), the prevalence of unusual psychological techniques (e.g., rationality training, circling), and the many other unusual cultural practices common in this community. To the outside world, these are cult-like behaviors. They do not seem cultish to Rationalists because the Rationality community is a well-liked ingroup and not a distrusted outgroup.
I think there’s actually been a whole lot of discourse and thought about Are Rationalists A Cult, focusing on some of this same stuff? I think the most reasonable and true answers to this are generally along the lines of “the word ‘cult’ bundles together some weird but neutral stuff and some legitimately concerning stuff and some actually horrifying stuff, and rationalists-as-a-whole do some of the weird neutral stuff and occasionally (possibly more often than population baseline but not actually that often) veer into the legitimately concerning stuff and do not really do the actually horrifying stuff”. This post, as I read it, is making the case that Leverage veered far more strongly into the “legitimately concerning” region of cult-adjacent space, and perhaps made contact with “actually horrifying”-space.
Notably, some of your examples are, imo, actually bad? “Hero-worship of Eliezer” is bad, and also happily is not really much of a thing in at least the parts of ratspace I hang out in; “the tendency of rationalists to hang out only with other rationalists” is also not great, and if taken to an extreme would be a pretty worrying sign, but in fact most rationalists I know do maintain social ties (including close ones) outside this group.
Unusual rationalist psychological techniques span a pretty wide range, and I have sometimes heard descriptions of such techniques/practices/dynamics and been wary or alarmed, and talked to other rationalists who had similar reactions (which I say not to invoke the authority of an invisible crowd that agrees with me, but to note that rationalists do sometimes have negative “immune” responses to practices invented by other rationalists, even when those practices aren’t associated with a specific disliked subgroup). Sort of similarly re: the “take over the world plan”, I do not really know enough about any specific person or group’s AI-related aspirations to say how fair a summary that is, but… I think the fairer a summary it is, the more potentially worrying that is?
Which is to say, I do think that there are pretty neutral aspects of the rationalist community (the group houses, the weird ingroup jargon, the enthusiasm for making everything a ritual) that may trip people’s “this makes me think of cults” flag but are not actually worrying. But I don’t think this means that rationalists should turn off their, uh, cult-detectors? Central examples of cults do actually cause harm, and we do actually want to avoid those failure modes.
There is a huge difference between “tendency to hang out with other Rationalists” and having mandatory therapy sessions with your supervisor or having to ask for permission to write a personal blog.