I’ve referenced this post, or at least this concept, in the past year. I think it’s fairly important, and I’ve definitely seen this dynamic. I’ve felt it as a participant who very much wants a responsible authority figure to look up to and follow, and I’ve seen it in how people respond to various teacher-figures in the rationalsphere.
I think the rationalsphere lucked out in its founding members being pretty wise and going out of their way to ameliorate a lot of these effects, and yet those people still end up getting treated in a weird cult-leader-y way even when they’re trying not to be. (I recall one community leader telling me, 8 years ago, “look, I don’t know anything, please don’t overupdate on what I say,” and somehow that only made them seem even wiser to me, and made me treat them even more like Yoda.)
My thoughts on it are somewhat connected to “In Defense of Attempting Hard Things” and the discussion surrounding Leverage Research (which was triggered by a particular writeup by Zoe, but which I think drew on a broader set of pent-up frustrations). The rationalsphere/longtermist/EA ecosystem is trying to do pretty hard things. Pretty hard things often require both commitment/dedication and a willingness to try weird strategies. This combination tends to produce cults or cult-adjacent things as a byproduct, which is worrisome and bad, but, man, it’s still important to actually try the hard things.
The “hard things / weirdness” → “cultishness” model is separate from the model in this post, but the fact that (I think) Hard Weird Communities are important makes the failure modes described in the OP more costly.
This year I ran into a person who seemed to be accidentally attracting a cult around them, despite seeming really innocuous in a lot of ways. I don’t think I directly referred them to this post, but having the concept handy when talking to them was useful.