I think the way the term cult (or euphemisms like “high-demand group”) has been used by the OP and by many commenters in this thread is extremely unhelpful and, I suspect, not in keeping with the epistemic standards of this community.
At its core, labeling a group as a cult is an out-grouping power move used to distance the audience from that group’s perspective. You don’t need to understand their thoughts, explain their behavior, form a judgment on their merits. They’re a cult.
This might be easier to see when you consider how, from an outside perspective, many behaviors of the Rationality community that are, in fact, fine might seem cultish. Consider, for example, the numerous group houses, the hero-worship of Eliezer, the tendency among Rationalists to hang out only with other Rationalists, the literal take-over-the-world plan (AI), the prevalence of unusual psychological techniques (e.g., rationality training, circling), and the large number of other unusual cultural practices that are common in this community. To the outside world, these are cult-like behaviors. They do not seem cultish to Rationalists because the Rationality community is a well-liked ingroup and not a distrusted outgroup.
My understanding is that historically the Rationality community has had some difficulty in protecting itself from parasitic bad actors who have used their affiliation with this community to cause serious harm to others. Given that context, I understand why revisiting the topic of early Leverage might be compelling. I would suggest that the cult/no cult question will not be helpful here because the answer depends so largely on whether people liked or didn’t like Leverage. I think past events should demonstrate that this is not a reliable indicator of parasitic bad actors.
Some questions I would ask instead include: Did this group represent that they were affiliated with Rationality in order to achieve their ends? If so, did they engage in activities that are contrary to the norms of the Rationality community? Were people harmed by this group? If so, was that harm abnormal given the social context? Was that harm individual or institutional? Did those involved act responsibly given the circumstances? Etc.
Given my knowledge of Leverage 1.0 and my knowledge of the Rationality community, I am quite confident that Leverage was not the parasitic bad actor that you are looking for, but I think this is something the Rationality community should determine for itself and this seems like a fine time to do so.
However, I would also like to note that Leverage 1.0 has historically been on the receiving end of substantial levels of bullying, harassment, needless cruelty, public ridicule, and more by people who were not engaged in any legitimate epistemic activity. I do not think this is OK. I intend to call out this behavior directly when I see it. I would ask that others do so as well.
(I currently work at Leverage Research but did not work at Leverage during Leverage 1.0, although I interacted with Leverage 1.0 and know many of the people involved. Before working at Leverage I did EA community building at CEA between summer 2014 and early 2019.)
I think there’s actually been a whole lot of discourse and thought about Are Rationalists A Cult, focusing on some of this same stuff? I think the most reasonable and true answers to this are generally along the lines of “the word ‘cult’ bundles together some weird but neutral stuff and some legitimately concerning stuff and some actually horrifying stuff, and rationalists-as-a-whole do some of the weird neutral stuff and occasionally (possibly more often than population baseline but not actually that often) veer into the legitimately concerning stuff and do not really do the actually horrifying stuff”. This post, as I read it, is making the case that Leverage veered far more strongly into the “legitimately concerning” region of cult-adjacent space, and perhaps made contact with “actually horrifying”-space.
Notably out of your examples, some are actually bad imo? “Hero-worship of Eliezer” is imo bad, and also happily is not really much of a thing in at least the parts of ratspace I hang out in; “the tendency of rationalists to hang out with only other rationalists” is I think also not great and I think if taken to an extreme would be a pretty worrying sign, but in fact most rationalists I know do maintain social ties (including close ones) outside this group.
Unusual rationalist psychological techniques span a pretty wide range, and I have sometimes heard descriptions of such techniques/practices/dynamics and been wary or alarmed, and talked to other rationalists who had similar reactions (which I say not to invoke the authority of an invisible crowd that agrees with me but to note that rationalists do sometimes have negative “immune” responses to practices invented by other rationalists even if they’re not associated with a specific disliked subgroup). Sort of similarly re: “take over the world plan”, I do not really know enough about any specific person or group’s AI-related aspirations to say how fair a summary that is, but… I think the more a fair summary it is, the more potentially worrying that is?
Which is to say, I do think that there are pretty neutral aspects of rationalist community (the group houses, the weird ingroup jargon, the enthusiasm for making everything a ritual) that may trip people’s “this makes me think of cults” flag but are not actually worrying, but I don’t think this means that rationalists should turn off their, uh, cult-detectors? Central-examples-of-cults do actually cause harm, and we do actually want to avoid those failure modes.
There is a huge difference between “tendency to hang out with other Rationalists” and having mandatory therapy sessions with your supervisor or having to ask for permission to write a personal blog.
Yeah, ‘cult’ is a vague term often overused. Yeah, a lot of rationality norms can be viewed as cultish.
How would you suggest referring to an ‘actual’ cult—or, if you prefer not to use that term at all, how would you suggest we describe something like Scientology or NXIVM? Obviously those are quite extreme, but I’m wondering if there is ‘any’ degree of group-controlling traits that you would be comfortable assigning the word cult to? Or if I refer to Scientology as a cult, do you consider this an out-grouping power move used to distance people from Scientology’s perspective?
This strikes me as an obviously good question and I’m surprised it hasn’t been answered.
I think the way the term cult (or euphemisms like “high-demand group”) has been used by the OP and by many commenters in this thread is extremely unhelpful and, I suspect, not in keeping with the epistemic standards of this community.

No. As demonstrated by this comment by Viliam, the word “cult” refers to a well-defined set of practices used to break people’s ability to think rationally. Leverage does not deny using these practices. To the contrary, it appears flagrantly indifferent to the abuse potential. Cult techniques of brainwashing are an attractor of human social behavior. Eliezer Yudkowsky warned about this attractor. Your attempt to redefine cult more broadly is a signal you’re bullshitting us.
It’s useful to be able to conceptualise something that is 50% or 90% of the way to becoming a cult, because then you can jump off.
Leverage is not doing everything that Viliam described in his post.
Your mind belongs to the group: In the description above there’s no mention of people needing to confess sins.
A sacred science: Leverage’s intellectual environment did allow for doubts.
Map over the territory: There’s no assertion of that in the common-knowledge facts, and I doubt it’s true for Leverage.
They call it “Belief Reporting”; it’s described in one of the documents that were removed from the Internet Archive. The members are (were?) supposed to do it regularly with their manager. That is like “auditing” in Scientology, except instead of using an e-meter they rely on nerds being pathologically honest.
There’s no inherent need in belief reporting to confess having violated rules or committed sins.
It’s a debugging technique, and while you could use any debugging technique to probe whether someone has committed sins, no one here with closer information about Leverage has charged that they did that.
Scientology actually does force people to confess sins when they commit what it considers ethics violations (Scientology calls its code of conduct “ethics”).
Anyone involved in Scientology would easily classify what Scientology does as including a need to confess sins. On the other hand, that’s far from how the participants of belief reporting sessions at Leverage likely thought about it. At the moment there’s no source saying that anybody at Leverage got the impression that this is what happened to them.
It’s quite toxic for rational discussion to make those accusations instead of focusing on the facts that are actually out in the open.
What’s the content of belief reporting?
I learned belief reporting from a person who attended a Leverage workshop and haven’t had any direct face-to-face exposure to Leverage.
Belief reporting is a debugging technique. You have a personal issue you want to address. Then you look at related beliefs.
Leverage found that if someone sets an intention of “I will tell the truth” and then speaks out a belief like “I’m a capable person” that they don’t actually believe (at a deep level), they will have a physical sensation of resistance.
Afterwards, there’s an attempt to trace the belief to its roots. The person can then speak out various forms of “I’m not a capable person because X” and “I’m not a capable person because Y”, and the process gets applied recursively to seek out the root. Often that uncovers some confusion at the base of the belief, and once the confusion is uncovered it’s possible to work back up the tree to get rid of the “I’m not a capable person” belief and switch it into “I’m a capable person”.
This often leads to discovering that one holds beliefs at a deep level that one’s System II considers silly but that still form the base of other beliefs and affect one’s actions.
Thanks for the description!
In my opinion, this sounds interesting as confidential, voluntary therapy, but Orwellian when:

Members who were on payroll were expected to undergo charting/debugging sessions with a supervisory “trainer”, and to “train” other members. The role of trainer is something like “manager + therapist”: that is, both “is evaluating your job performance” and “is doing therapy on you”.
So, your supervisor is debugging your beliefs, possibly related to your job performance, and you are supposed to not only tell the truth, but also “seek for the root”… and yet, in your opinion, this does not imply “having to confess violation of the rules or committed sins”?
What exactly happens when you start having doubts about the organization or the leader, and as a result your job performance drops, and then you are having the session with your manager? Would you admit, truthfully, “you know, recently I started having some doubts about whether we are really doing our best to improve the world, or just using the effective altruist community as a fishing pond for people who are idealistic and willing to sacrifice… and I guess these thoughts distract me from my tasks”, and then your therapist/manager is going to say… what?
Nothing written above suggests that doubt about the central strategy would have been seen as a sin, especially when it isn’t necessarily System II endorsed. It’s my understanding that talking about the theories of change through which Leverage is going to have an effect on the world was one of the main activities Leverage engaged in.
Besides, the word “sin” generally refers to actions that violate the norms of an organization. In the Scientology context, for example, it’s a sin to watch a documentary about Scientology on normal TV. In Christianity, masturbation would be a sin.
Leverage doesn’t have a similar behavior code that declares certain actions to be sins that have to be confessed.
Role conflicts between being a manager and being a therapist can easily produce problems, but analyzing them through a frame of “confessing sins” is not a useful lens for thinking coherently about the problems involved.
Interesting, thanks!
You missed the part where this person was pointing out that there is Deliberately Vague Language used by the OP. Imo, this language doesn’t create enough of a structure for commenters to construct an adequate dialogue about several sub-topics in this thread.
Also, what’s “flagrantly indifferent” about Larissa wanting to hear out people who feel wronged?
You seem to be quite upset by all of this, why not reach out and let her know?
Nah, he’s alright. If someone calls a cult a cult, that’s not a reason to call them upset. Plus, he writes about plenty of other things; you’re the one with the new account made only to defend Leverage.
you’re the one with the new account made only to defend Leverage

The social pressure against defending Leverage is in the air, so anonymity shouldn’t be held against someone who does that; it’s already bad enough that there is a reason for anonymity.
If questioning the “rationality” of the discourse is defending them, then what do you suppose you’re doing?
I just don’t see the goals or values of this community reflected here, and it confuses me. That’s why I made this account—to get clarity on what seems to me to be a total anomaly in how rationalist community members (at least as far as signaling goes, I guess) conduct themselves.
Because I’ve only seen what is classifiable as a hysterical response to this topic, the Leverage topic.