I basically agree with this.
But also, I think pretty close to ZERO people who were deeply affected (aside from Zoe, who hasn’t engaged beyond the post) have come forward in this thread. And I… guess we should talk about that.
I know firsthand that there were some pretty bad experiences in the incident that tore Leverage 1.0 apart, which nobody appears to feel able to talk about.
I am currently not at all optimistic that we’re managing to balance this correctly? I also want this to go right. I’m not quite sure how to do it.
That’s pretty fair. I am open to taking down this comment, or other comments I’ve made. (Not deleting them forever; I’ll save them offline or something.) Your feedback is helpful here and revealing to me, and I feel myself updating because of it.
I have commented somewhere else that I do not like LessWrong for this discussion… because:
a) It seems like a bad venue for justice to be served,
b) It strips out a bunch of context that I personally think is super relevant (including the emotional and physical layers), and
c) LW is absolutely not a place designed for healing or reconciliation… and it also seems only ‘okay’ for sense-making as a community. It is maybe better for sense-making at the individual intellectual level.
So… I guess LW isn’t my favorite place for this discussion to be happening… I wonder what you think.
(Separately) I care about folks from Leverage. I am very fond of the ones I’ve met. Zoe charted me once, and I feel fondly about that. I’ve been charted a number of times at Leverage, and it was good; I personally love CT charting / Belief Reporting, and I use, reference, and teach it to others to this day (although it’s my own version now). I went to a Paradigm workshop once, and attended several parties and gatherings.
My felt sense of my time at the workshop (especially during more casual hang-out-y parts of it) is like a sense of sad distance… like, oh I would like to be friends with these people… but mentally / emotionally they seem “hard to access.”
I’m feeling compassion towards the ones who have suffered and are suffering. I don’t need to be personal friends with anyone, but … if there’s a way I can be of service, I am interested.
Open and free invitation: If anyone involved in the Leverage stuff in some way wants someone to hold space for you as you process things, I am open to offering that, over Zoom, in a confidential manner. (I am not very involved in the community normally, as I am committed to being at the Monastic Academy in Vermont for a long while, and I don’t engage in divisive / gossipy speech. It is wrong speech :P) Cat would probably vouch for me. But basically, uhh, even if what you want to say would normally sound totally crazy to most rationalists or even most Westerners, I have ventured so far outside the Overton window that I doubt I’ll be taken aback. If that helps. :P
You can FB msg me or gmail me (unrealeel).
Since it’s mostly just pointers to stuff I’ve already said/implied… I’ll throw out a quick comment.
I would like it if somebody started something like a carefully-moderated private Facebook group, mostly of core people who were there, to come to grips with their experiences? I think this could be good.
I am slightly concerned that people who are still in the grips of “Leverage PR campaigning” tendencies will start trying to take it over or otherwise poison the well? (Edit: Or conversely, that people who still feel really hurt or confused about it might lash out more than I’d wish. Personally, I am more worried about the former.) I still think it might be good, overall.
Be sure to be clear EARLY about who you are inviting, and who you are excluding! It changes what people are willing to talk about.
...I am not personally the right person to do this, though.
(It is too easy to “other” me, if that makes sense.)
I feel like one of the only things the public LW thread could do here?
Is to ensure public awareness of some of the unreasonably strong reality/truth-suppressive pressures that were at play here, and of the ways in which secrecy agreements were leveraged pretty badly to avoid accountability for harms, and to show a public ramp-down of opportunities to do so in the future.
Along with doing what we can to signal that we generally stand against people over-simplistically demonizing the people and organizations involved in this.
… unreasonably strong reality/truth-suppressive pressures that were at play here, and of the ways in which secrecy agreements were leveraged pretty badly to avoid accountability for harms …
Hmm. This seems worth highlighting.
The NDAs (plus the pressure to sign them) point to this.
…
(The rest of this might be triggering to anyone who’s been through gaslighting / culty experiences. Blunt descriptions of certain forms of control and subjugation.)
...
About the rest of the truth-suppressive measures, I can only speculate. Here’s a list of possible mechanisms that come to mind; some, but not all, were corroborated by Zoe’s report:
Group hazing or activities that cause collective shame, making certain things hard to admit to oneself and others (plus inserting a bucket error where ‘shameful activity’ is bucketed with ‘the whole project’ or something)
This could include implanting group delusions that are shameful to admit.
Threats to one’s physical person or loved ones for revealing things
Threats to one’s reputation or ability to acquire resources for revealing things
Deprivation used to negatively / positively reinforce certain behaviors or stories (“well, if you keep talking like that, we’re gonna have to take your phone / food / place to sleep”)
Gaslighting specific individuals or subgroups (“what you’re experiencing is in your own head; look at other people, they are doing fine, stop being crazy / stop ruining the vibe / stop blocking the project”)
A lot of things could fit into this category.
Causing dissociation. (Thus disconnecting a person from their true yes/no or making it harder for them to discern truth from fiction.) This is very common among modern humans, though, and doesn’t seem as evil-sounding as the other examples. Modern humans are already very dissociated afaict.
It would become more evil if it were intentionally exploited or amplified.
Dissociation could be generalized or selective. Selective seems more problematic because it could be harder to detect.
Pretending there is common knowledge or an obvious norm around what should be private / confidential, when there is not. (There is some of this going around rationalist spaces already.) “Don’t talk about X behind their back, that’s inappropriate.” or “That’s their private business, stay out of it.” <-- Said in situations where it’s not actually inappropriate, or where the claim that it’s someone’s ‘private business’ is overreaching.
Deliberately introducing and enforcing a norm of privacy or confidentiality that breaks certain normal and healthy social accountability structures. (Compassionate gossip is healthy in groups, especially those living in residential community. Rationalists seem not to get this, though, and tend to break Chesterton’s fence here, which I attribute to hubris. It seems worse to me if these norms are introduced out of self-serving fear.)
Sexual harassment, molestation, or assault. (This tends to result in silencing pretty effectively.)
Creating internal jockeying, using an artificial scarcity around status or other resources. A culture of one-upmanship. A culture of having to play ‘loyal’. People getting way too sucked into this game and having their motives hijacked: they internally align themselves with the interests of certain leaders or the group, such that secrecy becomes part of their internal motivation system.
This one is really speculative, but if I imagine buying into the story that Geoff is like, a superintelligence basically, and can somehow predict my own thoughts and moves before I can, then … maybe I get paranoid about even having thoughts that go against (my projection of) his goals.
Basically, if I thought someone could legit read my mind and they were not compassionate, or if I thought that they could strategically outmaneuver me at every turn due to their overwhelming advantage, that might cause some fucked up stuff in my head that stays in there for a while.
If this resonates with you, I am very sorry.
I welcome additions to this list.
“You can’t rely on your perspective / Everything is up for grabs.” All of your mental content (ideas, concepts, motions, etc.) is potentially good (and should be leaned on more heavily, overriding others) or bad (and should be ignored / downvoted / routed around / destroyed / pushed against); more openness to change is better, and there’s no solid place from which you can stand and see things. Of course, this is in many ways true and useful; but leaning into this creates much more room for others to selectively up/downvote stuff in you to keep you from reaching conclusions they don’t want you to reach; or, more likely, to up/downvote conclusions and have you rearrange yourself to harmonize with those judgements.
Trolling Hope placed in the project / leadership. Like: I care deeply that things go well in the world; the only way I concretely see that might happen is through this project; so if this project is doomed, then there’s no Hope; so I may as well bet everything on worlds where the project isn’t doomed; so worlds where the project is doomed are irrelevant; so I don’t see / consider / admit X if X implies that the project is doomed, since X is entirely about irrelevant worlds.
Emotional reward conditioning. (This one is simple or obvious, but I think it’s probably actually a significant portion of many of these sorts of situations.) When you start to say information I don’t like, I’m angry at you, annoyed, frustrated, dismissive, scornful, derisive, insulting, blank-faced, uninterested, condescending, disgusted, creeped out, pained, hurt, etc. When you start to hide information I don’t like, or expound the opposite, I’m pleasant, endeared, happy, admiring, excited, etc. etc. Conditioning shades into + overlaps other tactics like stonewalling (blank-faced, aiming at learned helplessness), shaming, and running interference (changing the subject), but conditioning has a particular systematic effect of making you “walk on eggshells” about certain things and feel relief / safety when you stick to appropriate narratives. And this systematic effect can be very strong and persist even when you’re away from the people who put it there, if you didn’t perfectly compartmentalize how-to-please-them from everything else in your mind.
Do you have a suggestion for another forum that you think would be better?
In particular, do you have pointers to online forums that do incorporate the emotional and physical layers (“in a non-toxic way”, he adds, thinking of Twitter)? Or do you think that the best way to do this is just not online at all?
CFAR’s recent staff reunion seemed to do all right. It wasn’t, like, optimized for safety or making sure everyone was heard equally or something like that, but such features could be added if desired. Having skilled third-party facilitators seemed good.
Oh you said ‘online’. Uhhh.
Online fishbowl Double Cruxes would get us like … 30% of the way there maybe? Private / invite only ones?
One could run an online Group Process-like thing too. Invite a group of people into a Zoom call and facilitate certain breakout sessions? Ideally with facilitation in each breakout group?
I am not thinking very hard about it.
We need a lot of skill points in the community to make such things go well. I’m not sure how many skill points we’re at.
Meta: I think it makes some good points. I do not think it was THAT bad, and I think the discussion was good. I would keep it up, but it’s your call. Possibly adding an “Edit: (further complicated thoughts)” at the top? (Respect for thinking about it, though.)