I don’t have a ton of detail on exactly what happened in each of the cases where someone seemed to have a really bad time, but having looked into each case for a few hours, I think all three of them were in pretty close proximity to having spent a bunch of time with Michael (and, in some of the cases, after taking psychedelic drugs).
Of the 4 hospitalizations and 1 case of jail time I know about, 3 of those hospitalized (including me) were talking significantly with Michael, and the others weren’t afaik (and neither were the 2 suicidal people), though obviously I couldn’t know about all conversations that were happening. Michael wasn’t talking much with Leverage people at the time.
I hadn’t heard of the statement about guillotines, that seems pretty intense.
I talked with someone recently who hadn’t been in the Berkeley scene specifically but who had heard that Michael was “mind-controlling” people into joining a cult, and decided to meet him in person, at which point he concluded that Michael was actually doing some of the unique interventions that can bring people out of cults, which often involve causing them to notice things they’re looking away from. It’s common for there to be intense psychological reactions to this (I’m not even thinking of the psychotic break as the main one, since that didn’t proximately involve Michael; there have been other conversations since then that have gotten pretty emotionally/psychologically intense), and it’s common for people to not want to have such reactions, although clearly at least some people think they’re worth having for the value of learning new things.
IIRC the person in the one case of jail time also had a substantial interaction with Michael relatively shortly before the psychotic break occurred, though someone else might have better info here and should correct me if I am wrong. I don’t know of any 4th case, so I believe you that they didn’t have much to do with Michael. This makes the current record 4⁄5 to me, which sure seems pretty high.
Michael wasn’t talking much with Leverage people at the time.
I did not intend to indicate that Michael had any effect on Leverage people, or to say that all or even a majority of the difficult psychological problems that people had in the community are downstream of Michael. I do think he had a large effect on some of the dynamics you are talking about in the OP, and I think any picture of what happened/is happening seems very incomplete without him and the associated social cluster.
I think the part about Michael helping people notice that they are in some kind of bad environment seems plausible to me, though it doesn’t have most of my probability mass (~15%); most of my probability mass (~60%) is indeed that Michael mostly just leverages the same common mechanisms for building a pretty abusive and cult-like ingroup, with some flavor of “but don’t you see that everyone else is completely crazy and evil” thrown into it.
I think it is indeed pretty common for abusive environments to start with “here is why your current environment is abusive in this subtle way, and that’s also why it’s OK for me to do these abusive-seeming things, because it’s not worse than anywhere else”. I think this was a really large fraction of what happened with Brent, and I also think a pretty large fraction of what happened with Leverage. I also think it’s a large fraction of what’s going on with Michael.
I do want to reiterate that I do assign substantial probability mass (~15%) to your proposed hypothesis being right, and am interested in more evidence for it.
IIRC the person in the one case of jail time also had a substantial interaction with Michael relatively shortly before the psychotic break occurred
I was pretty involved in that case after the arrest and for several months afterwards, and spoke to MV about it, and AFAICT that person and Michael Vassar only met maybe once, casually. I think he did spend a lot of time with others in MV’s clique, though.
Ah, yeah, my model is that the person had spent a lot of time with MV’s clique, though I wasn’t super confident they had talked to Michael in particular. I’m not sure whether I would still count this as being an effect of Michael’s actions; it seems murkier than I made it out to be in my comment.
I think one of the ways of disambiguating here is to talk to people outside your social bubble, e.g. people who live in different places, people with different politics, people in different subcultures or on different websites (e.g. Twitter or Reddit), people you run into in different contexts, people who have had experience in different mainstream institutions (e.g. different academic departments, startups, mainstream corporations). Presumably, the more of a culty bubble you’re in, the more prediction error this will generate, and the harder it will be to establish communication protocols across the gap. This establishes a point of comparison between people in bubble A vs. B.
I spent a long part of the 2020 quarantine period with Michael and some friends of his (and friends of theirs) who were previously in a non-bay-area cult, which exposed me to a lot of new perspectives I didn’t know about (not just theirs, but also those of some prison reform advocates and religious people), and made Michael seem less extremal or insular in comparison, since I wasn’t just comparing him to the bubble of people who I already knew about.
Hmm, I’ve tried to read this comment for something like 5 minutes, but I can’t really figure out its logical structure. Let me give it a try in a more written format:
I think one of the ways of disambiguating here
Presumably this is referring to distinguishing the hypothesis that Michael is kind of causing a bunch of cult-like problems from the hypothesis that he is helping people see problems that are actually present.
here is to talk to people outside your social bubble, e.g. people who live in different places, people with different politics, people in different subcultures or on different websites (e.g. Twitter), people you run into in different contexts, people who have had experience in different mainstream institutions. Presumably, the more of a culty bubble you’re in, the more prediction error this will generate, and the harder it will be to establish communication protocols across the gap.
I don’t understand this part. Why would there be a monotonic relationship here? I agree with the bubble part, and while I expect there to be a vague correlation, it doesn’t feel like it measures anything like the core of what’s going on. I wouldn’t measure the cultishness of an economics department based on how good they are at talking to improv students. It might still be good for them to get better at talking to improv students, but failure to do so doesn’t feel like particularly strong evidence to me (compared to other dimensions, like the degree to which they feel alienated from the rest of the world, or have psychotic breaks, or feel under a lot of social pressure to not speak out, or many other things that seem similarly straightforward to measure but feel like they get more at the core of the thing).
But also, I don’t understand how I am supposed to disambiguate things here? Like, maybe the hypothesis here is that by doing this myself I could understand how insular my own environment is? I do think that seems like a reasonable point of evidence, though I also think my experiences have been very different from people at MIRI or CFAR. I also generally don’t have a hard time establishing communication protocols across these kinds of gaps, as far as I can tell.
who were previously in a non-bay-area cult, which exposed me to a lot of new perspectives I didn’t know about (not just theirs, but also those of some prison reform advocates and religious people), and made Michael seem less extremal or insular in comparison, since I wasn’t just comparing him to the bubble of people who I already knew about.
This is interesting, and definitely some evidence, and I appreciate you mentioning it.
If you think the anecdote I shared is evidence, it seems like you agree with my theory to some extent? Or maybe you have a different theory for how it’s relevant?
E.g. say you’re an econ student, and there’s this one person in the econ department who seems to have all these weird opinions about social behavior and thinks body language is unusually important. Then you go talk to some drama students and find that they have opinions that are even more extreme in the same direction. It seems like the update you should make is that you’re in a more insular social context than the person with opinions on social behavior, who originally seemed to you to be in a small bubble that wasn’t taking in a lot of relevant information.
(basically, a lot of what I’m asserting constitutes “being in a cult” is living in a simulation of an artificially small, closed world)
The update was more straightforward, based on “I looked at some things that are definitely cults, what Michael does seems less extremal and insular in comparison, therefore it seems less likely for Michael to run into the same problems”. I don’t think that update required agreeing with your theory to any substantial degree.
I do think your paragraph still clarified things a bit for me, though with my current understanding, presumably the group to compare yourself against is less cults, and more just, like, average people who are somewhat further out on some interesting dimension. And if you notice that average people seem really crazy and cult-like to you, then I do think this is something to pay attention to (though, like, average people are also really crazy on lots of topics, like schooling and death and economics and various COVID-related things that I feel pretty confident in, so I don’t think this is some kind of knockdown argument, though I do think having arrived at truths that large fractions of the population don’t believe definitely increases the risks from insularity).
I definitely don’t want to imply that agreement with the majority is the metric; rather, it’s the ability to have a discussion at all, to be able to see part of the world they’re seeing and take that information into account in your own view (which might be called “interpretive labor” or “active listening”).
Agree. I do think the two are often kind of entwined (like, I am not capable of holding arbitrarily many maps of the world in my mind at the same time, so when I arrive at some unconventional belief that has broad consequences, the new models based on that belief will often replace more conventional models of the domain, and I will have to spend time regenerating the more conventional models and beliefs in conversation with someone who doesn’t hold the unconventional belief, which does frequently make the conversation kind of harder, but which I still don’t think is evidence of something going terribly wrong).
Oh, something that might not have been clear is that talking with other people Michael knows made it clear that Michael was less insular than MIRI/CFAR people (who would have been less able to talk with such a diverse group of people, afaict), not just that he was less insular than people in cults.
Do you know if the 3 people who were talking significantly with Michael did LSD at the time or with him?
Erm… feel free to keep plausible deniability. Taking LSD seems to me like a pretty worthwhile thing to do in lots of contexts, and I’m willing to put a substantial amount of resources into defending against legal attacks (or supporting you in the face of them) caused by you replying openly here. (I don’t know if that’s plausible; I’ve not thought about it much, but mention it anyway.)
I had taken a psychedelic previously with Michael; one other person probably had; the other probably hadn’t; I’m quite unsure of the latter two judgments. I’m not going to disambiguate about specific drugs.