What on earth caused him to change his mind? I do not understand how he ended up writing so viciously and ungroundedly about us, nor what would lead someone out of that state. It’s certainly a positive sign for someone to change their mind so much about a thing like this. The text reads as an honest and thoughtful account of what he thinks he was wrong about. But I did not expect this, and I notice I am confused.
I will try to explain what I know. I'd guess 90% accuracy on individual points, so some of it will be wrong.
Overview: I think Weyl has been going through a process of changing his mind for a year or two. Remmelt and I have both had conversations with him. I imagine there are more conversations, and maybe some deep process we can't see.
I've talked to Weyl for an hour or so on Twitter three or four times. I liked his book and like him personally, so I spent some time teasing out his thoughts whenever I thought he was being unfair, e.g. here https://twitter.com/NathanpmYoung/status/1374308591709138948
Iirc I'd lightly pushed for a while for him to A) talk to some actual rationalists and B) send documents with criticisms to rationalists directly rather than posting them as open letters. I think a document posted by Weyl here would get a sober response. I've always felt Weyl was a sincere person who cares about AI risk etc., even where we disagreed. Also I genuinely like him, which makes it easier.
Four months ago, he wrote this https://twitter.com/glenweyl/status/1423686528190980097
“I have [thought about writing on LessWrong] but I am worried I would get the tone wrong enough that it would be a net harm. @RemmeltE has kindly been trying to mentor me on this.
and later to me https://twitter.com/glenweyl/status/1424366991792513024
"Thanks for being so persistent with me about this. I do genuinely think that you're basically right that my behavior here has been fundamentally hateful and against my principles, driven by feelings of guilt/shame and counterproductive to my own goals. I hope to have time before going out on paternity leave to post an apology on LessWrong"
To me it felt as if he had a culturally different approach to AI risk than rationalists (he wants to get more people involved, and likes redistributing wealth and power), and also there was maybe some hurt. This led him (in my opinion) to overextend in his criticisms, mingling what I thought were fair and unfair commentary. The article he shared here I thought was unfair and didn't deserve Weyl's support. I guess I hoped he might change his mind, but I was still surprised when it happened (which makes me wonder if there were other things going on). I was particularly surprised by the strength of both the first apology and this subsequent one.
Some thoughts and suggestions:
- I found the apology article a bit hard to follow—I read it a couple of hours ago and I’m not sure I could explain it now
- Weyl seems to have done exactly what the rationalist part of me would want from him. If anything, it might be too much. I hope people are gracious to him for this. It probably cost him time, emotional energy, pride, and possibly the respect of some others.
- I still wonder what led to him being so averse to rationalism in the first place.
- I'd suggest that, if you're interested, you thank him for the apology and talk to him about it.
I've struggled to write this accurately and humbly, so apologies if I've overcooked it. Thanks to Neel for suggesting I give my thoughts.
Weyl may not be really apologizing here; his post frames itself as more a confession and warning than an apology.
This could mean that it’s very much an apology, but even more a confession and warning. Given the lack of any other apology-language, like “I’m sorry,” I think it instead means that it’s not to be read as an apology, even if there’s a ruefulness about it. Even if it’s an apology, he doesn’t explicitly say that it’s an apology to us. It could just as much be an apology to his supporters for misdirecting their attention.
He’s speaking to his own community, RadicalxChange, at least as much as he’s speaking to us. What is he saying?
Weyl thinks Silicon Valley is a villain:

"… the technology industry, and especially Silicon Valley (SV), has become the greatest unaccountable concentration of power in the world today and is thus a fundamental threat to self-government."
He thought that rationalists were soldiers in the SV army, because this sector is overrepresented in rationalism. Now, he realizes that he was mistaken. He dislikes our perspective, thinks we’re wrong and self-contradictory, and that we’re narrow in our demographics and influences. But he now realizes that we don’t whisper in the ear of Elon Musk. He no longer sees us as any more threatening than the many other groups he dislikes, but doesn’t bother to attack, such as religious fundamentalists.
Battling SV must be hard. After all, he has to figure out the anatomy of this large, complicated culture, and figure out which bits play an executive role and which bits are mainly just being told what to do. There’s not much in the way of hard evidence for him to make that distinction. He had to rely on pattern-matching and associations to identify his targets. He thought we were part of SV’s executive function, and now realizes that we’re not. Given the enormity of the threat he perceives, he seems to have felt it was best to shoot first and ask questions later.
What’s not clear to me is whether he’d resume attacking us if he changed his mind again and believed that we did have more power in SV.
On the one hand, he says “exaggerations of the group’s power and conspiratorial allusions are basically hateful and fundamentally opposed to my belief system.” That sounds like “I was wrong to demonize rationalism because demonization is wrong” and a renunciation of the “shoot first, ask questions later” approach.
On the other hand, he says “However, what has changed significantly is my views of the sociological role of the rationalist community within the technology industry.” That sounds like “I was wrong to demonize rationalism because rationalism isn’t a high-priority demon,” and a call to his community to train their firepower on a different target. Given that he’s explicitly downplaying or denying an apology, I read this as the main point of his post. He’s admitting an embarrassing strategic error to his own soldiers, not apologizing to us for the collateral damage.
Someone’s paraphrase of the article: “I actually think they’re worse than before, but being mean is bad so I retract that part”
Weyl’s response: “I didn’t call it an apology for this reason.”
https://twitter.com/glenweyl/status/1446337463442575361
Why would we want an apology? Apologies are boring. Updates are interesting!
I didn’t say we/I wanted an apology. I was just trying to clarify what he was actually saying.
Every memorable apology I've ever gotten has heralded an update, although sometimes it lags a little (e.g. person updates → person spends some time applying the update to all affected beliefs → person apologizes).
This mostly holds for apologies I've given as well, excluding a couple where transgression and apology were separated by enough years to make pinning it on a specific update difficult.
As I said above I struggled to follow the article and now can’t be bothered to reread it.
But I agree that he disagrees with his previous conduct.
Feels like “I disagree with you but went about it the wrong way” is something we’d welcome from those who disagree with us, right?
Maybe he was embarrassed by the mistakes he made, like making up 3 different wrong citations for a claim about Audrey Tang despising rationalists (which was also not true), and reflected a bit.
@Ben, I had some conversations with Glen after sharing that blindspots post with him. Happy to call one-on-one about my impressions here: calendly.com/remmelt/30min/