Well, this is insanely disappointing. Yes, the OP shouldn’t have directly replied to the Bankless podcast like that, but it’s not like he didn’t read your List of Lethalities, or your other writing on AGI risk. You really have no excuse for brushing off very thorough and honest criticism such as this, particularly the sections that talk about alignment.
And as others have noted, Eliezer Yudkowsky, of all people, complaining about a blog post being long is the height of irony.
This is coming from someone who’s mostly agreed with you on AGI risk since reading the Sequences, years ago, and who’s donated to MIRI, by the way.
On the bright side, this does make me (slightly) update my probability of doom downwards.