People should be able to think the world is at risk, or even doomed, and that solving that is (naturally) something like the most pressing problem, without being whatever it means to be “apocalyptic death cultists”. Otherwise you run into foreseeable issues like new technologies actually presenting existential risks and nobody taking those risks seriously for social reasons.
Lmao, I’m the author of the top level comment, and I honestly came to the same conclusion about this place being a cult (or at the very least, cult-like with a TON of red flags). I recently started lurking here again after not thinking about this group for months because I was curious how the rationalists were rationalizing the FTX collapse and also wanted to see their reaction to that openai chat thing.
Because it is an apocalyptic cult and don’t you dare let them pretend otherwise.