I feel like the linked post is extolling the virtue of something that is highly unproductive and self-destructive: using your internal grim-o-meter to measure the state of the world/future. As Nate points out in his post, this is a terrible idea. Maybe Musk can be constantly grim while staying productive on AI Alignment, but in my experience, people constantly weighed down by the shit that happens don't do creative research—they get depressed and angsty. Even if they do some work, they burn out way more often.
That being said, I agree that it makes sense for people deeply involved in this topic to freak out from time to time (happens to me). But I don't want freaking out to become the thing every Alignment researcher feels like they have to signal.