The passage on “you are responsible for the entire destiny of the universe” was mostly addressing the way many EAs seem to feel about the nature of responsibility. We do have limited agency in the world, but people around here tend to feel personally responsible for literally saving the world alone. The idea was not to deny that outright or to argue against heroic responsibility, but rather to say that while the responsibility won’t go away, there’s no point in becoming consumed by it. You are a less effective tool if you are too heavily burdened by responsibility to function properly. I wrote it that way because I’m hoping the harsh, utilitarian tone will reach the target audience better than something more clichéd would. There’s enough romanticization here as it is.
I definitely romanticized the part about alignment researchers being heroes. I’ll add a disclaimer to mention that the choice of words was meant to paint the specific picture that up-and-coming alignment researchers might have when they arrive here.
As for which narrative to follow, this one might be as good as any. As the mental health post I referenced here mentioned, the “dying with dignity” approach Eliezer is following might not sit well with a number of people, even when it is in line with his own predictions. I’m not sure to what degree what I described is a fantasy. In a universe where alignment is solved, would this picture be inaccurate?
Thanks for the feedback!