One distinction I want to make here is between people who are really excited to work on AI Alignment (or, any particular high-impact-career), and who are motivated to stick with it for years (but who don’t seem sufficiently competent), vs people who are doing it out of a vague sense of obligation, don’t feel excited (and don’t seem sufficiently competent).
For the first group, a) I can imagine them improving over time, and b) if they’re excited about it and find it fulfilling, like, great! It’s the second group I feel most worried about. I really worry about vague existential angst driving people to throw themselves into careers they aren’t actually well suited for. (And for creative research, I suspect you do need a degree of enthusiasm to make it work.)
I think this distinction is very important. In my experience, EAs/Rationalists tend to underestimate the impact of personal fit; if you’re completely unexcited and doing things only out of a vague sense of obligation, it’s likely that the job just isn’t for you, regardless of your level of competence.
“people who are doing it out of a vague sense of obligation”
I want to put a bit of concreteness on this vague sense of obligation, because it doesn’t actually seem that vague at all. It seems like a distinct set of mental gears, and those gears are just THE WORLD WILL STILL BURN and YOU ARE NOT GOOD ENOUGH.
If you earnestly believe that there is a high chance of human extinction and the destruction of everything of value in the world, then it probably feels like your only choices are to try preventing that regardless of pain or personal cost, or to gaslight yourself into believing it will all be okay.
“I want to take a break and do something fun for myself, but THE WORLD WILL STILL BURN. I don’t know if I’m a good enough AI researcher, but if I go do anything else to help the world and we don’t solve AI, then THE WORLD WILL STILL BURN and render everything else meaningless.”
The doomsday gauge reads two minutes to midnight. Sure, maybe you won’t succeed in moving the needle much or at all, and maybe trying will cost you immensely. But given that the entire future is gated behind doomsday not happening, the only thing that actually matters in the world is moving that needle; anything else you could be doing is a waste of time, a betrayal of the future and your values. So people get stuck in a mindset of “I have to move the needle at all costs, regardless of personal discomfort or injury. Trying to do anything else is meaningless, because THE WORLD WILL STILL BURN, so there’s literally no point.”
So you have a bunch of people who get themselves worked up, thinking that any time not spent saving the world is a personal failure, that the stakes are too high to take a day off to spend time with your family. The stakes! The stakes! The stakes!
And then, locking into that gear to make a perfect soul-crushing trap, is YOU ARE NOT GOOD ENOUGH. You know you aren’t Eliezer Yudkowsky or Nick Bostrom and never will be; you’re just fundamentally less suited to this project and should do something else with your life to improve the world. Don’t distract the actually important researchers, or THE WORLD WILL BURN.
So on one hand you have the knowledge that THE WORLD WILL BURN and you probably can’t do anything about it unless you throw your entire life into it and jam your whole body into the gears, and on the other hand you have the knowledge that YOU AREN’T GOOD ENOUGH to stop it. How can you get good enough to stop the world from burning? Well, first you sacrifice everything else you value in life to Moloch, then you throw yourself into the gears and have a psychotic break.