Thanks for clarifying; I didn’t get that from the comment about the timelines.
“Insanity” refers to the situation where humanity allows AI labs to race ahead, hoping they’ll solve alignment along the way. I’m pretty sure that if the race isn’t stopped, everyone will die once the first sufficiently smart AI is launched.
Is this “extreme” because everyone dies, or because I’m confident this is what happens?