The standard argument you will probably hear is that AGI will be capable of killing everyone because it can think so much faster than humans. I haven’t yet seen doomers seriously engage with this capabilities argument. I agree with everything you said here, and to me these arguments are obviously right.
The arguments do seem right. But they chip away at the edges of AGI x-risk arguments without addressing the core case for massive risk. I accept that doom isn’t certain, that takeoff won’t be that fast, and that we’re likely to get warning shots. We’re still likely to be eliminated eventually if we don’t get better technical and societal alignment solutions relatively quickly.
I guess the crux here for most people is the timescale. I actually agree that things could eventually get very bad if there is no progress on alignment, but the situation is totally different depending on whether we have 50 or 70 years to work on the problem or, as Yudkowsky keeps repeating, we don’t have that much time because AGI will kill us all as soon as it appears.