Cool, I just wrote a post with an orthogonal take on the same issue. Seems like Eliezer’s nanotech comment was pretty polarizing. Self-promotion: Pitching an Alignment Softball.
I worry that the global response would be impotent even if the AGI were sandboxed to Twitter. Having been through the pandemic, I perceive at least the United States’ political and social system to be deeply vulnerable to the kinds of attacks that would be easiest for an AGI—those requiring no physical infrastructure.
This does not directly conflict with or even really address your assertion that we’ll all be around in 30 years. It seems like you were very focused here on a timeline for actual extinction. I guess I’m looking for a line to draw about “when will unaligned AGI make life no longer worth living, or at least destroy our ability to fight it?” I find this a much more interesting question, because at that point it doesn’t matter if we have a month or 30 years left—we’re living in caves on borrowed time.
My expectation is that we don’t even need AGI or superintelligence, because unaligned humans are going to provide the intelligence part. The missing doomsday ingredient is ease of attack, which is getting faster, better, and cheaper every year.