One challenge with the “fire alarm” analogy is that fires are something that just about everybody has a fair bit of tangible experience with. We’ve been burned, seen and perhaps built small fires, witnessed buildings on fire in the news, and perhaps know people who’ve lost their homes in a fire. Fires are very much real things to us.
An AI singularity is different. Military AI technology and AI-generated propaganda or surveillance belong to reference classes with which we have at least some longstanding, if often indirect, experience. We understand the concept of the dangers of increasing military firepower. We have a sense of what it's like to be spied on or lied to.
But the idea that a computer could become vastly more intelligent than we are, all of a sudden? No prior experience. A fire alarm should awaken your memories of fire and appropriate responses to them—all the different responses you list here. For most people, there’s nothing to awaken in their mind when it comes to AGI.
Honestly, it might be best if we milk the “apocalyptic climate change” metaphor harder. It seems to be the closest and most emotionally charged concept readily available in people’s minds for a slow-building catastrophe that could plausibly end in global disaster. Based on my reading, it seems unlikely that climate change actually threatens us with extinction, but connecting with that type of concern might be a place to start. Maybe when people think of AGI, we should encourage them to think less Terminator and more climate change.