I like using the intuition pump of AI : Humans :: Humans : Apes. Imagine apes had the decision to create humans or not. They could sit there and argue about how humans will share ape values because they’re descended from apes. Or how humans pose an existential risk to apes, or some such.
Humans may be dangerous because they’ll be smarter than us apes. Maybe humans will figure out how to get those bananas at the very top of the tree without risk of falling; then humans will have a massive advantage over apes. Maybe humans will better know how to hide from leopards; they’ll be able to hurt apes by attracting leopards to the colony and then hiding. Humans might be dangerous, but if we contain them or ensure that they share ape values, then we apes will be better off.
And then humans take over the whole world, and apes live in artificial habitats entertaining us or survive in the wild only by our mercy. We’re just too stupid to reasonably think of the ways AI will be able to defeat us. We’re sitting here with a boxed AI thinking about the risk of nanotech while the AI is creating real-life magic by warping the electric field of the world using just its transistors.
Like, we’re so stupid we don’t even know how to spontaneously generate biological life. The upper bound on intelligence is way above where we’re at now.