"I reject the notion that one can factorize intelligence from goals, so that one could take a superintelligence and fuse it with a goal to optimize for paperclips."
Why would you believe that? Evolution was more than capable of building an intelligence that optimized for whatever goals it needed, notably reproduction and personal survival. Granted, its version was imperfect, since humans have enough conflicting goals that we can sometimes make moves that are objectively bad for the perpetuation of our gametes, not to mention the obvious failure cases like asexuals. That said, evolution has fat fingers. We can do better, and any AIs we build will be able to do better still.
I promise you that if the production of paperclips had been a survival trait in the ancestral environment, above all else, we would all be paperclip maximizers. We would consider paperclips profound and important, and we would be no more willing to remove the desire to make paperclips than we are now to pare out our own sex drive and self-preservation instinct.
EDIT: I do think the scenario where everything is immediately turned into paperclips is naive. A superintelligence would have an enormous incentive to devote its resources to research and development aimed at optimizing for its goals as rapidly as possible, and it would probably spend a lot of time simply thinking before embarking on large-scale paperclip manufacture. That's still not good for us, though, because even in that case we're clearly a lot more useful to it as solid-state paperclip R&D labs than as human beings.