A runaway AI might wind up being very destructive, but quite probably not wholly destructive.
It seems likely that it would find some of the knowledge humanity has built up over the
millennia useful, regardless of what specific goals it had. In that sense, I think that even if
a paperclip optimizer is built and eats the world, we won’t have been wholly forgotten in
the way we would if, e.g., the sun exploded and vaporized our planet. I don’t find this to be
much comfort, but how comforting it is or isn’t is a matter of personal taste.