Someone created an /r/controlproblem subreddit.
Actually very high quality subreddit. I’m impressed.
I never realized how many people there are who say “it’s a good thing if AI obliterates humanity, it deserves to live more than we do”.
On some level, the question really comes down to what kind of successors we want to create; they aren’t going to be us, either way.
That depends on whether you plan to die.
If I didn’t, the person I become ten thousand years from now isn’t going to be me; I will be at most a distant memory from a time long past.
It will still be more “me” than paperclips.
Than paperclips, yes. Than a paperclip optimizer?
Well… ten thousand years is a very, very long time.
It’s a perfectly reasonable position when you consider that humanity is not going to survive long-term anyway. We’re either going extinct and leaving nothing behind, evolving into something completely new and alien, or getting destroyed by our intelligent creations. The first possibility is undesirable. The second and third are indistinguishable from the point of view of the present (if you assume that AI will be developed far enough into the future that no current humans will suffer any pain or sudden death because of it).
You might still want your children to live rather than die.
The questions asked there mostly seem basic and answered by some sequence or another. Maybe someone should make a post pointing out the most relevant sequences so those people can be thinking about the unsolved problems on the frontier?
Great idea. I commission you for the task! (You might also succeed in collecting effective critiques of the sequences.)
If you post an article there, it is subtitled “self.ControlProblem”. Seems like many people there have a problem with self control. :D