My argument is that, like humanity, a superintelligent AI will initially find it easier to extract resources from Earth than from space-based sources. By the time Earth's resources are sufficiently depleted that this is no longer the case, there will be far too little remaining for humanity to survive on.
That’s obviously false from any vaguely rigorous take.
What is obviously true: the ASI could take 99% of the Earth's raw materials, and 100% of the rest of the solar system, and leave plenty for the current human population to survive, assuming MNT.
If an AI is capable of taking 99% of the resources that humans rely on to live, it’s capable of taking 100%.
Tell me why the AI should stop at 99% (or 85%, or 70%, or whatever threshold you wish to draw) without having that threshold encoded as one of its goals.
Because it would have to have extremely advanced cognition, or else we would have won in our conflicts with it. It may see some value in not murdering its creators.