That’s obviously false from any vaguely rigorous take.
What is obviously true: the ASI could take 99% of the earth’s raw materials, and 100% of the rest of the solar system, and leave plenty for the current human population to survive, assuming MNT.
If an AI is capable of taking 99% of the resources that humans rely on to live, it’s capable of taking 100%.
Tell me why the AI should stop at 99% (or 85%, or 70%, or whatever threshold you wish to draw) without having that threshold encoded as one of its goals.
Because it would have to have extremely advanced cognition, or we would have won in our conflicts with it. It may see some value in not murdering its creators.