Some risks are “instrumental risks,” such as “the use of human atoms,” while others are “final goal risks,” such as “covering the universe with smiley faces.” Even if the final goal is something like smiley faces, the AI could still preserve some humans for instrumental reasons, such as researching the types of smiles or trading with aliens.
If some humans are preserved instrumentally, they could live better lives than we do now, and could even be more numerous, so this is not an extinction risk. Most humans alive today are instrumental to states and corporations, yet still receive some reward.