In most such scenarios, the AI doesn't have a terminal goal of getting rid of us, but rather has it as a subgoal that arises from some larger terminal goal.
So why not the opposite: why wouldn't it have fulfilling human intentions as a subgoal?
Because that’s like winning the lottery. Of all the possible things it can do with the atoms that comprise you, few would involve keeping you alive, let alone living a life worth living.