It’s to make money without breaking the law. An ASI that fulfils both goals isn’t going to kill everybody, since murder is illegal. So even if you do have ASIs with stable long-term goals, they don’t lead to doom. (It’s interesting to consider the chilling effect of a law making any human who creates an agentive AI criminally responsible for what it does.)
Most big companies don’t have the goal of making money without breaking the law; they’re often willing to break the law as long as the punishment isn’t too costly.
But even if the AGI doesn’t murder anyone in the first five years it operates, it can still focus on acquiring resources, become untouchable by human actors, and then engage in actions that lead to people dying. The Holodomor wasn’t directly murder, but people still died because they didn’t have food.