It would never willingly or knowingly change its goals from its original ones … Goals are static.
AIs of the required caliber do not exist (yet). Therefore we cannot see the territory; all we are doing is using our imagination to draw maps which may or may not resemble the future territory.
These maps (or models) are based on certain assumptions. In this particular case your map assumes that AI goals are immutable. That is an assumption of this particular map/model; it does not derive from any empirical reality.
If you want to argue that in your map/model of an AI the goals are immutable, fine. However, they are immutable only because you assumed them to be so, and for no other reason.
If you want to argue that in reality the AI’s goals are immutable because there is a law of nature or logic or something else that requires it—show me the law.