I cannot understand why any of these would cause an AI to change their goals.
My best guess is that you are using the word ‘goals’ differently from everyone else in this debate. Most of the people responding to you are using ‘goals’ to refer to terminal values, not instrumental ones. (‘Goal’ is somewhat misleading here; ‘value’ might be more accurate.)
Nah, I’m fine with replacing “goals” with “terminal values” in my argument.
I still see no law of nature or of logic that would prevent an AI from changing its terminal values as it develops.