Humans have changed values to maximize other values (such as survival) throughout history. That’s cultural assimilation in a nutshell. But some people choose to maximize values other than survival (e.g. every martyr ever). And that hasn’t always been pointless—consider how much the early Christian martyrs contributed to the growth of Christianity.
If an AI were faced with the possibility of self-modifying to reduce its adherence to value Y in order to maximize value X, then we would expect the AI to do so only when value X was “higher priority” than value Y, as judged by its current value system. Otherwise, we would expect the AI to choose not to self-modify.
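Here is a minimal toy sketch of that decision rule, assuming values can be represented as weighted terms in a utility function (all names and numbers here are hypothetical, purely for illustration): the AI scores a proposed self-modification with its *current* weights, so it only trades away value Y when doing so buys enough of the higher-priority value X.

```python
def utility(weights, outcome):
    """Score an outcome dict {value_name: amount} under the given value weights."""
    return sum(weights[name] * amount for name, amount in outcome.items())

def should_self_modify(current_weights, outcome_if_unmodified, outcome_if_modified):
    """Accept the modification only if it looks better by the AI's current values."""
    return (utility(current_weights, outcome_if_modified)
            > utility(current_weights, outcome_if_unmodified))

# Hypothetical numbers: X is "higher priority" than Y in the current value system.
current_weights = {"X": 10.0, "Y": 1.0}

# Self-modifying sacrifices adherence to Y but yields much more X.
status_quo         = {"X": 1.0, "Y": 5.0}   # utility = 15
after_modification = {"X": 3.0, "Y": 0.0}   # utility = 30

print(should_self_modify(current_weights, status_quo, after_modification))  # True
```

On this toy model, the modification is accepted only because X dominates in the current weights; if Y were the higher-priority value, the same comparison would reject it.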