An AI with multiple conflicting goals sounds incoherent.
Well, humans exist despite having multiple conflicting goals.
The AI’s terminal goals are not guaranteed to be immutable; it is only guaranteed that the AI will do its utmost to keep them unchanged, because that is what terminal goals are. If it could desire to mutate them, then whatever was being mutated was not a terminal goal of the AI in the first place.
At this point, it’s not clear that the concept of “terminal goals” refers to anything in the territory.