Changing one’s goal is generally not useful: if you change your (terminal) goal, you become less likely to achieve the original goal.
I was thinking here of the sort of utility-function change that Eliezer discusses in Coherent Extrapolated Volition.