Although, under strict preference utilitarianism, wouldn't a change in values (i.e., moral progress) be considered bad, for the same reason a paperclip maximizer would consider it bad?