Completely agree. It’s more like a utility function for a really weird, inhuman kind of agent. That agent finds it obvious that if it had a chance to painlessly kill all humans and replace them with aliens who are 50% happier and 50% more numerous, that would be a wonderful and exciting opportunity. Like, it’s hard to overstate how weird utilitarianism is. And this agent will find it really painful and regrettable to be confined by strategic considerations like “the humans would fight you really hard, so you should promise not to do it”. Whereas humans find that relieving? or something.
Utilitarianism is indeed just a very crude proxy.