I think there may be a failure to communicate here, because I play Rationalist’s Taboo with words like ‘should’ and ‘right’ when I’m not talking about something technical. In my mind, these words assert the existence of an objective morality, so I wouldn’t feel comfortable using them unless everyone’s utility functions converged to the same morality—and so far that seems very unlikely.
So instead I talk about world-states that my utility function assigns utility to. What I think Eliezer is trying to get at in No License To Be Human is that you shouldn’t be a moral relativist (for the sake of not rendering your stated utility function inconsistent with your emotions), and that you should pursue your utility function instead of wireheading your brain to make it feel like you’re creating utility.
I think I’ve interpreted this correctly, but I’d appreciate Eliezer telling me whether I have or not.
Could you link me to the post of Eliezer’s that you disagree with on this? I’d like to see it.
This comment, as I wrote here. I don’t understand this post.