Eliezer, sure, but that can’t be the whole story. I don’t care about some of the stuff most people care about. Other people whose utility functions deviate from the social norm in comparable ways are called “psychopaths”, and most people think they should either adopt society’s morals or be removed from society. I agree with this.
So why should I make a special exception for myself, just because that’s who I happen to be? I try to behave as if I shared common morals, but it’s just a gross patch. It feels tacked on, and it is.
I expected (though I had no idea how) that you’d come up with an argument that would convince me to fully adopt such morals. But what you said would apply to any utility function. If a paperclip maximizer wondered about morality, you could tell it: “‘Good’ means ‘maximizes paperclips’. You can think about it all day long, but you’d just end up making a mistake. Is that worth forsaking the beauty of tiling the universe with paperclips? What do you care that there exist, somewhere in mindspace, minds that drag children off train tracks?” and it’d work just as well. Yet if you could, I bet you’d choose to make the paperclip maximizer adopt your morals.