Fair enough. If you have any insight into why your preferences rank in this way, I’d be interested, but I accept that they are what they are.
However, I’m now confused about your claim.
Are you saying that we ought to treat other people in accordance with your preferences about how to be treated (e.g., being lied to in the present rather than having their values changed or their memories altered)? Or are you just talking about how you’d like us to treat you? Or are you assuming that other people have the same preferences you do?
For the preference ranking, I guess I can express it mathematically by saying that any change to my priorities leads to me doing things that would be utility+ at the time, but are utility- or utility-neutral now (and since I could be spending that time generating utility+ instead, even neutral is bad). For example, if I could change my utility function to value eating babies, and babies were plentiful, this option would be a huge source of utility+ after the change. That doesn’t change the fact that it also means I’d eat a ton of babies, which makes the option a huge source of utility- currently; I wouldn’t want to do something that would lead to me eating a ton of babies. If I greatly valued generating as much utility+ for myself at any given moment as possible, I would take the plunge; instead, I look at the future, decline what is currently utility- for me, and move on.

Or maybe I’m just making up excuses to refuse a momentary discomfort in exchange for eternal utility+; after all, I bet someone having the time of his life eating babies would laugh at me and have more fun than I do. The inconsistency here is that I avoid the utility- choice when it comes to changing my terminal values, but I have no issue taking the utility- choice when I decide I want to be in a simulation. Guess I don’t value truth that much.

I find that changing my memories leads to similar results as changing my utility function, but on a much, much smaller scale; after all, memories are what make up my beliefs, my preferences, myself as a person. Changing them at all changes my belief system and preferences, but that is happening all the time anyway. Changing them on a large scale is significantly worse in terms of how it affects my utility function; it can’t change my terminal values, so it is still far less bad than directly making me interested in eating babies, but it is still negative. Getting lied to is just bad, with no relation to the above two, and weakest in importance.
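To put that a bit more formally (a sketch in my own notation, nothing standard): write $U_{\text{now}}$ for my current utility function and $U_{\text{new}}$ for the candidate replacement. The point is that the option "adopt $U_{\text{new}}$" gets scored by $U_{\text{now}}$, not by $U_{\text{new}}$:

$$\text{accept the change} \iff U_{\text{now}}(\text{future where I act on } U_{\text{new}}) \;>\; U_{\text{now}}(\text{future where I keep } U_{\text{now}}).$$

In the baby-eating case, that future scores hugely positive under $U_{\text{new}}$ but hugely negative under $U_{\text{now}}$, so the change gets rejected.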
My gut says that I should treat others as I want them to treat me. Provided a simulation is a bit more awesome, or comparably awesome but more efficient, I’d rather take that than the real thing. Hence, I’d want to give others what I myself prefer (in terms of ways to achieve their preferences): not because they are certain to agree that being lied to is better than angsting about not helping people, but because my way is either better or worse than theirs, and I wouldn’t believe in my way unless I thought it better. Of course, I am also assuming that truth isn’t a terminal value for them. In the same way, since I don’t want my utility function changed, I’d rather not do it to them.