I have never understood what is wrong with the amnesia-holodecking scenario. (Is there a proper name for this?)
If you want to, say, stop people from starving to death, would you be satisfied with being plopped on a holodeck with images of non-starving people? If so, then your stop-people-from-starving-to-death desire is not a desire to optimize reality into a smaller set of possible world-states, but simply a desire to have a set of sensations so that you believe starvation does not exist. The two are really different.
If you don’t understand what I’m saying, the first two paragraphs of this comment might explain it better.
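One way to make the distinction precise (my notation; the thread itself doesn’t formalize this): let W be the set of possible world-states and let obs(w) be the stream of sense-data that world w feeds you. Caring about starvation itself means maximizing a utility function U(w) = −(number of people starving in w), defined over W. Caring only about the sensations means maximizing V(obs(w)), defined over sense-data alone. The holodeck is precisely an intervention that changes obs(w) while holding the starvation count fixed, so it can drive V(obs(w)) to its maximum without moving U(w) at all.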
Thanks for clarifying. I guess I’m evil. It’s a good thing to know about oneself.
Uh, that was a joke, right?
No.
What definition of evil are you using? I’m having trouble understanding why (or how) you would declare yourself evil, especially evil_nazgulnarsil.
I don’t care about suffering independently of my sensory perception of it causing me distress.
Oh. In that case, it might be more precise to say that your utility function does not assign positive or negative utility to the suffering of others (if I’m interpreting your statement correctly). However, I’m curious about whether this statement holds true for you at extremes, so here’s a hypothetical.
I’m going to assume that you like ice cream. If you don’t like any sort of ice cream, substitute in a certain quantity of your favorite cookie. If you could get a scoop of ice cream (or a cookie) for free at the cost of a million babies’ thumbs being cut off, would you take the ice cream/cookie?
If not, then you assign a non-zero utility to others’ suffering, so it might be true that you care very little, but it’s not true that you don’t care at all.
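Spelled out as a toy calculation (my symbols, purely illustrative): suppose accepting yields u_ice > 0 from the ice cream plus k·S from the babies, where S < 0 stands for their suffering and k ≥ 0 is the weight your utility function puts on others’ suffering. If k = 0, accepting yields u_ice > 0 and you take the deal. Refusing therefore reveals u_ice + k·S ≤ 0, which forces k > 0: some nonzero weight on others’ suffering.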
I think you misunderstand slightly. Sensory experience includes having the idea communicated to me that my action is causing suffering. I assign negative utility to others’ suffering in real life because the thought of such suffering is unpleasant.
Alright. Would you take the offer if Omega promised to remove your memories of the agreement (a million babies’ thumbs cut off for a scoop of ice cream) right after you made it, so you could enjoy your ice cream without guilt?
No; at the time of the decision I have the sensory experience of having been the cause of suffering.
I don’t feel responsibility to those who suffer, in the sense that I would choose to holodeck myself rather than stay in reality and try to fix problems. This does not mean that I will cause suffering on purpose.
A better hypothetical dilemma might be one where I could ONLY get access to the holodeck by causing others to suffer (Cypher from The Matrix).
Okay, so you would feel worse if you yourself had caused people a given amount of suffering than you would if someone else had caused it?
Yes.
Mmkay. I would say that our utility functions are pretty different, in that case, since, with regard to suffering, I value world-states according to how much suffering they contain, not according to who causes the suffering.
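To put the two positions side by side, here’s a minimal sketch in Python; every name and number in it is a hypothetical illustration, not anything specified in the thread:

```python
# Hypothetical sketch contrasting the two utility functions in this
# exchange. All names and numbers are illustrative, not from the thread.

def suffering_content_utility(world):
    # Values a world-state by the total suffering it contains,
    # regardless of who caused it.
    return -sum(event["amount"] for event in world)

def self_caused_utility(world, me):
    # Penalizes only the suffering this agent caused, i.e. the
    # suffering it has the sensory experience of being responsible for.
    return -sum(event["amount"] for event in world
                if event["caused_by"] == me)

world = [
    {"amount": 10, "caused_by": "someone_else"},
    {"amount": 10, "caused_by": "me"},
]

print(suffering_content_utility(world))   # -20: both events count
print(self_caused_utility(world, "me"))   # -10: only the self-caused event
```

On the second function, handing someone else the knife changes the utility even though total suffering is unchanged; on the first, it doesn’t, which is exactly the disagreement surfaced in the preceding exchange.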
Well, it’s essentially equivalent to wireheading.
Which I also plan to do, if everything goes tits-up.