ah, it also annoys me when people say that caring about others can only be instrumental.
what does it even mean? helping other people makes me feel happy. watching a nice movie makes me feel happy. the argument that I don’t “really” care about other people would also prove that I don’t “really” care about movies etc.
I am happy for the lucky coincidence that decision theories sometimes endorse cooperation, but I would probably cooperate regardless. for example, if I had the option to donate something useful to a million people, or sell it to a dozen people, I would probably choose the former even if it meant no money for me. (and yes, I would hope there would be some win/win solution, such as the million people paying me via Kickstarter. but in the inconvenient universe where Kickstarter is somehow not an option, I am going to donate anyway.)
There actually is a meaningful question there: Would you enter the experience machine? Or do you need it to be real? Do you just want the experience of pleasing others, or do you need those people being pleased to actually exist out there?
There are a lot of people who really think they would, and who might truly be experience-oriented. If given the ability, they would instantly self-modify into a Victory Psychopath Protecting A Dream.
I don’t have an explicit theory of how this works; for example, I would consider “pleasing others” in an experience machine meaningless, but “eating a cake” in an experience machine seems just as okay as in real life (maybe even preferable, considering that cakes are unhealthy). A fake memory of “having eaten a cake” would be a bad thing; “making people happier by talking to them” in an experience machine would be intrinsically meaningless, but it might help me improve my actual social skills, which would be valuable. Sometimes I care about the referent being real (the people I would please), sometimes I don’t (the cake I would eat). But it’s not the people/cake distinction per se; for example, in the case of using fake simulated people to practice social skills, the emphasis is on the skills being real; I would be disappointed if the experience machine merely gave me a fake “feeling of having improved my skills”.
I imagine that for a psychopath everything and everyone is instrumental, so there would be no downside to the experience machine (except for the risk of someone turning it off). But this is just a guess.
I suspect that analyzing “the true preferences” is tricky, because ultimately we are built of atoms, and atoms have no preferences. So the question is whether, by focusing on some aspect of the human mind, we gain better insight into its true nature, or whether we have just eliminated the context that was necessary for it to make sense.