The central problem in all of these thought experiments is the crazy notion that we should give a shit about the welfare of other minds simply because they exist and experience things analogously to the way we experience things.
Well, I see the central problem in the notion that we should feel responsible for something that happens to other people if we’re not the ones doing it to them. Clearly, the aliens are sentient; they are morally responsible for what happens to these humans. While we certainly should pursue possible avenues to end the suffering, we shouldn’t act as if we were the ones doing it to them.
Interesting. Though in the scenario I suggested there is no suffering, only an opportunity to deploy pleasure (ice cream).
I’m curious why you hold the aliens morally responsible for the human clones; I can imagine several reasons, but wonder what yours are. I’m also curious whether you think that the existence of someone with greater moral responsibility than our own decreases or eliminates the small amount of moral responsibility that we Earthlings have in this case.
Why would I not hold them responsible? They are the ones trying to make us responsible by giving us an opportunity to act, but their own opportunities are much more direct; after all, they created the situation that exerts the pressure on us. This line of thought is mainly meant to be argued in the terms of Fred, who has a problem with feeling responsible for this suffering (or non-pleasure): it offers him a way out of the conundrum without relinquishing his compassion for humanity. (I feel the ending as written is illogical, and I certainly think “Michael” is acting very unprofessionally for a psychoanalyst.) [“Relinquish the compassion” is also the conclusion you seem to have drawn, hence my response here.]
Of course, the alien strategy might not be directed at our sense of responsibility, but at some game-theoretic utility function that aims at the greater good for the greater number. These utility functions are always somewhat arbitrary (most of them on LessWrong center on money, with no indication of why money should be valuable), and the arbitrariness in this case consists in counting the alien simulations but not the aliens themselves. If the aliens are “rational agents”, then refusing to reward their behaviour will make them stop it, provided it has a cost, while rewarding it will make them continue. (Haven’t you ever wondered how many non-rational entities are trying to pose conundrums to rational agents on here? ;)
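A minimal sketch of that incentive claim in Python (the names and payoff numbers below are my own illustration, not part of the thought experiment): if we treat the aliens as expected-utility maximizers, they continue the scheme only while the expected reward from our compliance exceeds its cost.

    # Hypothetical payoff model for the incentive argument above.
    # A rational alien continues the scheme iff its expected utility
    # is positive: p_comply * reward - cost > 0.
    def alien_continues(p_comply: float, reward: float, cost: float) -> bool:
        return p_comply * reward - cost > 0

    # If we never reward the scheme (p_comply = 0), any positive cost
    # makes continuing a strictly losing move, so the aliens stop:
    assert not alien_continues(p_comply=0.0, reward=10.0, cost=1.0)
    # If we reliably reward it, continuing pays, so the aliens go on:
    assert alien_continues(p_comply=0.9, reward=10.0, cost=1.0)

On this toy model, our refusal to comply is precisely what drives the aliens’ expected utility negative, which is the sense in which “not rewarding their behaviour will make them stop”.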
I don’t have a theory of quantifiable responsibility, and I don’t have a definite answer for you. Let’s just say there is only a limited amount of stuff we can do in the time that we have, so we have to make choices about what to do with our lives. I hope that Fred comes to feel that he can accomplish more with his life than to indirectly die for a tortured simulation that serves alien interests.