I don’t think people have (ethical) value simply because they exist. I think they should have to do a lot more than that before I should have to care whether they live or die.
Interestingly, you may not care whether a person exists (so you will be indifferent to the instantiation of more people), but still care about how he lives, and whether he dies, and in what manner.
So if I were to start torturing a random child, would you object? Assume the child has never done anything important to make him especially valuable.
I wouldn’t personally object, no. This is happening every day and, like most people, I do nothing. The difference is I don’t think I’m supposed to be doing anything either. That isn’t to say we should live in a society without laws or moral strictures; you need a certain amount of protection for society to function at all. You can’t condone random violence. But this is a pragmatic rather than altruistic concern.
Hm. Upvoted for an honest answer and lack of dissembling. Let’s make it harder.
You have a button. If you press the button, you will receive a (free!) delicious pie, and a random child will be tortured for one year. No one will ever know there was any connection to you, and you can even erase your memory so that you won’t feel guilty about it afterwards. Assume you like pie. Do you press the button?
This is a very bizarre situation and difficult to think about, but I think there’s a chance I would press the button. My main issue is that children require some kind of protection because they’re our only source of valuable adults. Childhood is probably the worst time to torture people in terms of long-term side effects. But in terms of merely causing the experience of suffering (which I think is what you’re getting at), I think torture is value-neutral.
This is a slightly different matter from the one I initially posted about; I don’t think the experience of pain (or happiness) is cumulative. Consider the situation where I could choose to be tortured for a year to receive a reward. If you could strip this scenario of long-term side effects, which would probably require erasing my memory afterwards, then I would willingly undergo the torture for a reward. The reward would have to compensate for the loss of time, the discomfort and the impracticality of the scenario. If I really liked pie, I’d probably be willing to undergo 5 minutes of torture without long-term side effects for pie. Actually, I’d probably be willing to do it for 5 minutes purely out of curiosity.
Now, for the child in question, assuming he or she has no value and comes from a community where he or she would not become a valuable adult, the torture could not have long-term side effects that matter. He or she would surely be changed by the experience, but not being a value-contributor, could not be changed for the worse; any change would be value-neutral in terms of benefit to the cumulative wealth of society. (There is a possibility that the child would become a greater strain on society, and acquire greater negative value, but let’s put this aside and say there are no major long-term side effects of the torture such as loss of function.)
A complication here is that the value I place on pie in your scenario is unlikely to be high, given how I determine value generally. As I said, I do not consider the experience of pain or pleasure cumulative, and consider them value-neutral in general, so I would not place a high value on the consumption of pie. But let us say that my love of pie is part of my general need to stay healthy and happy in order to be a value-contributor. In that case, whether I push the button would be some function of the probability that the child might be a child of value, or from a community that produces adults of value, weighed against the value of the pie to me as a value-contributor, so there is a non-zero probability I would push the button.
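To make that decision rule concrete, here is a minimal sketch of the expected-value comparison it describes. Every variable name and number below is a placeholder invented for illustration; nothing here is a figure asserted anywhere in the thread.

```python
# Minimal sketch of the decision rule described above.
# All values are hypothetical placeholders, not figures from the thread.

p_child_valuable = 0.1   # probability the child is, or will become, a value-contributor
expected_harm = 100.0    # expected loss to cumulative societal value if that is the case
value_of_pie = 0.5       # value of the pie to me, insofar as it keeps me a value-contributor

# Press the button only if the expected benefit outweighs the expected cost.
press_button = value_of_pie > p_child_valuable * expected_harm
print(press_button)  # False with these placeholder numbers
```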
It’s beside the point, but your idea of torture might be a bit light if you would undergo five minutes out of curiosity.
Maybe he’s thinking of water-boarding.
It’s worth pointing out that the original comment concerned living or dying, not torture.
Myself, I would avoid the torture button, but would give serious consideration to pressing one that delivered a delicious pie at the cost of painlessly puffing a random faraway person out of existence.
If the button delivered a sufficiently large amount of money, I would press it for sure. I’d require much more money for torture than for death, however. (Like $1 million versus a few bucks.)
I wouldn’t press the button, though I had to think a bit longer about the “erase from memory” part.
It reminds me of what Eliezer often says about Friendly AI: “If you offered Gandhi a pill that would make Gandhi a murderer, Gandhi would refuse to take it.”
I would also refuse to do it even if my memory could be erased. Somehow, I don’t feel that’s really relevant: when I’m considering whether to do it or not, I’m not even thinking about any guilt I might feel; I’m mostly repulsed by torture in general and imagining myself in the place of the person to be tortured.
I don’t think I would have any particular problem with murder for an adequate reason, and yet I wouldn’t take a “murder pill”. A stupid illustration, then, though I don’t remember seeing this phrase before and I’ve been following OB from the first post.
Example: X wouldn’t Y.
Rejoinder: Z, which is unlike X in relevant ways, would also not Y.
...huh?
More like: Z, which you could expect to be less bothered by Y than X, also would not Y.
A quick Google search reveals the Gandhi phrase on Eliezer’s website:
http://yudkowsky.net/singularity
But I think I saw it in at least one of his papers too.
Somehow I missed this comment, or else I would not have said the same thing in mine.