This reminds me of the horrible SIAI job interview question: “Would you kill babies if it was intrinsically the right thing to do? If so, how right would it have to be, for how many babies?”
That one’s easy—the correct answer is to stop imagining “rightness” as something logically independent of torturing babies, and to forget the word “intrinsic” like a bad dream. My question looks a little trickier to me; it’s closer to torture vs dust specks.
I wasn’t saying that they were similar questions, just that one reminded me of the other. (Though I can see why one would think that.)
I’d say the answer to this is pretty simple. Laura ABF (if I remember the handle correctly) suggested of the original Torture vs. Specks dilemma that the avoided specks be replaced with 3^^^3 people having really great sex—which didn’t change the answer for me, of course. This provides a similar template for the current problem: Figure out how many victims you would dustspeck in exchange for two beneficiaries having a high-quality sexual encounter, then figure out how many dustspecks correspond to a month of torture, then divide the second number by the first.

I suspect it was probably LauraABJ.
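To make that divide-the-two-numbers template concrete, here is a minimal sketch of the arithmetic. Both quantities (and the variable names) are purely illustrative assumptions made up for the example, not estimates anyone in this thread has endorsed:

```python
# Purely illustrative sketch of the template above; both inputs are
# made-up assumptions, not anyone's actual exchange rates.

# Step 1: how many dustspeck victims you would accept so that two
# beneficiaries can have one high-quality sexual encounter.
specks_per_encounter = 1_000

# Step 2: how many dustspecks you judge equivalent to one month of torture.
specks_per_torture_month = 10**12

# Step 3: divide the second number by the first to get the number of
# high-quality sexual encounters that trade evenly against a month of torture.
encounters_per_torture_month = specks_per_torture_month / specks_per_encounter

print(encounters_per_torture_month)  # 1e9 under these made-up numbers
```

Of course, whether the first number exists at all (whether other people’s pain can be traded against other people’s pleasure) is exactly what gets disputed in the replies below.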
Figure out how many victims you would dustspeck in exchange for two beneficiaries having a high-quality sexual encounter
I guess you meant “how many sexual encounters would you demand to make a million dustspecks worthwhile”. And my emotional response is the same as in the original dilemma: I find it reeeallly icky to trade off other people’s pain for other people’s pleasure (not a pure negative utilitarian but pretty close), even though I’m willing to suffer pain myself in exchange for relatively small amounts of pleasure. And it gets even harder to trade if the people receiving the pain are unrelated to the people receiving the pleasure. (What if some inhabitants of the multiverse are marked from birth as “lower class”, so they always receive the pain whenever someone agrees to such dilemmas?) I’m pretty sure this is a relevant fact about my preferences, not something that must be erased by utilitarianism.
And even in the original torture vs dustspecks dilemma the answer isn’t completely obvious to me. The sharpest form of the dilemma is this: your loved one is going to be copied a huge number of times. Would you prefer one copy to be tortured for 50 years, or all of them to get a dustspeck in the eye?
I find it reeeallly icky to trade off other people’s pain for other people’s pleasure
Does it work the same in reverse? How many high-quality sexual encounters would you be willing to interrupt while you are out saving people from dustspecks?
Interrupting sexual encounters isn’t the same as preventing them from occurring without anyone knowing. Regardless of what utilitarianism prescribes, the preferences of every human are influenced by the welfare level they have anchored to. If you find a thousand dollars and then lose it, that’s unpleasant, not neutral. Keep that in mind, or you’ll be applying the reversal test improperly.
My problem specified that the people receiving additional pleasure must currently be at an average level of pleasure. Having your sexual encounter interrupted brings you below average, I think.
Horrible in the sense of being frustrating to have to answer, or horrible in the sense of not being a useful job interview question?
It’s exactly the sort of question I hate to be asked when I have nobody to ask for clarification, because I have no idea how I’m supposed to interpret it. Am I supposed to assume the intrinsic rightness has no relation to human values? Is it testing my ability to imagine myself into a hypothetical universe where killing babies actually makes people happier and better off?
It’s a popular religious stance that before a certain age, babies have no sin and get an automatic pass to heaven if they die. I’ve argued before that if that were the case, one of the most moral things a person could do would be to kill as many babies as they could get away with. After all, the killer can only be condemned to hell once, so on net you get a lot more people into heaven that way than would have gone there otherwise.
I can certainly imagine situations where I would choose to kill babies (for example, where I estimate a high probability that this baby continuing to live will result in other people dying), but I assume that doesn’t qualify as “intrinsically the right thing to do.”
That said, I’m not sure what would.