Torture vs. Dust Specks attempts to illustrate scope insensitivity in ethical thought by contrasting a large unitary disutility against a fantastically huge number of small disutilities.
Your Ten Very Committed Rapists example (still not happy about that choice of subject, by the way) throws out scope issues almost entirely. Ten subjects vs. one subject is an almost infinitely more tractable ratio than 3^^^3 vs. one, and that allows us to argue for one option or another by discounting one of the options for any number of reasons.
I do sincerely apologize if you are offended, but rape is a form of torture as well, and Eliezer’s example can be equally reprehensible, if not more so.
The reason I chose 10 is simple: to highlight the paradox faced by those who choose to torture. I have made it easier for you. Let’s say we increase 10 to 3^^^3 deprived rapists. The point is, if you surely would not let the victim be raped when there are 3^^^3 deprived rapists suffering, then surely you would not allow it to happen when only 10 rapists are suffering. With that said, how is it different?
I just went over how the scenarios differ from each other in considerable detail. I could repeat myself in grotesque detail, but I’m starting to think it wouldn’t buy very much for me, for you, or for anyone who might be reading this exchange.
So let’s try another angle. It sounds to me like you’re trying to draw an ethical equivalence between dust-speck subjects in TvDS and rapists in TvCR: more than questionable in real life, but I’ll grant that level of suffering to the latter for the sake of argument. It also misses the point of drawing attention to scope insensitivity, but that’s only obvious if you’re running a utilitarian framework already, so let’s go ahead and drop it for now. That leaves us with the mathematics of the scenarios, which do have something close to the same form.
Specifically: in both cases we’re depriving some single unlucky subject of N utility in exchange for not withholding N * K utility divided up among several subjects for some K > 1. At this level we can establish a mapping between both thought experiments, although the exact K, the number of subjects, and the normative overtones are vastly, sillily different between the two.
Fine so far, but you seem to be treating this as an open-and-shut argument on its own: “you surely would not let the victim [suffer]”. Well, that’s begging the question, isn’t it? From a utilitarian perspective it doesn’t matter how many people we divide up N * K among, be it ten or some Knuth up-arrow abomination, as long as the resulting suffering can register as suffering. The fewer slices we use, the more our flawed moral intuitions take notice of them and the more commensurate they look; actually, for small numbers of subjects it starts to look like a choice between letting one person suffer horribly and doing the same to multiple people, at which point the right answer is either trivially obvious or cognate to the trolley problem depending on how we cast it.
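(To make the arithmetic concrete, here is a minimal sketch in Python, under the assumption of purely linear aggregation of disutility and with entirely hypothetical values for N and K, of why the number of slices drops out of the calculation.)

```python
# Minimal sketch, assuming purely linear (total) utilitarian aggregation.
# N and K are hypothetical placeholder values, not taken from either thought experiment.

N = 1_000_000.0   # disutility suffered by the single unlucky subject
K = 1.5           # any K > 1 will do

def total_disutility(per_person: float, people: int) -> float:
    """Linear aggregation: total suffering is just the sum over subjects."""
    return per_person * people

for people in (10, 1_000, 10**9):
    aggregate = total_disutility((N * K) / people, people)
    # However finely N * K is sliced, the aggregate stays N * K,
    # which exceeds N whenever K > 1.
    print(f"{people:>12} subjects -> total {aggregate:,.0f} (exceeds N: {aggregate > N})")
```

On this picture the only place the slice count could matter is if a slice became too small to register as suffering at all, which is exactly the caveat above.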
About the only way I can make sense of what you’re saying is by treating the N case—and not just for the sake of argument, but as an unquestioned base assumption—as a special kind of evil, incommensurate with any lesser crime. Which, frankly, I don’t. It all gets mapped to people’s preferences in the end, no matter how squicky and emotionally loaded the words you choose to describe it are.
From a utilitarian perspective it doesn’t matter how many people we divide up N * K among, be it ten or some Knuth up-arrow abomination, as long as the resulting suffering can register as suffering.
I agree with this statement 100%. That was the point of my TvCR thought experiment: people who picked T there should obviously pick T again. No one except one commenter actually conceded this point.
The fewer slices we use, the more our flawed moral intuitions take notice of them and the more commensurate they look; actually, for small numbers of subjects it starts to look like a choice between letting one person suffer horribly and doing the same to multiple people, at which point the right answer is either trivially obvious or cognate to the trolley problem depending on how we cast it.
Again, I feel as if you are making my argument for me. The problem is, as you say, either trivially obvious or cognate to the trolley problem depending on how we cast it.
You say my experiment is not really the same as Eliezer’s. Fine. It doesn’t matter, because we could just use your formulation. If utilitarians do not care how many people we divide N * K among, then those utilitarians should state that they would indeed allow T to happen no matter what the subject matter of the K portion is, as long as K > 1.
The thing is, thought experiments are supposed to illustrate something. Right now, your proposed thought experiment is illustrating “we have trouble articulating our thoughts about rape”, which is (1) obvious and (2) does not need most of the machinery in the thought experiment.