No one ever understands just how friggin large 3^^^3 is.
One could safely argue that it is better for the entire current world population to suffer a dust speck each than for someone to get tortured for fifty years, but expand that to 3^^^3 people? Radically different story.
Would anyone challenge this? Doing the arithmetic, if dust specks cause one second of discomfort, having everyone on Earth get specked would be the equivalent of ~222 person-years of specking. If torture is at least 4.5 times as bad as specking on a per-second basis, even a straight total utilitarian calculation with constant badness per second would favor specks on that scale. That seems like a very, very easy case.
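A quick sanity check of that arithmetic, as a minimal Python sketch (the ~7 billion population figure is my own assumption, since it is roughly what the ~222 number implies):

    world_population = 7_000_000_000            # assumed current world population
    seconds_per_year = 365 * 24 * 3600          # 31,536,000
    speck_seconds = world_population * 1        # one second of discomfort per person
    print(speck_seconds / seconds_per_year)     # ~222 person-years of specking

    torture_seconds = 50 * seconds_per_year     # 1,576,800,000 seconds in 50 years
    print(speck_seconds / torture_seconds)      # ~4.44, the per-second break-even ratio

So if a torture-second is at least ~4.5 times as bad as a speck-second, the straight total comes out in favor of specks at world-population scale, as claimed.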
I verbally pronounce ^^^ as “trip-up”. I find this clarifies its meaning, and the practical effects of using it, considerably.
For what N would you say that {N specks better-than torture better-than N+1 specks}? If small quantities of utility or disutility have perfectly additive properties across groups of people, it should be simple to provide a concrete answer.
(Sidenote—there should be symbolic terminology for better-than and worse-than. “>” and “<” would just be confusing in this context.)
I don’t know the precise utility values of torture vs. dust specks, but I would reason that...
Getting one dust speck is around 1000x more preferable than being tortured for a second. There are 1,576,800,000 seconds in 50 years.
Thus, I place N roughly around 1,576,800,000,000.
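Spelling out that multiplication as a small sketch (the 1000:1 per-second ratio is the assumption from the comment above, not an established figure):

    seconds_per_year = 365 * 24 * 3600               # 31,536,000
    torture_seconds = 50 * seconds_per_year          # 1,576,800,000 seconds in 50 years
    specks_per_torture_second = 1000                 # assumed preference ratio
    N = torture_seconds * specks_per_torture_second
    print(f"{N:,}")                                  # 1,576,800,000,000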
Torture does not scale linearly with time. Indeed, I suspect even a simple exponential curve would understate the increase.
Wow. I thought you were going the other way with that one. The fiftieth year of torture is not nearly as damaging as the first.
That is interesting. But note that he was starting with a unit of 1 second of torture. 1 second of waterboarding is not 1/30th as distressing as 30 seconds of waterboarding. And 1 second of Chinese water torture, or the ice room, or simply isolation, is less disutility than a dust speck. Actually, the particular case of isolation is one where the 50th year probably is worse than the first year, assuming the victim has not already gone completely bonkers by that point.
If you have been torturing someone for 49 years and they are not already completely bonkers then you are probably doing something wrong!
I agree with this, but I have no idea how to accurately discount it, so I decided to go linear and overestimate.
I don’t have a better idea. I really don’t have enough knowledge about how to torture people at extreme levels over a long period given the possibility of sufficiently advanced technology.
I thought you were going somewhere else with the second sentence. My natural thought, after admitting that I can’t possibly understand how big 3^^^3 is, was that if one prefers torturing one person for 50 years, one should also prefer torturing the entire current world population for 50 years over what D227 called the Dust Holocaust.
Yes, of course. Or even as many people as there are atoms in the universe. That’s a minor difference when we are talking about numbers like 3^^^3.
Is it? 3^^^3 isn’t all that much of a ridiculous number. Larger than the number of atoms in the universe, certainly, but not so much so that certain people’s methods of non-linear valuations of disutility per speck couldn’t make that kind of difference matter. (I tend to prefer at least 3^^^^3 for my stupid-large-numbers.)
That’s quite a bit of an understatement. 3^^4 (~10^3638334640025) is already vastly larger than the number of atoms in the universe (~10^80), 3^^5 in turn is incomprehensibly larger again, and 3^^^3 = 3^^7625597484987.
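A minimal sketch of how those figures can be checked with logarithms (standard-library Python only; 3^^^3 itself is of course far beyond direct computation):

    import math

    # Knuth up-arrows: 3^^n is a tower of n threes, and 3^^^3 = 3^^(3^^3).
    t2 = 3 ** 3                  # 3^^2 = 27
    t3 = 3 ** t2                 # 3^^3 = 7,625,597,484,987
    # 3^^4 = 3 ** t3 is too large to evaluate, but its size is easy to estimate:
    log10_t4 = t3 * math.log10(3)
    print(f"3^^3 = {t3:,}")
    print(f"3^^4 ~ 10^{log10_t4:,.0f}")   # ~10^3,638,334,640,024, matching the figure above up to rounding
    # 3^^^3 = 3^^7,625,597,484,987: a tower of threes about 7.6 trillion levels high,
    # versus only ~10^80 atoms in the observable universe.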
80 orders of magnitude is an extremely narrow band for a balance point to fall into when one of the numbers involved is greater by so many orders of magnitude that it can’t even reasonably be expressed without a paradigm for writing big numbers about as strong as up-arrow notation. Hitting the space between one human and 10^80 humans would take truly extraordinary precision. (Or you make yourself money-pumpable by choosing different sides of the same deal when it is split into 10^80 sub-deals.)
Not necessarily. For a lot of people the limit of the disutility of the scenario, as the number of dust specks approaches infinity, is not even infinite. In such cases it is perfectly plausible—and even likely—that the scenario is considered worse than torturing one person but not as bad as torturing 10^80 people. (In which case the extra Knuth arrow obviously doesn’t help either.)
See the comment in the parentheses. Choosing torture over 3^^^^3 dust specks, but not over 3^^^3*10^-80 dust specks, takes extraordinary precision. Choosing one torture over 3^^^3*10^-80 dust specks but not 10^80 tortures over 3^^^3 dust specks implies inconsistent preferences.
Your comment in the parentheses (if you were referring to the one saying it requires you to be money-pumpable) was false, but I was letting it pass. If you are telling me to see my own comment in parentheses, which says about the same thing as your second sentence, then, well, yes, we are mostly in agreement about that part, albeit not quite to the same degree.
Just not true. It implies preferences in which 10^80 tortures is not 10^80 times worse than 1 torture. There isn’t anything inconsistent about valuing each additional instance of the same thing differently from the previous instance. In fact it is the usual case. It is also not exploitable—anything you can make an agent with those preferences do based on its own preferences will be something it agrees in hindsight is a good thing to do.
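For concreteness, here is a minimal sketch of what such non-linear preferences could look like. The saturating curve and its parameters are purely my own illustrative assumption, not anything proposed in this thread:

    import math

    def total_disutility(n, unit=1.0, cap=1e6):
        # Illustrative saturating aggregation: each additional instance adds
        # less disutility than the previous one, and the total approaches `cap`.
        return cap * (1.0 - math.exp(-n * unit / cap))

    one = total_disutility(1)        # ~1.0
    many = total_disutility(1e80)    # ~1e6 (the cap)
    print(many / one)                # ~1e6, nowhere near 1e80 times worse

An agent with a preference like this is not 10^80 times more averse to 10^80 tortures than to one torture, yet its ranking of outcomes stays perfectly well-defined.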
Can you explain how such a preference can be consistent? The total incidence of both torture and dust specks is unknown in either case. On what basis would an agent that trades one torture for avoiding 3^^^3*10^-80 dust specks refuse the same deal a second time? Or the 10^80th time? Given that 3^^^3*10^-80 people are involved, it seems astronomically unlikely that the rate of torture changed noticeably, even assuming only the knowledge available to the agent. In any case, 10^80 separate instances of the agent with no knowledge of each other would make the same deal 10^80 times, and they can’t complain about being deceived, since no information about the incidence of torture was assumed. Even assuming the agent makes the deal only a single time, consistency would then require that the agent prefer trading 3^^^3 dust specks for avoiding 10^80 instances of torture over trading 3^^^3*(1+10^-80) dust specks for 10^80+1 instances of torture, which seems implausible.
Where was this declared? (Not that it matters for the purpose of this point.) The agent has prior probabilities distributed over the possible incidence of torture and dust specks. It is impossible not to. And after taking one such deal those priors will be different. Sure, restricting access to information about the currently tortured population will make it harder for an agent to implement preferences that are not linear with respect to additional units, but it doesn’t make those preferences inconsistent, and it doesn’t stop the agent doing its best to maximise utility despite the difficulty.
There is no information on the total incidence of either included in the problem statement (other than the numbers used), and I have seen no one answer conditionally based on the incidence of either.
Yes, of course, I thought my previous comment clearly implied that?
Infinitesimally. I thought I addressed that? The problem implies the existence of an enormous number of people. Conditional on there actually being that many people, the expected number of people tortured shifts by the tiniest fraction of the total. If the agent is sensitive to such a tiny shift, we are back to requiring extraordinary precision.
I think that actually would be the case.
That is the crux of the problem. Bob understands what 3^^^3 is just as well as you claim to understand it. Yet he chooses the “Dust Holocaust”.
First let me assume that you, peter_hurford, are a “Torturer”, or rather, that you are from the camp that obviously chooses 50 years. I have no doubt in my mind that you bring extremely rational and valid points to this discussion. You are poking holes in Bob’s reasoning at its weakest points. This is a good thing.
I whole-heartedly concede that you make compelling points by poking holes in Bob’s reasoning. But let’s start poking around your reasoning now.
Omega has given you the choice to allow or disallow 10 rapists to rape someone. Why 10 rapists? Omega knows the absolute utility across all humans, and unfortunately, as terrible as it sounds, the suffering of 10 rapists who are not able to rape is greater than the suffering the victim feels. What do you do? 10 is far fewer than 3^^^3 suffering rapists. So lucky you: Omega need not burden you with the suffering of 3^^^3 people if you choose to have the rapists suffer. It is important that you not finagle your way out of the question. Please do not say that not being able to rape is not torture. Omega has already stated that these rapists do indeed suffer. It matters not whether you would suffer such a thing.
Disclaimer: I am searching for the truth through rationality. I do not care whether the answer is torture, specks, or rape, only that it is the truth. If the rational answer is rape, I can do nothing but accept that, for I am only in search of truth, not a truth that fits me.
There are implications to choosing rape as the right answer. It means that in a rational society we must allow bad things to happen if that bad thing results in less total suffering. We have to be consistent. Omega has given you a number of rapists far, far, far less than 3^^^3; surely you must allow the rape to occur.
Literally: peter_hurford walks into a room with 10 rapists and a victim. The rapists tell him to “go away, and don’t call the cops.” Omega appears and says: you may stop it if you want to, but I am all-knowing, and I know that the suffering the rapists experience from being deprived of raping is indeed greater than the suffering of the victim. What does peter do?
Edit: Grammar
You’ve essentially just constructed a Utility Monster. That’s a rather different challenge to utilitarian ethics than Torture vs. Dust Specks, though; the latter is meant to be a straightforward scope insensitivity problem, while the former strikes at total-utility maximization by constructing an intuitively repugnant situation where the utility calculations come out positive. Unfortunately it looks like the lines between them have gotten a little blurry.
I’m really starting to hate thought experiments involving rape and torture, incidentally; the social need to signal “rape bad” and “torture bad” is so strong that it often overwhelms any insight they offer. Granted, there are perfectly good reasons to test theories on emotionally loaded subjects, but when that degenerates into judging ethical philosophy mostly by how intuitively benign it appears when applied to hideously deformed edge cases, it seems like something’s gone wrong.
I will consider this claim if you can show me how it is really different.
I have taken considerable care to construct a problem in which we are indeed dealing with trading suffering for potentially more suffering. It does not affect me one bit that the topic has now switched from specks to rape. In fact, if “detraction” happens, shouldn’t it be the burden of the person who feels detracted to explain it? I merely ask for consistency.
In my mind I choose to affiliate with the “I do not know the answer” camp. There is no shame in that. I have not resolved the question yet. Yet there are people for whom it is obvious to choose torture, and who refuse to answer the rape question. I am consistent in that I claim not to know, or not to have resolved the question yet. May I ask for the same amount of consistency?
Torture vs. Dust Specks attempts to illustrate scope insensitivity in ethical thought by contrasting a large unitary disutility against a fantastically huge number of small disutilities. And I really do mean fantastically huge: if the experiences are ethically commensurate at all (as is implied by most utilitarian systems of ethics), it’s large enough to swamp any reasonable discounting you might choose to perform for any reason. It also has the advantage of being relatively independent of questions of “right” or “deserving”: aside from the bare fact of their suffering, there’s nothing about either the dust-subjects or the torture-subject that might skew us one way or another. Most well-reasoned objections to TvDS boil down to finding ways to make the two options incommensurate.
Your Ten Very Committed Rapists example (still not happy about that choice of subject, by the way) throws out scope issues almost entirely. Ten subjects vs. one subject is an almost infinitely more tractable ratio than 3^^^3 vs. one, and that allows us to argue for one option or another by discounting one of the options for any number of reasons. On top of that, there’s a strong normative component: we’re naturally much less inclined to favor people who get their jollies from socially condemned action, even if we’ve got a quasi-omniscient being standing in front of us and saying that their suffering is large and genuine.
Long story short, about all these scenarios have in common is the idea of weighing suffering against a somehow greater suffering. Torture vs. Dust Specks was trying to throw light on a fairly specific subset of scenarios like that, of which your example isn’t a member. Nozick’s utility monster, by contrast, is doing something quite a lot like you are, i.e. leveraging an intuition pump based on a viscerally horrible utilitarian positive. I don’t see the positive vs. negative utility distinction as terribly important in this context, but if it bothers you you could easily construct a variant Utility Monster in which Utilizilla’s terrible but nonfatal hunger is temporarily assuaged by each sentient victim or something.
I do sincerely apologize if you are offended, but rape is torture as well, and Eliezer’s example can be equally reprehensible, if not more so.
Why I chose 10 is simple: it is to highlight the paradox faced by those who choose torture. I have made it easier for you. Let’s say we increase 10 to 3^^^3 deprived rapists. The point is: if you surely would not let the victim be raped when there are 3^^^3 deprived rapists suffering, then you surely would not allow it to happen when it is only 10 suffering rapists. So with that said, how is it different?
I just went over how the scenarios differ from each other in considerable detail. I could repeat myself in grotesque detail, but I’m starting to think it wouldn’t buy very much for me, for you, or for anyone who might be reading this exchange.
So let’s try another angle. It sounds to me like you’re trying to draw an ethical equivalence between dust-subjects in TvDS and rapists in TVCR: more than questionable in real life, but I’ll grant that level of suffering to the latter for the sake of argument. Also misses the point of drawing attention to scope insensitivity, but that’s only obvious if you’re running a utilitarian framework already, so let’s go ahead and drop it for now. That leaves us with the mathematics of the scenarios, which do have something close to the same form.
Specifically: in both cases we’re depriving some single unlucky subject of N utility in exchange for not withholding N*K utility divided up among several subjects for some K > 1. At this level we can establish a mapping between both thought experiments, although the exact K, the number of subjects, and the normative overtones are vastly, sillily different between the two.
Fine so far, but you seem to be treating this as an open-and-shut argument on its own: “you surely would not let the victim [suffer]”. Well, that’s begging the question, isn’t it? From a utilitarian perspective it doesn’t matter how many people we divide up N*K among, be it ten or some Knuth up-arrow abomination, as long as the resulting suffering can register as suffering. The fewer slices we use, the more our flawed moral intuitions take notice of them and the more commensurate they look; actually, for small numbers of subjects it starts to look like a choice between letting one person suffer horribly and doing the same to multiple people, at which point the right answer is either trivially obvious or cognate to the trolley problem depending on how we cast it.
About the only way I can make sense of what you’re saying is by treating the N case—and not just for the sake of argument, but as an unquestioned base assumption—as a special kind of evil, incommensurate with any lesser crime. Which, frankly, I don’t. It all gets mapped to people’s preferences in the end, no matter how squicky and emotionally loaded the words you choose to describe it are.
I agree with this statement 100%. That was the point of my TvCR thought experiment. People who obviously picked T should again pick T. No one except one commenter actually conceded this point.
Again, I feel as if you are making my argument for me. The problem is, as you say, either trivially obvious or cognate to the trolley problem depending on how we cast it.
You say my experiment is not really the same as Eliezer’s. Fine. It doesn’t matter, because we could just use your example. If utilitarians do not care how many people we divide N*K among, then these utilitarians should state that they would indeed allow T to happen no matter what the subject matter of the K is, as long as K > 1.
The thing is, thought experiments are supposed to illustrate something. Right now, your proposed thought experiment is illustrating “we have trouble articulating our thoughts about rape” which is (1) obvious and (2) does not need most of the machinery in the thought experiment.
My very first reaction would be to say that you’ve stated a counterfactual… rape will never directly produce more utility than disutility. So the only way it could be moral is if, somehow, unbeknownst to us, this rape will somehow prevent the next Hitler from rising to power in some butterfly-effect-y way that Omega knows of.
I have to trust Omega if he’s by definition infallible. If he says the utility is higher, then we still maximize it. It’s like you’re asking “Do you do the best possible action, even if the best possible action sounds intuitively wrong?”