Another dilemma where the same dichotomy applies is torture vs. dust specks. One might reason that torturing one person 50 years is better than torturing 100 people infinitesimally less painfully for 50 years minus one second, and that this is better than torturing 10,000 people very slightly less painfully for 50 years minus two seconds… and at the end of this process accept the unintuitive conclusion that torturing someone for 50 years is better than having a huge number of people suffer a tiny pain for a second (differential thinking). Or one might refuse to accept the conclusion and decide that one of these apparently unproblematic differential comparisons is in fact wrong (integral thinking).
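To make the differential chain concrete, here is a minimal sketch of how those steps compound. The ratio of 100 per link and the per-step pain decay of 2e-8 are my own illustrative assumptions; only the 50-year duration comes from the thought experiment.

```python
# Sketch of the "differential" chain: each link multiplies the number of
# sufferers by 100, shaves one second off the duration, and lowers the
# per-second pain by one part in fifty million. The ratio (100) and the
# decay (2e-8) are illustrative assumptions, not from the original post.

SECONDS_IN_50_YEARS = 50 * 365 * 24 * 60 * 60   # ~1.58e9

links = SECONDS_IN_50_YEARS - 1   # walk from 50 years down to 1 second
decay_per_link = 2e-8             # "infinitesimally less painfully"

final_pain = (1 - decay_per_link) ** links   # ~2e-14: speck-level
population_log10 = 2 * links                 # final population is 100**links

print(f"links in the chain:    {links:,}")
print(f"final per-second pain: {final_pain:.1e} (vs. 1.0 at the start)")
print(f"final population:      10**{population_log10:,} people")
```

Each individual link looks acceptable on its own; it is only the composition of roughly 1.6 billion such links, plus transitivity, that delivers the torture conclusion.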
(nods) That said, “integral thinking” is difficult to apply consistently to thought-experimental systems as completely divorced from anything like my actual life as TvDS.
I find in practice that when I try, I mostly just end up ignoring the posited constraints of the thought-experimental system—what is sometimes called “fighting the hypothetical” around here.
For example, when I try to apply “integral thinking” to TvDS to reject the unintuitive conclusion, I end up importing intuitions developed from life in a world with a conceivable number of humans, where my confidence that the suffering I induce will alleviate a greater amount of suffering elsewhere is pretty low, into a thought-experimental world with an inconceivable number of humans, where that confidence is stipulated to be extremely high.
Torture vs. dust specks has other features—in particular, the fact that “torture” is clearly the right option under aggregation (if you expect to face the same problem 3^^^3 times).
The “clearly” is not at all clear to me; could you explain?
Yes, I did underspecify my answer. Let’s assume that a billion dust specks will completely shred one person.
Then if you have a fixed population (the key assumption) of 3^^^3 people and face the same decision a billion times, you have a choice between a billion tortures and 3^^^3 deaths.
If you want to avoid comparing different kinds of negatives, figure out how many dust-speck impacts (and at what rate) are equivalent, pain-wise, to 50 years of torture, and apply a similar argument.
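A minimal sketch of that aggregation arithmetic, assuming the fixed population, the billion repetitions, and the billion-speck lethality threshold stated above (3^^^3 itself is kept as a symbolic label, since it is far too large to compute with):

```python
# Aggregation arithmetic for the repeated choice, under the assumptions
# stated above: a fixed population of 3^^^3 people, the same decision
# faced a billion times, and a billion specks sufficing to shred a person.

ROUNDS = 10**9                # times the same decision is faced
LETHAL_SPECKS = 10**9         # specks assumed to completely shred a person

# Policy 1: choose torture every round.
total_tortures = ROUNDS       # a billion people tortured for 50 years each

# Policy 2: choose specks every round. The SAME population is hit each
# time, so every person accumulates one speck per round.
specks_per_person = ROUNDS
population_shredded = specks_per_person >= LETHAL_SPECKS

print(f"always torture: {total_tortures:,} tortures, no deaths")
print(f"always specks:  {specks_per_person:,} specks per person "
      f"-> all 3^^^3 people die: {population_shredded}")
```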
I think that violates the spirit of the thought experiment. The point of the dust speck is that it is a fleeting, momentary discomfort with no consequences beyond itself. So if you multiply the choice by a billion, I would say the billion dust specks should be spread out so that they don’t pile up and “completely shred one person”—e.g., each person gets one dust speck per week. This doesn’t help solve the dilemma, at least for me.
OK, then it doesn’t solve torture vs. dust specks. But it does solve many analogous problems: 0.5 seconds of torture for many people vs. 50 years for one person, for example.
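To spell out that analogous case under the same fixed-population, repeated-choice reading as above (the round count is simply 50 years expressed in half-seconds; everything else follows the assumptions already stated):

```python
# The analogous problem under the same fixed-population, repeated-choice
# reading as above. All numbers follow from the stated durations.

SECONDS_IN_50_YEARS = 50 * 365 * 24 * 60 * 60    # ~1.58e9
ROUNDS = SECONDS_IN_50_YEARS * 2                 # 50 years in half-seconds

# Always pick "0.5 s for everyone": each of the 3^^^3 people accumulates
# ROUNDS * 0.5 s, i.e. a full 50 years of torture.
per_person_seconds = ROUNDS * 0.5

# Always pick "50 years for one": ROUNDS people tortured, the rest untouched.
tortured_people = ROUNDS

print(f"rounds: {ROUNDS:,}")
print(f"always-everyone: {per_person_seconds:,.0f} s (= 50 years) "
      f"of torture per person, for all 3^^^3 people")
print(f"always-one: {tortured_people:,} people tortured for 50 years each")
```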
I touched on the idea here: http://lesswrong.com/lw/1d5/expected_utility_without_the_independence_axiom/
But it’s important to note that there is no analogue to that in population ethics. I think I’ll make a brief post on that.
I think it’s an excellent example of differential vs. integral thinking, and of the Sorites paradox.