Yes, I did underspecify my answer. Let’s assume that a billion dust specks will completely shred one person.
Then if you have a fixed population (key assumption) of 3^^^3 people and face the same decision a billion times, you have a choice between a billion tortures and 3^^^3 deaths.
If you want to avoid comparing different kinds of negatives, figure out how many dust-speck impacts (and at what rate) are equivalent, pain-wise, to 50 years of torture, and apply a similar argument.
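A minimal worked sketch of that aggregation arithmetic, under the stated assumptions (a billion specks suffice to shred one person, and the same fixed population of 3^^^3 people is hit by each of the billion decisions):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Assumptions (from the comment above): 10^9 specks shred one person;
% the same 3^^^3 people are affected by each of the N = 10^9 decisions.
Choosing specks every time:
\[
  \underbrace{N \times 1}_{\text{specks per person}} = 10^{9}
  \;\Longrightarrow\;
  \text{all } 3\uparrow\uparrow\uparrow 3 \text{ people are shredded.}
\]
Choosing torture every time:
\[
  \underbrace{N \times 1}_{\text{tortures per decision}} = 10^{9}
  \text{ people tortured for 50 years.}
\]
So the iterated choice aggregates to $10^{9}$ tortures versus
$3\uparrow\uparrow\uparrow 3$ deaths.
\end{document}
```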
I think that violates the spirit of the thought experiment. The point of the dust speck is that it is a fleeting, momentary discomfort with no consequences beyond itself. So if you multiply the choice by a billion, I would say that the billion dust specks should be spread out so that they don’t pile up and “completely shred one person”, e.g., each person gets one dust speck per week. This doesn’t help solve the dilemma, at least for me.
OK, then it doesn’t solve torture vs. dust specks. But it does solve many analogous problems, such as 0.5 seconds of torture for many people vs. 50 years of torture for one person.
I touched on the idea here: http://lesswrong.com/lw/1d5/expected_utility_without_the_independence_axiom/
But it’s important to note that there is no analogue to that in population ethics. I think I’ll make a brief post on that.