I think that violates the spirit of the thought experiment. The point of the dust speck is that it is a fleeting, momentary discomfort with no consequences beyond itself. So if you multiply the choice by a billion, I would say that the billion dust specks should aggregate in a way that they don't pile up and "completely shred one person": for example, each person gets one dust speck per week. This doesn't help solve the dilemma, at least for me.
Ok, then it doesn't solve torture vs. dust specks. But it does solve many analogous problems, such as 0.5 seconds of torture for many people vs. 50 years of torture for one person.
I touched on the idea here: http://lesswrong.com/lw/1d5/expected_utility_without_the_independence_axiom/
But it’s important to note that there is no analogue to that in population ethics. I think I’ll make a brief post on that.