It seems to me that the idea of a critical threshold of suffering might be relevant. Most dust-speckers seem to maintain that a dust speck is always a negligible effect, a momentary discomfort that is immediately forgotten. But in a sufficiently large group of randomly selected people, low-probability situations in which a dust speck is critical will arise. For example, the speck could distract someone operating a moving vehicle, leading to a crash, or it could be one more frustration for an individual already deeply frustrated, leading to an outburst. Each conditional in these hypotheticals is improbable, but the combined odds against any one of them are surely nowhere near 1 in 3^^^3, which means that among 3^^^3 people a great many of them will almost certainly occur. Under this interpretation, the torture is the obvious winner.
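To put a rough number on that last step: a minimal expected-value sketch, where the per-speck cascade probability $p = 10^{-9}$ is a purely illustrative assumption of mine, not anything given in the problem:

\[
% p = 10^{-9} is an assumed, illustrative per-speck probability of a critical cascade
\mathbb{E}[\text{critical incidents}] = p \cdot 3\uparrow\uparrow\uparrow 3 = 10^{-9} \cdot 3\uparrow\uparrow\uparrow 3 \gg 1000
\]

Any fixed $p$ that is not itself of order $1/3\uparrow\uparrow\uparrow 3$ yields an astronomically large expected number of crashes and outbursts.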
If cascading consequences are ruled out, however, I’ll have to think some more.
When you, personally, decide between your future containing a dust speck at some unknown moment and some alternative, the value of that dust speck won’t be significantly affected by the probability of it causing trouble, if that probability is low enough.
You could replace a dust speck with a 1 in 3^^^3/1000 probability of being tortured for 50 years, so that it’s a choice between 3^^^3 people each facing a 1 in 3^^^3/1000 probability of being tortured and one person being tortured with certainty, or, derandomizing, a choice between 1000 people tortured and one person tortured. That one person had better be really special for the proximity effect to elevate them above all those other people.
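Spelling out the derandomizing arithmetic (the 1 in 3^^^3/1000 figure is the one stated above; written as a fraction it is 1000/3^^^3):

\[
% 3^^^3 people, each with probability 1000 / 3^^^3 of being tortured
\mathbb{E}[\text{people tortured}] = 3\uparrow\uparrow\uparrow 3 \cdot \frac{1000}{3\uparrow\uparrow\uparrow 3} = 1000
\]

So in expectation the randomized option is equivalent to 1000 people tortured, against one person tortured with certainty.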
The proximity effect, as described in the post, makes your “derandomizing” step invalid.
It can’t be invalid: just replace the initial rule with this one: of all 3^^^3 people, a random selection of 1000 will be made who are to be tortured. Given this rule, each individual has about a 1 in 3^^^3/1000 probability of being selected for torture, which is presumably an even better deal than a certain speck. This is compared against choosing one person to torture with certainty. The proximity effect may say that those 1000 people are far away and so of little importance, which I mentioned in the comment above. I don’t think the choice of saving one known person over a thousand ridiculously-far-away people is necessarily incorrect, though.
Yes, this way is correct. I thought you implied the 1000 people were close, not far away.
Sure, makes sense. I imagine the consequences I’m hypothesizing are far more likely than 1 in 3^^^3/1000, though, which makes the dust specks still the worse option.
For the original formulation of the problem, assume no cascading consequences and replace “dust speck” with “minimal non-negligible amount of suffering” as in the first point of the post.