When you, personally, decide between your future containing a dust speck at an unknown moment and some alternative, the value of that dust speck won't be significantly affected by the probability of it causing trouble, if that probability is low enough.
You could replace a dust speck with a 1 in 3^^^3/1000 probability of being tortured for 50 years, so that it's a choice between 3^^^3 people each having a 1 in 3^^^3/1000 probability of being tortured, and one person being tortured with certainty; or, derandomizing, a choice between 1000 people tortured and one person tortured. That one person had better be really special, for the proximity effect to elevate them above all those other people.
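To spell out the arithmetic behind the derandomizing step (writing $N$ for 3^^^3, a label introduced here just for brevity): the expected number of people tortured under the gamble is
$$N \cdot \frac{1000}{N} = 1000,$$
so in expectation the randomized setup is exactly the choice between 1000 people tortured and one person tortured.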
The proximity effect, as described in the post, makes your “derandomizing” step invalid.
It can't be invalid: just replace the initial rule with this: of all 3^^^3 people, a random selection of 1000 will be made, who are to be tortured. Given this rule, each individual has about a 1 in 3^^^3/1000 probability of getting selected for torture, which is presumably an even better deal than a certain speck. This is compared to choosing one person to torture with certainty. The proximity effect may say that those 1000 people are far away and so of little importance, which I mentioned in the comment above. I don't think the choice of saving one known person over a thousand ridiculously-far-away people is necessarily incorrect, though.
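As a check of that quoted probability (again writing $N$ for 3^^^3): under a uniformly random selection of 1000 people out of $N$, a fixed individual is selected with probability
$$\frac{\binom{N-1}{999}}{\binom{N}{1000}} = \frac{1000}{N},$$
i.e. exactly 1 in $N/1000$.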
Yes, this way is correct. I thought you implied the 1000 people were close, not far away.
Sure, makes sense. I imagine the probability of the consequences I'm hypothesizing is much less than 1 in 3^^^3/1000, though, which makes the dust specks still worse.