Suppose some fraction of the 3^^^3 dropped out. How many dust specks would you be willing to take? Two? Ten? A thousand? A million? A billion? That's half a millimeter in diameter, and we're only at 10^9. How about 10^12? 10^15? 10^18? Now we're around half a meter in diameter, approaching or exceeding the size of a football, and we've not even reached 3^^4. Remember that 3^^^3 = 3^^(3^^3) = 3^^7,625,597,484,987.
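For concreteness, here is a quick back-of-the-envelope check of those diameters and of the tower itself. The ~0.5 micron per-speck diameter is my own assumption for illustration, not something stated above; the scaling just takes the cube root of the count, since volume adds linearly.

```python
# Sketch only: the 0.5 micron speck diameter is an assumed figure.
SPECK_DIAMETER_M = 0.5e-6  # assumed diameter of one dust speck, in meters

def combined_diameter(n_specks: int) -> float:
    """Diameter of a single ball with the combined volume of n specks."""
    return SPECK_DIAMETER_M * n_specks ** (1 / 3)

print(combined_diameter(10**9))   # ~5e-4 m: about half a millimeter
print(combined_diameter(10**18))  # ~0.5 m: roughly football-sized

# The tower: 3^^3 = 3^(3^3) = 3^27
print(3 ** 3 ** 3)  # 7625597484987, so 3^^^3 = 3^^7,625,597,484,987
```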
What, you think that all of the 3^^^3 will go for it? All of them, chipping in to save one person who was getting 50 years of torture? In a universe with 3^^^3 people in it, how many people do you think are being tortured? Our planet has had around 10^11 human beings in history. If we say that only one of those 10^11 people was ever tortured for 50 years in history, or even that there was only a one-in-a-thousand chance of that, a rate of one in 10^14, how many people would be tortured for 50 years among the more than 3^^^3 we are positing? And do you think that all 3^^^3 will choose the same one you did?
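A rough sketch of why dividing by 10^14 makes no dent at this scale: even 3^^4 = 3^(3^^3) already has trillions of digits, and 3^^^3 is unimaginably larger still, so knocking 14 digits off it leaves a number that is, for all practical purposes, still 3^^^3.

```python
import math

# Number of decimal digits of 3^^4 = 3^7,625,597,484,987 is roughly
# the exponent times log10(3).
digits_of_3_tetrated_4 = 7_625_597_484_987 * math.log10(3)
print(f"{digits_of_3_tetrated_4:.3e}")  # ~3.6e12 digits

# Dividing by 10^14 removes a mere 14 of those ~3.6 trillion digits,
# and 3^^^3 dwarfs 3^^4, so 3^^^3 / 10^14 is effectively still 3^^^3.
```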
Would you consider, then, that perhaps one dust speck is a bit much to pay to save one part in 3^^^3 of a victim?
When multiple agents coordinate, their decision delivers the whole outcome, not a part of it. Whatever you decide, everyone who reasons similarly will decide as well. Thus you have full control over which outcome to bring about, even if you are only one of a gazillion like-minded voters.
Here, you decide whether to save one person at the cost of harming 3^^^3 people. This is not equivalent to saving 1/3^^^3 of a person at the cost of harming one person, because saving 1/3^^^3 of a person is not something that could actually happen; it is at best a utilitarian simplification, which you must make explicit rather than confuse with a decision-theoretic construction.
If it were a one-shot deal with no cheaper alternative, I could see agreeing. But that still leaves the other 3^^^3/10^14 victims, and this won't scale to deal with them.