I just want to say thanks to everyone for your comments; I now realize the obvious flaw in incorporating any extremely personal connection into a mathematical morality calculation. As BlueSun pointed out, that causes problems at whatever scale of pain is involved.
If you were faced with your Option 1 (save 400 lives) or Option 2 (save 500 lives with 90% probability), would you seriously take Option 2 if your loved ones were included in the 400? I wouldn’t. Faced with statistical people, I’d take Option 2 every time. But make Option 1 “save 3 lives, and those three lives are your kids” and Option 2 “save 500 statistical lives with 90% probability,” and I don’t think I’d hesitate to pick my kids.
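For what it’s worth, here is the expected-value arithmetic that comparison turns on, as a tiny and purely illustrative Python check (taking “save 500 lives with 90% probability” at face value):

```python
# Expected-value check for the two options above (illustrative only).
option_1 = 400           # lives saved with certainty
option_2 = 0.90 * 500    # expected lives saved = 450.0

print(option_1, option_2)  # 400 450.0 -- Option 2 wins on expected lives,
                           # so it's the personal attachment, not the arithmetic,
                           # that flips the choice in the kids version.
```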
I also learned not to grandstand on morality questions. Sorry about the “would you do it? really?” argument; I won’t do that again.
However, after rethinking the issue I still fall on the side of the dust specks, on the reasoning that the 3^^^3 individuals would probably be willing to suffer the dust specks to save someone from torture, while the tortured person likely wouldn’t be willing to be tortured to save others from dust specks.
I’m somewhat sympathetic to your position, but I’m curious:
1) Which side do you think you’d come down on if it were (3^^^3 / 1 billion) dust specks vs. (50 / 1 billion) years (≈ 1.6 seconds) of torture?
2) How about the same (3^^^3 / 1 billion) dust specks and (50 / 1 billion) years of torture, but with the dust specks divided among (3^^^3 / (1 billion)^2) people, so that each person receives 1 billion dust specks?
EDIT: I don’t think these questions were very clear about what I was getting at. Eliezer’s argument from Circular Altruism is along the lines of what I was going for, but much better developed:
But let me ask you this. Suppose you had to choose between one person being tortured for 50 years, and a googol people being tortured for 49 years, 364 days, 23 hours, 59 minutes and 59 seconds. You would choose one person being tortured for 50 years, I do presume; otherwise I give up on you.
And similarly, if you had to choose between a googol people tortured for 49.9999999 years, and a googol-squared people being tortured for 49.9999998 years, you would pick the former.
A googolplex is ten to the googolth power. That’s a googol/100 factors of a googol. So we can keep doing this, gradually—very gradually—diminishing the degree of discomfort, and multiplying by a factor of a googol each time, until we choose between a googolplex people getting a dust speck in their eye, and a googolplex/googol people getting two dust specks in their eye.
If you find your preferences are circular here, that makes rather a mockery of moral grandstanding.
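(A toy version of the chain Eliezer describes, with deliberately small stand-in numbers: ×10 per step instead of ×googol, and an assumed “severity drops by 1 per step” schedule, since the quote only says “gradually.”)

```python
# Toy version of the quoted chain argument, illustrative numbers only:
# each step multiplies the sufferers by 10 (standing in for a googol) and
# knocks the severity down by 1. Accepting every pairwise "fewer people,
# slightly worse each" preference and chaining them by transitivity walks
# you from (1 person, severity 10) down to (a huge crowd, dust-speck
# severity) -- which is the tension the quote is pointing at.
people, severity = 1, 10.0   # severity 10 ~ "50 years of torture" on this toy scale
for _ in range(9):
    next_people, next_severity = people * 10, severity - 1.0
    print(f"prefer ({people} people, severity {severity}) "
          f"over ({next_people} people, severity {next_severity})")
    people, severity = next_people, next_severity
# Ends at 10**9 people at severity 1.0, the "dust speck" end of the toy scale.
```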
Well, torture is highly nonlinear, so utility((50 years / billion) of torture) is much milder than utility(50 years of torture) / billion.
As for #2, you’re leaving the LCPW for the original problem. Dust specks are also nonlinear.
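A minimal sketch of that nonlinearity point, using an assumed convex disutility curve (badness(t) = t², purely illustrative, not anyone’s actual utility function):

```python
# Assumed convex disutility for torture: badness(t) = t**2 (illustrative only).
def badness(seconds: float) -> float:
    return seconds ** 2

FIFTY_YEARS_S = 50 * 365.25 * 24 * 3600    # about 1.58e9 seconds
BILLION = 10 ** 9

short  = badness(FIFTY_YEARS_S / BILLION)  # ~1.6 seconds of torture
scaled = badness(FIFTY_YEARS_S) / BILLION  # one-billionth of 50 years' badness

print(short, scaled, scaled / short)       # the ratio is a full factor of a billion
                                           # under this assumed curve
```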
Hmm, going back and reading Circular Altruism, Eliezer’s argument really seems to be predicated on linearity, doesn’t it?
EDIT: Oops, read it again. You guys are right, it’s not.
The main argument is predicated on linearity of probability. Probability is linear. I was pointing out the way in which the suggested comparison does not satisfy this.
It’s not.
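A minimal sketch of the distinction being drawn in this exchange, under an assumed (illustrative) nonlinear utility: expected utility is linear in the probabilities even when utility is not linear in the size of the outcome.

```python
import math

def u(x: float) -> float:
    """Assumed toy utility over outcome size -- nonlinear on purpose."""
    return math.sqrt(x)

def expected_u(p: float, x: float) -> float:
    """Lottery paying outcome x with probability p, nothing otherwise."""
    return p * u(x)

# Doubling the probability exactly doubles expected utility (linearity in p)...
print(expected_u(0.2, 100), expected_u(0.4, 100))   # 2.0 4.0
# ...but doubling the outcome size does not (u itself is nonlinear):
print(u(100), u(200))                                # 10.0 ~14.14
```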
I am basing my reasoning on the probable preferences of those involved, so my answer would depend on how the people involved feel about being dust-specked or tortured.
I’m not entirely clear on what exactly you are asking with number 1: are you just asking about 1.6 seconds of torture vs. (3^^^3 / 1 billion) dust specks? If so, I’m essentially indifferent; both seem fairly inconsequential as long as the torture only causes pain for those 1.6 seconds.
For number 2, a billion dust specks in succession would probably get to be fairly noticeable, so I’d rather get the 1.6 seconds of torture over with than deal with a constant annoyance; 1.6 seconds isn’t really enough time for it to be truly torturous (depending on what exactly the torture was).