Perhaps I am one of the “sentimentally irrational,” but I would pick the 400 certain lives saved if it were a one-time choice, and the 500 @ 90% if it were an iterated choice I had to make over and over again. In the long run, the probabilities would take hold, and many more people would be saved. But for a single instance of an event never to be repeated? I’d save the 400 for certain.
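To make the “long run” intuition concrete (a quick sketch of the arithmetic; $n$ is my notation for the number of repetitions, not anything from the original post): the expected numbers saved per round are

$$E[\text{certain}] = 400, \qquad E[\text{gamble}] = 0.9 \times 500 = 450.$$

Over $n$ independent repetitions the gamble’s successes number $K \sim \mathrm{Binomial}(n, 0.9)$, and the gamble saves more people in total exactly when $500K > 400n$, i.e. when $K > 0.8n$. Since $E[K] = 0.9n$, the probability of falling short, $P(K \le 0.8n)$, shrinks toward zero as $n$ grows; but at $n = 1$ it is a full 10%.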
Your 80% and 90% figures don’t really add up either. You don’t say how many people in total stand to die, regardless of your decision. If the maximum possible death toll from this catastrophe is 500, then your point is valid. But what if it were 100 million, or, even better, all of humanity? Now the difference in your loved one’s chance of survival under either strategy is vanishingly small, and you are left with a 90% chance or a 100% chance of saving humanity as a whole. It’s exactly the same as the situation you describe above, but it seems the moral math reverses itself. You need to specify your hypothetical situations more fully if you wish to make a convincing point.
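To put rough numbers on “vanishingly small” (a sketch; $N$ is my stand-in for the total number at risk, and I’m assuming the loved one is equally likely to be any of them): the loved one’s survival chances under the two options are

$$P_{\text{certain}} = \frac{400}{N}, \qquad P_{\text{gamble}} = 0.9 \times \frac{500}{N} = \frac{450}{N}, \qquad \text{gap} = \frac{50}{N}.$$

At $N = 500$ these are exactly the 80% and 90% figures above. At $N = 10^8$ the gap is $5 \times 10^{-7}$, which is negligible next to the 90%-versus-100% difference in whether humanity survives at all.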
(same anon from above who asked about the context of the 400/500 problem being an issue)
In response to GreedyAlgorithm, who said:
Certainly finding out all of the facts that you can is good. But rationality has to work no matter how many facts you have. If the only thing you know is that you have two options:
Save 400 lives, with certainty
Save 500 lives, 90% probability; save no lives, 10% probability.

Then you should take option 2. Yes, more information might change your choice. Obviously. And not interesting. The point is that given this information, rationality picks choice 2.
While I agree with your constrained view of the problem and its analysis, you are trying to have your cake and eat it too. In such a freed-from-context view, this is (to use your own words) “not interesting”. It’s like asserting that “4.5 is greater than 4” and that, since we wish to pick the greater number, the rationalist picks 4.5. True as far as it goes, but trivial and of no consequence.
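(Spelled out, the analogy maps straight onto Greedy’s numbers: option 2’s expected lives saved are $0.9 \times 500 = 450$ against option 1’s $400$; divide both by 100 and you get my 4.5 and 4.)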
Eliezer brought in the idea of something more valuable than your own life, say that of your child. By stepping outside the cold, hard calculus of mere arithmetic comparisons, he made a good point (we are still discussing it), but he opened the door for me to do the same. I see your child, and raise you “all of humanity”.
Either we are discussing a tautological, uninteresting, degenerate case which reduces to “4.5 is greater than 4, so to be rational you should always pick 4.5” (which I agree with, but which is rather pointless), or we are discussing the more interesting question of the intersection between morality and rationality. In that case, I assert that bringing “extra” conditions into the problem matters very much.
If “rationality has to work no matter how many facts you have” [Greedy’s words] (which I agree with), then you must grant me that it should provide consistent results. To make the problem “interesting,” Eliezer brought in the “extra” personal stake of a loved family member and came to his rationalist conclusion, pointing out why you’d want to “take a chance,” given that you don’t know whether your daughter is in the certain group or might be saved as one of the “chance” group. I merely followed his example. His daughter may still be in the certain group or not (same situation), but I’ve just added everyone else’s daughter into the pot. I don’t see how these are fundamentally different cases, so rationality should produce the same answer, no?