I would cling to the small chance of living until that chance gets extremely tiny. I can’t pinpoint how tiny it would have to be because I’m a human and humans suck at numbers.
What do you currently do in the dice-roll cases, like driving or crossing the road?
I don’t do any sophisticated calculations. I just try to avoid accidents. What are you trying to get from my answer to that question?
What number would not be small? And what number of people you care about would not be small?
I would sacrifice myself to prevent the entire human civilization from collapsing, but I would not sacrifice myself to save 1000 other people. That leaves quite a large range, and I haven’t pinned down where the break-even point is. Deciding whether or not to sacrifice myself to save 10^5 other people is a lot harder than deciding whether or not to sacrifice myself to save 5 other people.
Assume that there is no danger of anyone finding out, or that the judge is a perfect utilitarian, so this is not a consideration.
I already said that I would kill one person to save five in the idealized trolley problem. My point was that if something like the trolley problem actually happened to me, it would not be the idealized trolley problem, and those assumptions you mention are false in real life, so I would not assume them while making my decision.
Edit: It’s worth pointing out that people face opportunities to sacrifice their own welfare for others at much better than 1000:1 ratios all the time, and no one takes them except for a few weirdos like Toby Ord.
http://www.smbc-comics.com/index.php?db=comics&id=2980#comic