This isn’t a hard problem at all. I would push someone else onto the tracks (in the idealized, hypothetical trolley problem) but I wouldn’t jump. The reason is that pushing the guy onto the tracks isn’t about doing the Right Thing™; it’s about getting what I want. I want as many people as possible to live, but I care about my own life a lot more than the lives of small numbers of other people. It shouldn’t be too hard to predict my answers to each of your variants based on this.
I would take no action in any real-life trolley problem unless there were a lot more lives at stake, because I care much more about not getting convicted of murder than I do about a mere 4 expected lives of people I don’t know, and I think my chances of avoiding a conviction are better if I do nothing.
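To spell out the arithmetic behind that “4 expected lives” (a minimal sketch; every number below is an invented placeholder, not a figure from this thread): acting kills one person to save five, a net gain of four lives, and the claim is that a serious risk of a murder conviction swamps that gain.

```python
# Hypothetical expected-value comparison for a real-life trolley case.
# Every number here is an invented placeholder; only the structure
# of the comparison is the point.

p_conviction = 0.5       # assumed chance of a murder conviction if I act
u_stranger_life = 1.0    # value placed on one stranger's life
u_conviction = -100.0    # value placed on getting convicted of murder

# Acting kills 1 to save 5: a net of 4 expected lives, plus conviction risk.
u_act = 4 * u_stranger_life + p_conviction * u_conviction
u_do_nothing = 0.0

print("act" if u_act > u_do_nothing else "do nothing")  # -> do nothing
```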
The reason is that pushing the guy onto the tracks isn’t about doing the Right Thing™; it’s about getting what I want. I want as many people as possible to live

That feeling of wanting to help people is what is referred to as “morality”.
No, it is referred to as altruism. Morality is a fuzzy grouping of concepts around aggregate preferences.
Well, it’s at the very least a part of morality, anyway.
Yeah, altruism is a more precise term, but it sounds less … punchy than “morality”.
I don’t believe you:
What would you do in the dice-roll cases?

I would cling to the small chance of living until that chance gets extremely tiny. I can’t pinpoint how tiny it would have to be because I’m a human and humans suck at numbers.

What do you currently do in the dice-roll cases, like driving or crossing the road?

I don’t do any sophisticated calculations. I just try to avoid accidents. What are you trying to get from my answer to that question?

What number is not small? What number of people you care about is not small?

I would sacrifice myself to prevent the entire human civilization from collapsing. I would not sacrifice myself to save 1000 other people. That leaves quite a large range, and I haven’t pinned down where the break-even point is. Deciding whether or not to sacrifice myself to save 10^5 other people is a lot harder than deciding whether or not to sacrifice myself to save 5 other people.

Assume that there is no danger of anyone finding out, or that the judge is a perfect utilitarian, so this is not a consideration.

I already said that I would kill one person to save five in the idealized trolley problem. My point was that if something like the trolley problem actually happened to me, it would not be the idealized trolley problem, and those assumptions you mention are false in real life, so I would not assume them while making my decision.
Edit: It’s worth pointing out that people face opportunities to sacrifice their own welfare for others at much better than 1000:1 ratios all the time, and no one takes them except for a few weirdos like Toby Ord.
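The break-even question in the exchange above has a simple shape: with v_self for the value I put on my own life and v_stranger for the value I put on a stranger’s, sacrificing myself wins exactly when N · v_stranger > v_self. A toy sketch, assuming a purely hypothetical ratio chosen only to be consistent with “no at 1000, yes at civilization scale”:

```python
# Break-even sketch for "sacrifice myself to save N strangers".
# The self-to-stranger ratio is a made-up placeholder, chosen only to be
# consistent with "no at 1,000, yes at civilization scale" above.

SELF_TO_STRANGER_RATIO = 10**6  # hypothetical: my life ~ 10^6 strangers' lives

def would_sacrifice(n_saved: int) -> bool:
    """Sacrifice iff the lives saved outweigh the value of my own life."""
    return n_saved > SELF_TO_STRANGER_RATIO

for n in (5, 1_000, 10**5, 10**10):
    print(n, would_sacrifice(n))  # False, False, False, True
```

The sketch only shows the shape of the comparison; where the real ratio sits is exactly the range the commenter says they haven’t pinned down.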
http://www.smbc-comics.com/index.php?db=comics&id=2980#comic