Especially if you call yourself a utilitarian, as many folks here do, how can you not push?
Some are utilitarian. Most are consequentialist with some degree of altruistic preference.
Have your answer?
Flip. Push. (All else being unrealistically equal.)
Good. Now comes the third, final, and hardest question; especially for anybody who said they’d push the fat man. There is still no switch or alternate track. The trolley is still coming down the tracks, and there are still five people tied to them. You are still standing on a bridge over the tracks. But this time you’re alone, and the only way to stop the trolley is by jumping in front of it yourself. Do you jump?
No. I don’t want to kill myself. I would rather the victims of the psychopath lived than died, all else being equal. But I care about my own life more than 5 unknown strangers. The revealed preferences of the overwhelming majority of other humans are similar. The only way this question is ‘hard’ is that it could take some effort to come up with answers that sound virtuous.
If you said yes, you would push the fat man; but you won’t jump. Why?
I’m not a utilitarian. I care more about my life than about the overwhelming majority of combinations of 5 other people. There are exceptions. People I like or admire and people who are instrumentally useful for contributing to the other altruistic causes I care for. Those are the groups of 5 that I would be willing to sacrifice myself for.
Do you have a moral obligation to jump in front of the train?
No. (And anyone who credibly tried to force that moral obligation onto me or those I cared about could be considered a threat and countered appropriately.)
If you have a moral obligation to push someone else, don’t you have a moral obligation to sacrifice yourself as well?
No. That doesn’t follow. It is also an error to equivocate between “I would push the fat man” and “I concede that I have a moral obligation to push the fat man”.
or if you won’t sacrifice yourself, how can you justify sacrificing someone else?
Exactly the same way you would justify sacrificing someone else if you would sacrifice yourself.
Do you have your answers? Are you prepared to defend them?
Defend them? Heck no. I may share my answers with someone who is curious. But defending them would imply that my decision to not commit suicide to save strangers somehow requires your permission or agreement.
But be forewarned, in part 2 I’m going to show you an actual, non-hypothetical scenario where this problem becomes very real; indeed a situation I know many LessWrong readers are facing right now; and yes, it’s a matter of life and death.
So you had a specific agenda in mind. I pre-emptively reject whatever demands you are making of me via this style of persuasion and lend my support to anyone else who is morally pressured toward martyrdom.
You know, most people have a point in mind when they start writing something. It’s not some sort of underhanded tactic.
Also, your own life by definition has greater instrumental value than others’ because you can affect it. No non-virtuous-sounding preferences required; certainly no trying to go from “revealed preferences” to someone’s terminal values because obviously everyone who claims to be akratic or, y’know, realizes they were biased and acts to prevent it is just signalling.
You know, most people have a point in mind when they start writing something. It’s not some sort of underhanded tactic.
Not something I claimed. I re-assert my previous position. I oppose the style of persuasion used in the grandparent. Specifically, the use of a chain of connotatively-fallacious rhetorical questions.
Also, your own life by definition has greater instrumental value than others’ because you can affect it.
That is:
Not something that follows by definition.
Plainly false as a general claim. There are often going to be others who happen to have more instrumental value for achieving many of one’s goals for influencing the universe. For example, if someone cares a lot about the survival of humanity (i.e. more than about selfish goals), then the lives of certain people involved in combating existential risk are likely to be more instrumentally useful to that someone than their own.
Not something I claimed. I re-assert my previous position. I oppose the style of persuasion used in the grandparent. Specifically, the use of a chain of connotatively-fallacious rhetorical questions.
That’s a lovely assertion and all, but I wasn’t responding to it, sorry. (I didn’t find the questions all that fallacious, myself; just a little sloppy.) Immediately before that statement you said “So you had a specific agenda in mind.”
It was this, and the (perceived?) implications in light of the context, that I meant to reply to. Sorry if that wasn’t clear.
There are often going to be others who happen to have more instrumental value for achieving many of one’s goals for influencing the universe. For example, if someone cares a lot about the survival of humanity (i.e. more than about selfish goals), then the lives of certain people involved in combating existential risk are likely to be more instrumentally useful to that someone than their own.
Oh, come on. I didn’t say it was more instrumentally valuable than any conceivable other resource. It has greater instrumental value than other lives. Individual lives may come with additional resources based on the situation.
That’s like responding to the statement “guns aren’t instrumentally useful for avoiding attackers because you’re more likely to injure yourself than an attacker” with “but what if that gun was the only thing standing between a psychopath and hundreds of innocent civilians? What if it was a brilliant futuristic gun that knew not to fire unless it was pointing at a certified Bad Person? It would be useful then!”
If someone says something that sounds obviously wrong, maybe stop and consider that you might be misinterpreting it? Principle of charity and all that.
(I really hope I don’t turn out to have misinterpreted you, that would be too ironic.)
I didn’t find the questions all that fallacious, myself; just a little sloppy.
A complementary explanation to the ones I have already given you is that this post is optimised for persuading people like yourself, not people like me. I prefer a state where posts use styles of reasoning more likely to be considered persuasive by people like myself. As such, I oppose this post.
Why are you against diversity?! We should have posts for both people-like-you and people-like-me! Stop trying to monopolise LessWrong, people-like-wedrifid!!
EDIT: This has been a joke. We now return you to your regularly scheduled LessWrong.
I’m a bit puzzled by this. If you care about yourself more than the five strangers, then why push or flip?
Pushing is going to get you prosecuted for murder in most jurisdictions, and could even attract a death sentence in some of them. Flipping is less clear: you could get a manslaughter charge, or be sued by the family of the one tied to the alternate track. The five you saved might decide to contribute to your defence fund, but good luck with that.
Or suppose you construct the hypothetical so there is no legal comeback: still, why do you want to push a fat man off a bridge? It takes energy, you could pull a muscle, he could notice and hit back or pull you over too etc. etc.
You’re being either really blind or deliberately obtuse. Caring more about your life than the lives of five strangers doesn’t mean you care infinitely more about yourself than you do about them. Maybe you’ll pull a muscle flipping the switch? It’s entirely legitimate to say that you’ll take some costs upon yourself to do a big favor for 5 strangers without being willing to take the ultimate cost upon yourself.
Apologies. You are quite right, I was indeed being “really blind” and pretty obtuse as well (though not deliberately so). I’ve now spotted that the original poster explicitly said to ignore all chances that the fat man would fight back, and presumably that extends to other external costs, such as retaliation by his relatives, the law, etc. My bad.
I’ve also commented on this further down this thread. I now find my moral intuitions behaving very strangely in this scenario. I strongly suspect that my original intuitions were very closely related to all these knock-on factors which I’ve now been asked to ignore.
No, I was pointing out that in all realistic ways of constructing the hypothetical, there are going to be quite major risks and costs to oneself in pushing the fat man: an obvious one being that he easily could fight back. This may indeed be one of the factors behind different moral intuitions. (We have no instincts about the cost-to-self of flipping a switch: although that could also be very high in the modern world, it takes some thinking to realise it.)
For what it’s worth, my own answers are “no flip, no push and no jump” for precisely such reasons: all too risky to self. Though if I had family members or close friends on the lines, I’d react differently. If there were a hundred or a thousand people on the line, I’d probably react differently.
No, I was pointing out that in all realistic ways of constructing the hypothetical, there are going to be quite major risks and costs to oneself in pushing the fat man
I’m guessing wedrifid isn’t taking that into account because we were explicitly asked not to do that here:
Try not to Kobayashi Maru this question, at least not yet. I know you can criticize the scenario and find it unrealistic.
Thanks for the patient reminder to read the entire original post before jumping into commenting on the comments. I did in fact miss all the caveats about wheelchairs, light rolling, fat man being anaesthetised etc. Doh!
I guess elharo should also have stipulated that no-one has any avenging friends or relatives (or lawyers) in the entire scenario, and that the usual authorities are going to give a free-pass to any law-breaking today. Maybe also that I’ll forget the whole thing in the morning, so there will be no residual guilt, angst etc.
To be honest, making the wheelchair roll gently into the path of the trolley is now looking very analogous to switching a trolley between two tracks: both seem mechanical and impersonal, with little to tell them apart. I find that I have no strong intuitions any more: my remaining moral intuitions are extremely confused. The scenario is so contrived that I’m feeling no sympathy for anyone, and no real Kantian imperatives either. I might as well be asked whether I want to kill a Martian to save five Venusians. Weird.
EDIT: I have now read your replies to other people’s responses. I see you have already acknowledged the point. Consider this response retracted as redundant.
Flip. Push. (All else being unrealistically equal.)
Pushing is going to get you prosecuted for murder in most jurisdictions,
You are fighting the hypothetical. Note that my response refrained from fighting the hypothetical, but it did explicitly acknowledge the completely absurd nature of the assumption that there are no other consequences to consider. That disclaimer should be sufficient here.
Or suppose you construct the hypothetical so there is no legal comeback: still, why do you want to push a fat man off a bridge?
Because I want to save 5 people.
It takes energy, you could pull a muscle, he could notice and hit back or pull you over too etc. etc.
Again, I chose not to fight the hypothetical. As such I refrained from opting out of answering the moral question by mentioning distracting details that are excluded as considerations by any rigorous introduction to the thought experiment.
A complementary explanation to the ones I have already given you is that this post is optimised for persuading people like yourself, not people like me. I prefer a state where posts use styles of reasoning more likely to be considered persuasive by people like myself. As such, I oppose this post.
Well, if you phrase it like that …
Why are you against diversity?! We should have posts for both people-like-you and people-like-me! Stop trying to monopolise LessWrong, people-like-wedrifid!!
EDIT: This has been a joke. We now return you to your regularly scheduled LessWrong.
If exerting influence is indistinguishable from trying to monopolize the community, then I reluctantly endorse trying to monopolize the community.
Sorry, I wasn’t actually being serious. I’ll edit my comment to make that clearer.
I’m guessing wedrifid isn’t taking that into account because we were explicitly asked not to do that here.
OK, my bad.