You’ve officially given me the best example of the inherent flaw in the utilitarian model of morality. Normally, I use the example of a man who is the sole provider of an arbitrarily large family murdering an old homeless man. Utilitarianism says he should go free. The murderer’s family, of size X, will all experience disutility from his imprisonment. Call that Y. The homeless man, literally no one will miss. No family members to gain utility from exacting justice. Therefore, since X*Y > 0, the murderer should go back to providing for his family. I do not believe any rational person would consider that just, moral, or even reasonable.
I’m all for rational evaluations of problems, but rationality does not apply to moral arguments. Morality is an emotional response by its very nature. Rational arguments are fine when we’re comparing large numbers of people. A plan that will save 400 lives vs. a plan that has a 90% chance to save 500 lives. That’s not morality, that’s rationality. It doesn’t truly become about morality until it’s personal. If you could save the lives of 3 people you’ve never met, would you let yourself be tortured? Would you torture someone? Regardless of your answer, it is easier said than done...
P.S. I’m not a psychologist, but I imagine if you had different answers to torturing vs. being tortured, that says something about you. Not sure what…
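For what it’s worth, the 400-vs-500 comparison above really is just arithmetic: under a plain expected-value accounting (an assumption of the sketch below, not anything the comment commits to), the risky plan comes out ahead.

```python
# Minimal sketch of the expected-value comparison above (hypothetical units:
# lives saved, weighted by probability). The claim in the comment is that
# this kind of calculation is rationality, not yet morality.

certain_plan = 400          # saves 400 lives for certain
risky_plan = 0.90 * 500     # 90% chance of saving 500 lives -> 450 expected

better = "risky" if risky_plan > certain_plan else "certain"
print(certain_plan, risky_plan, better)   # 400 450.0 risky
```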
Normally, I use the example of a man who is the sole provider of an arbitrarily large family murdering an old homeless man. Utilitarianism says he should go free. The murderer’s family, of size X, will all experience disutility from his imprisonment. Call that Y. The homeless man, literally no one will miss. No family members to gain utility from exacting justice. Therefore, since X*Y > 0, the murderer should go back to providing for his family. I do not believe any rational person would consider that just, moral, or even reasonable.
Err...effectively legalizing murder of large classes of the population would tend to increase the murder rate, costing far more lives in aggregate, setting aside the dire consequences on social order and cooperation. You should use an example where the repellent recommendation actually increases rather than decreases happiness/welfare.
Well, I could qualify my example, saying surveillance ensures only people who provide zero utility are allowed to be murdered, but as I said, the article makes my point much better, even if it doesn’t mean to. A single speck of dust, even an annoying and slightly painful one, in the eyes of X people NEVER adds up to 50 years of torture for an individual. It doesn’t matter how large you make X, 7 billion, a googolplex, or 13^^^^^^^^41. It’s irrelevant.
Imagine that you find yourself visiting a hypothetical culture that acknowledges two importantly distinct classes of people: masters and slaves. By cultural convention, slaves are understood to have effectively no moral weight; causing their suffering, death, injury etc. is simply a property crime, analogous to vandalism. Slaves and masters are distinguished solely by a visible hereditable trait that you don’t consider in any way relevant to their moral weight as people.
Shortly after your arrival, a thousand slaves are rounded up and killed. You, as a properly emotional moral thinker, presumably express your dismay at this, and the natives explain that you needn’t worry; it was just a market correction and the economics of the situation are such that the masters are better off now. You explain in turn that your dismay is not economic in nature; it’s because those slaves have moral weight.
They look at you, puzzled.
How might you go about explaining to them that they’re wrong, and slaves really do have moral weight?
Some time later, you return home, and find yourself entertaining a visitor from another realm who is horrified by the discovery that a million old automobiles have recently been destroyed. You explain that it’s OK, the materials are being recycled to make better products, and he explains in turn that his dismay is because automobiles have moral weight.
How might you go about explaining to him that he’s wrong, and cars really don’t have moral weight?
“You might have been a slave” is imaginable in a way that “you might have been an automobile” is not. See Rawls and Kant.

Yup. But would they argue as Jagan did that “rationality does not apply to moral arguments. Morality is an emotional response by its very nature”? I’m specifically interested in Jagan’s answers to my questions, given that assertion.
I could qualify my example, saying surveillance ensures only people who provide zero utility are allowed to be murdered,
If some people’s lives are worth zero utility, then by definition they are worthless. That’s what “zero utility” means. Did you mean something else? Because it seems to me that nobody is worthless to me in real life, and that’s why your example doesn’t work.
A single speck of dust, even an annoying and slightly painful one, in the eyes of X people NEVER adds up to 50 years of torture for an individual. It doesn’t matter how large you make X, 7 billion, a googolplex, or 13^^^^^^^^41. It’s irrelevant.
And you judge it irrelevant based on what? Scope insensitivity is a known bias in humans, so “instinct” is reliably going to go wrong in this case without mindhacking. Two murders are worse than one murder, two groups of people with dust specks in their eyes are worse than one such group; at what point does this stop being true?
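One way to see where the two sides part ways is to write the aggregation out. A minimal sketch, assuming additive utilities and made-up magnitudes (the numbers below are placeholders, not anything from the thread):

```python
# Additive aggregation of dust-speck disutility vs. 50 years of torture,
# with hypothetical unit choices. Under summation, a large enough N always
# overtakes the torture; the comment being replied to denies that any N should.

dust_speck = 1e-9        # per-person disutility of one speck (arbitrary units)
torture_50_years = 1e6   # disutility of 50 years of torture (same units)

for n in (7 * 10**9, 10**100):   # 7 billion people, then a googol of people
    total = n * dust_speck
    print(n, total, total > torture_50_years)
```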
The murderer’s family, of size X, will all experience disutility from his imprisonment. Call that Y. The homeless man, literally no one will miss. No family members to gain utility from exacting justice. Therefore, since X*Y > 0, the murderer should go back to providing for his family.
You’re overlooking the disutility to the murdered man. Actually, what you describe is Prudent Predation, a famous objection to egoism, not utilitarianism.

I think you forgot to finish this:

Excellent point about the murdered man, though.
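To make the overlooked term concrete: once the murdered man’s own loss enters the sum, X*Y > 0 is no longer the relevant comparison; X*Y would also have to exceed that loss. A minimal sketch with hypothetical numbers (nothing here comes from the thread):

```python
# The original comparison only checks that the family's disutility is positive.
# A utilitarian accounting also has to include the victim's own disutility.
# All values are hypothetical placeholders.

X = 12                 # size of the murderer's family
Y = 10.0               # disutility per family member from his imprisonment
victim_loss = 1000.0   # disutility of the homeless man's death itself

family_side = X * Y
print(family_side > 0)             # True -- the comparison the example relies on
print(family_side > victim_loss)   # False here -- imprisonment is the lesser harm
```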
I’m all for rational evaluations of problems, but rationality does not apply to moral arguments. Morality is an emotional response by its very nature. Rational arguments are fine when we’re comparing large numbers of people.
I don’t understand this. Sure, small amounts often have more emotional force (“near mode”) than large ones (“far mode”). But that doesn’t make it right to let your bias hurt people. OTOH, you said “It doesn’t truly become about morality until it’s personal”, so maybe you mean something unusual when you say “morality”.
I’m not a psychologist, but I imagine if you had different answers to torturing vs. being tortured, that says something about you. Not sure what...
Humans are often unable to conform perfectly to their desires, even when they know what the best choice is. It’s known as “akrasia”. For example, addicts often want to stop taking the drugs. If you couldn’t bring yourself to make that sacrifice, that doesn’t mean you shouldn’t, or that you believe you shouldn’t. (Not saying you think it does, just noting for the record.)