4.5.2: Doesn’t that screw up the whole concept of moral responsibility?
Honestly? Well, yeah. Moral responsibility doesn’t exist as a physical object. Moral responsibility—the idea that choosing evil causes you to deserve pain—is fundamentally a human idea that we’ve all adopted for convenience’s sake. (23)
The truth is, there is absolutely nothing you can do that will make you deserve pain. Saddam Hussein doesn’t deserve so much as a stubbed toe. Pain is never a good thing, no matter who it happens to, even Adolf Hitler. Pain is bad; if it’s ultimately meaningful, it’s almost certainly as a negative goal. Nothing any human being can do will flip that sign from negative to positive.
So why do we throw people in jail? To discourage crime. Choosing evil doesn’t make a person deserve anything wrong, but it makes ver targetable, so that if something bad has to happen to someone, it may as well happen to ver. Adolf Hitler, for example, is so targetable that we could shoot him on the off-chance that it would save someone a stubbed toe. There’s never a point where we can morally take pleasure in someone else’s pain. But human society doesn’t require hatred to function—just law.
Besides which, my mind feels a lot cleaner now that I’ve totally renounced all hatred.
It’s pretty hard to argue about this if our moral intuitions disagree. But you should at least know that most people on LW disagree with you on this intuition.
EDIT: As ArisKatsaris points out, I don’t actually have any source for the “most people on LW disagree with you” bit. I’ve always thought that not wanting harm to come to anyone except as an instrumental value was a pretty obvious, standard part of utilitarianism, and 62% of LWers are consequentialist, according to the 2012 survey. The post “Policy Debates Should Not Appear One-Sided” is fairly highly regarded, and it espouses a related view: that people don’t deserve harm for their stupidity.
Also, what those people would prefer isn’t necessarily what our moral system should prefer; humans are petty and short-sighted.
> I’ve always thought that not wanting harm to come to anyone except as an instrumental value was a pretty obvious, standard part of utilitarianism, and 62% of LWers are consequentialist, according to the 2012 survey.
What do you mean by “utilitarianism”? The word has two different common meanings around here: any type of consequentialism, and the specific type of consequentialism that uses “total happiness” as a utility function. This sentence appears to be designed to confuse the two meanings.
> The post “Policy Debates Should Not Appear One-Sided” is fairly highly regarded, and it espouses a related view: that people don’t deserve harm for their stupidity.
That is most definitely not the main point of that post.
> What do you mean by “utilitarianism”? The word has two different common meanings around here: any type of consequentialism, and the specific type of consequentialism that uses “total happiness” as a utility function. This sentence appears to be designed to confuse the two meanings.
Yeah, my mistake. I’d never run across any other versions of consequentialism apart from utilitarianism (except for Clippy, of course). I suppose caring only for yourself might count? But do you seriously think that the majority of those consequentialists aren’t utilitarian?
Well, even Eliezer’s version of consequentialism isn’t simple utilitarianism, for starters.
It’s a kind of utilitarianism. I’m including act utilitarianism and desire utilitarianism and preference utilitarianism and whatever in utilitarianism.
Ok, what is your definition of “utilitarianism”?
[citation needed]
I edited my comment to include a tiny bit more evidence.
Thank you, that’s a good start.
Yes, I had concluded that EY was anti-retribution. I hadn’t concluded that he had carried the day on that point.
> Moral responsibility—the idea that choosing evil causes you to deserve pain—is fundamentally a human idea that we’ve all adopted for convenience’s sake.
I don’t think vengeance and retribution are “ideas” that people had to come up with—they’re central moral motivations. “A social preference for which we punish violators” gets at 80% of what morality is about.
Some may disagree about the intuition, but I’d note that even EY had to “renounce” all hatred, which implies to me that he had the impulse for hatred (retribution, in this context) in the first place.
This seems like it has the makings of an interesting poll question.
> This seems like it has the makings of an interesting poll question.
I agree. Let’s do that. You’re a consequentialist, right?
I’d phrase my opinion as “I have terminal value for people not suffering, including people who have done something wrong. I acknowledge that sometimes causing suffering might have instrumental value, such as imprisonment for crimes.”
How do you phrase yours? If I were to guess, it would be “I have a terminal value which says that people who have caused suffering should suffer themselves.”
Shall I make a Discussion post about this after I get your refinement of the question?
I’d suggest the following two phrasings:
I place terminal value on retribution (inflicting suffering on the causers of suffering), at least for some of the most egregious cases.
I do not place terminal value on retribution, not even for the most egregious cases (e.g. mass murderers). I acknowledge that sometimes it may have instrumental value.
Perhaps also add a third choice:
I think I place terminal value on retribution, but I would prefer it if I could self-modify so that I wouldn’t.
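To make the terminal-versus-instrumental distinction in the options above concrete, here is a minimal toy sketch in Python. Everything in it (the names, the numbers, the 0.5 weight) is invented purely for illustration; treat it as a cartoon of the two positions, not anyone’s actual moral theory:

```python
# Toy utility functions for the two poll positions.
# Every name and number here is invented purely for illustration.

def u_no_retribution(suffering):
    """Position 2: suffering is terminally bad, whoever's it is."""
    return -sum(suffering.values())

def u_retribution(suffering, offenders, weight=0.5):
    """Position 1: same, plus positive terminal value on offenders' suffering."""
    bonus = weight * sum(suffering[p] for p in offenders)
    return -sum(suffering.values()) + bonus

# Case 1: punishment deters future harm, lowering total suffering.
# Both positions endorse it; for position 2 its value is purely instrumental.
punish = {"victims": 1.0, "offender": 4.0}
ignore = {"victims": 6.0, "offender": 0.0}
assert u_no_retribution(punish) > u_no_retribution(ignore)  # -5 > -6

# Case 2: a fixed harm must land on someone. Position 2 is indifferent
# between offender and bystander; position 1 prefers the offender bear it.
on_offender = {"bystander": 0.0, "offender": 3.0}
on_bystander = {"bystander": 3.0, "offender": 0.0}
assert u_no_retribution(on_offender) == u_no_retribution(on_bystander)  # -3 == -3
assert u_retribution(on_offender, {"offender"}) > u_retribution(on_bystander, {"offender"})  # -1.5 > -3
```

Case 2 is where the two positions come apart: the purely instrumental view is indifferent about who bears an unavoidable harm, which is roughly the “targetable” idea from the quoted FAQ.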
Yeah, I’m pretty sure I (and most LWers) don’t agree with you on that one, at least in the way you phrased it.
You think they’d prefer that the guy that caused everyone else in the universe to suffer didn’t suffer himself?
I would, all else being equal. Suffering is bad.
Here’s an old Eliezer quote on this: