I like the phrase “precedent utilitarianism”. It sounds to utilitarians like you’re joining their camp, while actually pointing out that you’re taking a long-term view of utility, which they usually refuse to do. The important ingredient is paying attention to incentives, which is really the rational response to most questions about morality. Many choices which seem “fairer”, “more just”, or whose alternatives provoke a disgust response don’t take the long-term view into account. If we go around sacrificing every lonely stranger to the highest benefit of others nearby, no one is safe. It’s a tragedy that all those people are sick and will die if they don’t get help, but we don’t make the world less tragic by sacrificing one to save ten every chance we get.
Actually, we would all be safer, because we’d be in less danger from organ failure. We are each more likely to be one of the “others nearby” than the “lonely stranger”.
That would be true if they were hunting people down. As stated, people would become more resistant to going to hospitals, which would cause problems that way.
This is an exact instance of the point of the post. It is important to assume they are hunting people down, because that’s the LCPW (Least Convenient Possible World), and the fact that this trolley problem involves using someone who shows up at the hospital is an entirely unnecessary contingent fact.
I like the phrase “precedent utilitarianism”. It sounds to utilitarians like you’re joining their camp, while actually pointing out that you’re taking a long-term view of utility, which they usually refuse to do
On what basis would you say it’s the case that utilitarians usually refuse to take a long-term view of utility?
When I’ve argued with people who called themselves utilitarian, they seemed to want to make trade-offs among immediately visible options. I’m not going to try to argue that I have population statistics, or know what the “proper” definition of a utilitarian is. Do you believe that some other terminology or behavior better characterizes those called “utilitarians”?
Well, in my experience people who self-identify as utilitarians don’t appear to be any more shortsighted in terms of real-life moral quandaries than people who don’t so self-identify.
I don’t think it’s the case that utilitarians tend to be shortsighted, just that people in general tend to be; if non-utilitarians tend to choose a less shortsighted action in a constructed moral dilemma, it’s not usually due to consciously taking a long view.
When I was in college, a professional philosopher once visited and gave a seminar, where she raised the traveler-at-a-hospital scenario as an argument against utilitarianism (simply on the basis that killing the traveler defies our moral intuitions). I responded that realistically, given human nature, if doctors tended to do this, then because people aren’t effective risk assessors, people would tend to avoid hospitals for fear of being harvested, to the point that the practice would probably be doing more harm than good. She had never heard or thought of this argument before, and found it a compelling reason not to harvest the traveler from a utilitarian point of view. So as a non-utilitarian, it doesn’t seem that she was any more likely to look at questions of utility from a long view; she was just more willing to let moral intuitions control her decision, which sometimes has the same effect.
And that is an advantage of traditional moral systems—because they have been around for so long, they have had opportunities to be tried and tested in various ways. It won’t give adherents a long-term view, but it can have a similar effect. Think of it as, “I don’t have to think out the consequences of this because other people have thought through similar problems over a thousand years, and came up with a rule that says I should do X.” One would be foolish to totally disregard traditional morality simply because of its occasional clash with the modern world. It would be like disregarding a “traditional” gene made by “stupid blind arbitrary evolution” because we think we have a better one made by a smarter system—it might be a good idea to compare anyway.
I tend to agree, but it depends on how something was tested. In “Darwinian Agriculture”, I argue that testing by ability to persist is weaker than testing by competition against alternatives. Trees compete against each other, but forests don’t. Societies often compete and their moral systems probably affect competitive success, but things are complicated by migration between societies, population growth (moral systems that work for bands of relatives may not work as well for modern nations), technological change (cooking pork), etc.
If we go around sacrificing every lonely stranger to the highest benefit of others nearby, no one is safe.
That would make a great movie!
Lonely Stranger
Jason Statham wakes up and realises all his family and friends have been killed by a tornado while he survives through luck and general masculine superiority. Beset on all sides by scalpel- and tranquiliser-wielding doctors, he must constantly slaughter all the nearby sick people just to keep himself alive. Meanwhile, a sexy young biologist has been captured by a militant sect of religious Fundamentalists. Will Statham be able to break the imprisoned costar out in time to reveal her secret human organ cloning technology, or will civilisation as we know it be destroyed by utilitarianism gone wrong?
What do you think of the definition of “Precedent Utilitarianism” used in the philosophy course module archived at https://links.zephon.org/precedentutilitarianism ?