One implication of adopting a utilitarian framework as a normative standard in the psychological study of morality is the inevitable conclusion that the vast majority of people are often morally wrong. For instance, when presented with Thomson’s footbridge dilemma, as many as 90% of people reject the utilitarian response (Mikhail, 2007). Many philosophers have also rejected utilitarianism, arguing that it is inadequate in important, morally meaningful ways, and that it presents an especially impoverished view of humans as “locations of utilities [and nothing more]…” and that “persons do not count as individuals… any more than individual petrol tanks do in the analysis of the national consumption of petroleum” (Sen & Williams, 1982, p. 4). For those who endorse utilitarianism, the ubiquitous discomfort toward its conclusions points to the pessimistic possibility that human moral judgment is even more prone to error than many other forms of judgment, and that attempting to improve the quality of moral judgment will be a steep uphill battle.
Before drawing those conclusions, it might prove useful to investigate individuals who are more likely to endorse utilitarian solutions and perhaps use them as a psychological prototype of the “optimal” moral judge. What do those 10% of people who are comfortable with the utilitarian solution to the footbridge dilemma look like? Might these utilitarians have other psychological characteristics in common? Recently, consistent with the view that rational individuals are more likely to endorse utilitarianism (e.g., Greene et al., 2001), a variety of researchers have shown that individuals with higher working memory capacity and those who are more deliberative thinkers are, indeed, more likely to approve of utilitarian solutions (Bartels, 2008; Feltz & Cokely, 2008; Moore, Clark, & Kane, 2008). In fact, one well-defined group of utilitarians likely shares these characteristics as well—the subset of philosophers and behavioral scientists who have concluded that utilitarianism is the proper normative ethical theory.
This seems a reasonable cause for further investigation. And it leads me to wonder: what is more likely, that 90% of people are wrong about the interpretation of their own morality in a very “don’t believe your lying eyes” way, or that 10% of people actually have genuinely different moral intuitions on a particular set of issues? What if philosophers, cognitive scientists and psychopaths just have values that, on reflection, drift in different ways than those of other groups, or of each other (just because they agree on some utilitarian actions doesn’t mean their systematized ethical frameworks are similar on other dimensions)? Of course, as Vladimir_M points out, how one chooses to signal about moral issues and how one actually responds are two different things.
But could it be that society, rather than experiencing something fitting our accepted grand tale of moral progress (hastened by enlightened elites throughout history), is instead just rationalizing moral change that reflects raw demographic shifts, economic conditions and the fickle fashions of those in positions of power and/or authority on matters of moral arbitration? Ah, but that robs me of a comforting future with values that are just my own values extrapolated and “fixed”; best not to think about this too much, then.
Moral Intuitions: Are Philosophers Experts?
Recently psychologists and experimental philosophers have reported findings showing that in some cases ordinary people’s moral intuitions are affected by factors of dubious relevance to the truth of the content of the intuition. Some defend the use of intuition as evidence in ethics by arguing that philosophers are the experts in this area, and philosophers’ moral intuitions are both different from those of ordinary people and more reliable. We conducted two experiments indicating that philosophers and non-philosophers do indeed sometimes have different moral intuitions, but challenging the notion that philosophers have better or more reliable intuitions.
Moral reasoning can have a specific psychometric meaning that is inconsistent with lay interpretations of moral reasoning.
“Professor Simon Baron-Cohen suggests that, unlike the combination of both reduced cognitive and affective empathy often seen in those with classic autism, psychopaths are associated with intact cognitive empathy, implying non-diminished awareness of another’s feelings when they hurt someone.[57]
Moral judgment
Psychopaths have been considered notoriously amoral – an absence of, indifference towards, or disregard for moral beliefs. There are few firm data on patterns of moral judgment, however. Studies of developmental level (sophistication) of moral reasoning found all possible results – lower, higher or the same as non-psychopaths. Studies that compared judgments of personal moral transgressions versus judgments of breaking conventional rules or laws, found that psychopaths rated them as equally severe, whereas non-psychopaths rated the rule-breaking as less severe.[58]
A study comparing judgments of whether personal or impersonal harm would be endorsed in order to achieve the rationally maximum (utilitarian) amount of welfare, found no significant differences between psychopaths and non-psychopaths. However, a further study using the same tests found that prisoners scoring high on the PCL were more likely to endorse impersonal harm or rule violations than non-psychopaths were. Psychopaths who scored low in anxiety were also more willing to endorse personal harm on average.[58]
Assessing accidents, where one person harmed another unintentionally, psychopaths judged such actions to be more morally permissible. This result is perhaps a reflection of psychopaths’ failure to appreciate the emotional aspect of the victim’s harmful experience, and furnishes direct evidence of abnormal moral judgment in psychopathy.[59]”—Wikipedia