I think an ethical theory that seems to require all agents to reach the same conclusion on what the optimal outcome would be is doomed to failure.
What do you mean by an “ethical theory” here? Do you mean something purely descriptive, that tries to account for that side of human behaviour that is to do with ethics? Or something normative, that sets out what a person should do?
Since it’s clear that people express different ideas about ethics from each other, a descriptive theory that said otherwise would be false as a matter of fact. Normative theories, however, are generally applicable to everyone, if only because they don’t name the specific individuals they are about.
Utilitarianism is a normative proposal, not a descriptive theory.
I mean a normative theory (or proposal, if you prefer). Utilitarianism clearly fails as a descriptive theory (and I don’t think its proponents would generally disagree on that).
A normative theory that proposes that everything would be fine if we could all just agree on the optimal outcome isn’t going to be much help in resolving the actual ethical problems facing humanity. It may be true that if we were all perfect altruists the system would be self-consistent, but we aren’t, I don’t see any realistic way of getting there from here, and I wouldn’t want to anyway (since it would conflict with my actual values).
A useful normative ethics has to work in a world where agents have differing (and sometimes conflicting) ideas of what the optimal outcome is. It has to help us cooperate to our mutual advantage despite imperfectly aligned goals, rather than try to fix the problem by forcing the goals into alignment.
Utilitarianism is a theory of what you should do. It presupposes nothing about what anyone else’s ethical drivers are. If cooperating with someone with different ethical goals furthers total utility from your perspective, utilitarianism commends it.