Thanks.
Is there a name for expected utility maximisation over a consequentialist utility function built from human values? Does “consequentialism” usually imply normal human values, or is it a general term?
See http://en.wikipedia.org/wiki/Consequentialism for your last question (it’s a general term).
The answer to your “Is there a name...” question is “no”—AFAIK.
I get the impression that most people around here approach morality from that perspective; it seems like something that ought to have a name.