The assumption that morality boils down to utility is a rather huge assumption :-)
would you agree that, conditional on having a good definition of “action”, we can evaluate “actions” morally?
Conditional on having a good definition of “action” and on having a good definition of “morally”.
you can generalize to uncertain situations simply by applying probability theory
I don’t think so, at least not “simply”. An omniscient being has no risk and no risk aversion, for example.
isn’t of much use to most people
Morality is supposed to be useful for practical purposes. Heated discussions over how many angels can dance on the head of a pin got a pretty bad rap over the last few centuries… :-)
The assumption that morality boils down to utility is a rather huge assumption :-)
It’s not an assumption; it’s a normative statement I choose to endorse. If you have some other system, feel free to endorse that… but then we’ll be discussing morality, and not meta-morality or whatever system originally produced your objection to Jiro’s distinction between good and bad.
on having a good definition of “morally”
Agree.
An omniscient being has no risk and no risk aversion, for example.
Well, it could have risk aversion. It’s just that risk aversion never comes into play during its decision-making process due to its omniscience. Strip away that omniscience, and risk aversion very well might rear its head.
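To make that risk-aversion point concrete, here is a minimal sketch in Python with made-up payoffs (the numbers and utility functions are illustrative assumptions, not anything from the thread): a risk-neutral agent (linear utility) and a risk-averse agent (concave utility, here a log) rank actions identically when every outcome is known for certain, but can diverge once genuine uncertainty enters.

```python
import math

def risk_neutral_utility(payoff):
    # Linear utility: only the expected payoff matters.
    return payoff

def risk_averse_utility(payoff):
    # Concave utility: variance in payoffs is penalized.
    return math.log(1 + payoff)

def expected_utility(lottery, utility):
    # lottery: list of (probability, payoff) pairs.
    return sum(p * utility(x) for p, x in lottery)

# Omniscient case: every "lottery" is degenerate (one outcome, probability 1),
# so both agents rank the two actions the same way.
known_a = [(1.0, 10.0)]
known_b = [(1.0, 9.0)]
for u in (risk_neutral_utility, risk_averse_utility):
    assert expected_utility(known_a, u) > expected_utility(known_b, u)

# Uncertain case: same expected payoff (10), but one option has variance.
safe  = [(1.0, 10.0)]
risky = [(0.5, 0.0), (0.5, 20.0)]
print(expected_utility(safe,  risk_neutral_utility))  # 10.0
print(expected_utility(risky, risk_neutral_utility))  # 10.0 -> indifferent
print(expected_utility(safe,  risk_averse_utility))   # ~2.40
print(expected_utility(risky, risk_averse_utility))   # ~1.52 -> prefers safe
```

With certainty, the concave utility never changes any ranking; strip the certainty away and it starts doing work, which is all the "rear its head" remark needs.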
Morality is supposed to be useful for practical purposes. Heated discussions over how many angels can dance on the head of a pin got a pretty bad rap over the last few centuries… :-)
I disagree. Take the following two statements:
Morality, properly formalized, would be useful for practical purposes.
Morality is not currently properly formalized.
There is no contradiction in these two statements.
But they have a consequence: Morality is not currently useful for practical purposes.
That’s… an interesting position. Are you willing to live with it? X-)
You can, of course, define morality in this particular way, but why would you do that?