Utilitarianism is certainly correct. You can observe this by watching people make decisions under uncertainty: consistent choices between gambles pin down cardinal utilities, not just a ranking, so preferences aren’t merely ordinal.
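To make the ordinal/cardinal point concrete: under the von Neumann–Morgenstern axioms, the probability at which you’re indifferent between a middling outcome for sure and a best/worst lottery *is* that outcome’s utility on a 0–1 scale. A rough sketch of eliciting it (my construction, with a made-up agent, not anything from the thread):

```python
# Binary-search the indifference probability for a middle outcome B.
# Under the vNM axioms, that probability is u(B) on a scale where
# u(worst) = 0 and u(best) = 1.

def elicit_utility(prefers_sure_thing, tol=1e-6):
    """prefers_sure_thing(p) -> True if the agent takes the sure middle
    outcome over a lottery giving best with prob p, worst with prob 1-p."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        p = (lo + hi) / 2
        if prefers_sure_thing(p):
            lo = p   # lottery still too weak: u(B) is above p
        else:
            hi = p   # lottery now preferred: u(B) is below p
    return (lo + hi) / 2

# Hypothetical agent whose true cardinal utility for B is 0.7.
agent = lambda p: p < 0.7
print(round(elicit_utility(agent), 3))  # -> 0.7
```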
But yes, doing the math has its own utility cost, so many decisions are better handled with cheap approximations. This is how you get things like the Allais paradox: heuristics that usually serve well, like overweighting certainty, produce choice patterns no consistent expected-utility maximizer would make.
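The paradox in code form, if that helps. With the classic Allais gambles, the popular choice pattern (the sure $1M in round one, the 10%-of-$5M gamble in round two) can’t be reproduced by any expected-utility maximizer, whatever cardinal utilities it assigns. A quick brute-force check (my sketch, not from the thread):

```python
# Probabilities are in percent so everything stays in exact integer math.

def eu(lottery, u):
    """100x the expected utility of [(percent, payoff), ...] under u."""
    return sum(p * u[x] for p, x in lottery)

g1A = [(100, "1M")]                        # round 1: a sure $1M ...
g1B = [(89, "1M"), (10, "5M"), (1, "0")]   # ... vs a gamble with a 1% miss
g2A = [(11, "1M"), (89, "0")]              # round 2: 11% shot at $1M ...
g2B = [(10, "5M"), (90, "0")]              # ... vs a 10% shot at $5M

# Utilities normalized so u($0) = 0 (harmless: vNM utilities are only
# defined up to an affine transformation). Scan a grid of candidates for
# one that prefers 1A over 1B *and* 2B over 2A, like most people do.
found = any(
    eu(g1A, {"0": 0, "1M": u1, "5M": u5}) > eu(g1B, {"0": 0, "1M": u1, "5M": u5})
    and eu(g2B, {"0": 0, "1M": u1, "5M": u5}) > eu(g2A, {"0": 0, "1M": u1, "5M": u5})
    for u1 in range(1, 200)
    for u5 in range(u1 + 1, 201)
)
print(found)  # -> False: no utility assignment matches the popular choices
```

The grid is redundant, of course: both preferences reduce to 0.11·u($1M) versus 0.10·u($5M) with opposite signs, so the contradiction is algebraic, not numerical.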
I’m not sure what “moral” means here. The goal of a gene is to copy itself. Ethics isn’t about altruism.