One solution is to say our intuitions are wrong, but that isn’t quite right (a morality can’t be “wrong”) unless our intuitions are internally inconsistent, which I do not think is the problem.
I think that is the fundamental problem with non-utilitarianism. Take the trolley problem, for instance. Our intuitions are that the death of one person is preferable to the deaths of five, but our intuitions also say we shouldn’t deliberately kill someone. Our intuitions about morality conflict all the time, so we have to decide which intuition is more important.