A moral system that is based on preferences is not equivalent to those preferences. Specifically, a moral system is what you need when preferences contradict, either with those of other entities (assuming you want your moral system to be societal) or with each other. From my point of view, a moral system should not change from moment to moment, though preferences may and often do. As an example: the rule “Do not murder” is an attempt either to resolve a conflict between societal preferences and individual desires, or to impose more reflective decision-making on the kind of decisions you might otherwise make in the heat of the moment (or both). Assuming my desire to live by a moral code is strong, then having a code that prohibits murder will stop me from murdering people in a rage, even though my preference at that moment is to do so, because my preference over the long term is not to.
Another purpose of a moral system is to off-load thinking to clear moments. You can, reflectively and with foresight, establish general moral precepts that lead to better outcomes than the decisions you could make on a case-by-case basis at anything approaching the necessary speed.
It’s late at night and I’m not sure how clear this is.
First of all, if you desire to follow a moral code which prohibits murder more than you desire to murder, then on net you do not want to murder, any more than you want to buy a candy bar for $1 when you want the $1 more than you want the candy bar.
Now, consider the class of rules that require maximizing a weighted average or sum of everyone’s preferences. Within that class, ‘do not murder’ is a valid rule, considering that people wish both to avoid being murdered and to live in a world which is in general free of murder. ‘Do not seize kidneys’ is marginally valid. The choice ‘I choose not to donate my kidney’ is valid only if one’s own preference is weighted more highly than the preference of a stranger. The choice ‘I will try to find the person who dropped this, even though I would rather keep it’ is moral only if the preferences of a stranger are weighted equally to or more highly than one’s own.
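To make the weighting point concrete, here is a minimal sketch of that class of rules. All of the names, utilities, and weights are illustrative assumptions invented for this example, not anything from the discussion above: the idea is just that the same two actions flip in rank depending on how heavily a stranger's preference is weighted against one's own.

```python
# Hypothetical sketch: score each action by a weighted sum of preferences.
# Utilities and weights are made-up numbers chosen only to illustrate the flip.

def score(action_utilities, weights):
    """Weighted sum of each person's utility for an action."""
    return sum(w * u for w, u in zip(weights, action_utilities))

# The 'lost item' case, as (my utility, stranger's utility):
keep_it   = (2, -3)   # I gain a little; the stranger loses more
return_it = (-1, 3)   # searching costs me something; the stranger gains

equal_weights = (1, 1)  # stranger weighted equally to me
self_favoring = (3, 1)  # my preference weighted 3x a stranger's

# Equal weights: returning the item scores higher.
print(score(keep_it, equal_weights), score(return_it, equal_weights))    # -1 2
# Self-favoring weights: keeping it scores higher.
print(score(keep_it, self_favoring), score(return_it, self_favoring))    # 3 0
```

Under equal weights the rule recommends returning the item; once one's own preference is weighted highly enough, the same rule endorses keeping it, which is exactly the dependence on weighting described above.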