>> “Or if “Morality is mere preference!” then why care about human preferences? How is it possible to establish any “ought” at all, in a universe seemingly of mere “is”?”
I don’t think it’s possible, but why is that a problem? Can’t all moral statements be rewritten as conditionals? E.g., “You ought not to murder” → “If you murder someone, we will punish you”.
You might say these conditionals aren’t justified, but what on earth could it mean to say they are or are not justified, other than whether they do or do not eventually fit into a “fixed given” moral scheme? Maybe we do not need to justify our moral preferences in this sense.
>> “Can’t all moral statements be rewritten as conditionals? E.g., ‘You ought not to murder’ → ‘If you murder someone, we will punish you’.”
Not really. Moral statements need to tell you what to do, but the example you gave only helps you make predictions. I know that murdering will result in my punishment, but unless I also know whether being punished is good or bad, this tells me nothing about whether committing murder is good or bad.