Moral relativism all the way. I mean something by morality, but it might not be exactly the same as what you mean.
Of course, moral relativism doesn’t single out anything you shouldn’t do (such as changing other people), contrary to how the term is occasionally used; it just means that whatever you do, you do for your own reasons.
Nor does it mean that humans can’t share nearly all of their goal-finding algorithms, thanks to a common heritage. That shared heritage could make humans capable of remarkable agreement about morality. But to call that agreement an objective morality would be stretching it.