I mean that my end goals point towards a vague prospering of human-like minds, with a special preference for people close to me. It aligns with morality often, but not always.
What remains? I think this is basically what I usually think of as my own MoralityMorpheus (it just happens to contain a term for everyone else’s). Are you sacrificing what other people think ‘the right thing to do’ is? What a future you would think the right thing to do would have been? What a future uplifted Koi would think the right thing to do would have been?