I agree with the first sentence, but not the second. (Though I am not sure what it is supposed to imply. I can imagine being convinced that shrimp are more important, but the bar for evidence is pretty high.)
I mean that my end goals point towards a vague prospering of human-like minds, with a special preference for people close to me. This aligns with morality often, but not always. If morality required me to sacrifice things I actually care about for carp, I would discard morality without hesitation.
What remains? I think this is basically what I usually think of as my own morality_Morpheus (it just happens to contain a term for everyone else's). Are you sacrificing what other people think 'the right thing to do' is? What future-you thinks the right thing to do would have been? What future uplifted koi think the right thing to do would have been?