Hmm, I was talking about values. I made a type error when I said “reason to prefer”: “reason” was equivocating between cause and justification. Let me try to clean up what I meant to ask.
Here goes: Are there morally justified terminal (not instrumental) values that aren’t causally rooted in the evolutionary history of our value instincts? Does morality ultimately serve value axioms that are arbitrary and out of scope for moral analysis?
Non-example: “happiness” is a reinforcement signal in our wetware. A lot has been said about the ethics of happiness, but in the end those accounts describe something that might not even exist in a creature with a different evolutionary path.
Hmm. This word-twisting smells of a black box. What am I not opening?