Yes.
Yes!
No. See other replies.
I understand and agree with your point that the long list of terminal values that most humans share isn’t the ‘right’ one simply because those are the values humans happen to have. If Omega altered the brain of every human so that we had completely different values, ‘morality’ wouldn’t change.
Therefore, to be perfectly precise, byrnema would have to edit her comment to substitute the long list of values that humans happen to share for the word ‘human’, and the long list of values that Babyeaters happen to share for the word ‘babyeating’.
So yeah, I get why someone who doesn’t want to create this kind of confusion in his interlocutors would avoid saying “human-right” and “human-moral”. The problem is that you’re creating another kind of confusion.
Is this because ‘morality’ is reserved for a particular list (the list we currently have) rather than being a token for any list that could be had?
It’s because [long list of terminal values that current humans happen to share]-morality is defined by the long list of terminal values that current humans happen to share. It’s not defined by the list of terminal values that post-Omega humans would happen to have.
Is arithmetic “reserved for” a particular list of axioms, or is it a token for any list of axioms? Neither: arithmetic is its axioms and everything that can be computed from them.
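To make the analogy concrete, here is a minimal sketch in Lean 4 (a toy `MyNat` invented purely for illustration, not anyone’s canonical formalization): the “list of axioms” is just the constructors plus the recursive definition of addition, and a fact like 2 + 2 = 4 isn’t a further stipulation; it’s computed from those definitions.

```lean
-- Toy natural numbers: the "axioms" are the two constructors.
inductive MyNat where
  | zero : MyNat
  | succ : MyNat → MyNat

open MyNat

-- Addition defined by recursion on the second argument.
def add : MyNat → MyNat → MyNat
  | m, zero   => m
  | m, succ n => succ (add m n)

-- Convenient names for the small numerals used below.
def two  : MyNat := succ (succ zero)
def four : MyNat := succ (succ (succ (succ zero)))

-- "2 + 2 = 4" is not an extra axiom: `rfl` succeeds because
-- both sides compute to the same term from the definitions above.
example : add two two = four := rfl
```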
See, I think you’re misunderstanding his response. I mean, that’s the only way I can interpret it that makes sense.
Your insistence that it’s not the right interpretation is very odd. I get that you don’t want to trigger people’s cooperation instincts, but that’s the only framework in which talking about other beings makes sense.
The morality you’re talking about is the human-now-extended morality (well, closer to the less-wrong-now-extended morality), in that it’s the morality that results from extrapolating the values humans currently have. Now, you seem to need to categorize your own morality as different from others’ in order to feel right about imposing it, so you categorize it as simply ‘morality’. But your morality is not necessarily my morality, and that categorization feels iffy to me. It’s certainly closer to mine than to the Babyeaters’, but I have no proof it’s the same. Calling it simply ‘Morality’ papers over this.
You’re wrong. Despite how much I’d like to have a universal, ultimate, true morality, you can’t create it out of whole cloth by defining it as “what-humans-value”. That’s pretending there’s no reason to look up because “Look! It’s right there in front of you. So be sure not to look up.”