I thought there was no way I could ever understand what Eliezer had written, but you’ve provided a clue. Should I translate this:
Morality is about how to save babies, not eat them, everyone knows that and they happen to be right. If we could get past difficulties of the translation, the babyeaters would agree with us about what is moral, we would agree with them about what is babyeating, and we would agree about the physical fact that we find different sorts of logical facts to be compelling.
as this?
Human-morality is about how to save babies, not eat them, everyone knows that and they happen to be right. If we could get past difficulties of the translation, the babyeaters would agree with us about what is human-moral, we would agree with them about what is babyeating-moral, and we would agree about the physical fact that we find different sorts of logical facts to be compelling.
Also, what was especially perplexing, translate:
“What should be done with the universe” invokes a criterion of preference, “should”, which compels humans but not Babyeaters. If you look at the fact that the Babyeaters are out trying to make a different sort of universe [...] They do the babyeating thing, we do the right thing;
as:
“What should be done with the universe” invokes a criterion of preference, “human-should”, which compels humans but not Babyeaters. If you look at the fact that the Babyeaters are out trying to make a different sort of universe [...] They do the babyeating-right thing, we do the human-right thing; ?
Yes.
Yes!
No. See other replies.
I understand and agree with your point that the long list of terminal values that most humans share aren’t the ‘right’ ones simply because they’re the values that humans happen to have. If Omega altered the brain of every human so that we had completely different values, ‘morality’ wouldn’t change.
Therefore, to be perfectly precise, byrnema would have to edit her comment to substitute the long list of values that humans happen to share for the word ‘human’, and the long list of values that Babyeaters happen to share for the word ‘babyeating’.
So yeah, I get why someone who doesn’t want to create this kind of confusion in his interlocutors would avoid saying “human-right” and “human-moral”. The problem is that you’re creating another kind of confusion.
Is this because morality is reserved for a particular list (the list we currently have) rather than a token for any list that could be had?
It’s because [long list of terminal values that current humans happen to share]-morality is defined by the long list of terminal values that current humans happen to share. It’s not defined by the list of terminal values that post-Omega humans would happen to have.
Is arithmetic “reserved for” a particular list of axioms, or a token for any list of axioms? Neither. Arithmetic is its axioms and all that can be computed from them.
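To make the analogy concrete, here is a minimal sketch, assuming the standard Peano-style recursion axioms for addition as the example (the particular axioms are my illustration, not anything quoted above):

```latex
% "Arithmetic" names a particular package of axioms together with
% everything computable from them -- not a placeholder for any axiom list.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
The defining axioms for addition:
\begin{align}
  n + 0    &= n \\
  n + S(m) &= S(n + m)
\end{align}
From these alone, $2 + 2 = 4$ is forced:
\begin{align}
  S(S(0)) + S(S(0)) &= S\bigl(S(S(0)) + S(0)\bigr) \\
                    &= S\Bigl(S\bigl(S(S(0)) + 0\bigr)\Bigr) \\
                    &= S\bigl(S(S(S(0)))\bigr).
\end{align}
Swap in different axioms and you get a different theory;
you have not changed what ``arithmetic'' refers to.
\end{document}
```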
See, I think you’re misunderstanding his response. I mean, that is the only way I can interpret it to make sense.
Your insistence that it is not the right interpretation is very odd. I get that you don’t want to trigger people’s cooperation instincts, but that’s the only framework in which talking about other beings makes sense.
The morality you are talking about is the human-now-extended morality (well, closer to the less-wrong-now-extended morality), in that it is the morality that results from extending the values humans currently have. Now you seem to need a categorization that marks your own morality as different from others’ in order to feel right about imposing it? So you categorize it as simply morality, but your morality is not necessarily my morality, and so that categorization feels iffy to me. Now it’s certainly closer to mine than to the Babyeaters’, but I have no proof it is the same. Calling it simply Morality papers over this.
You’re wrong. Despite how much I’d like to have a universal, ultimate, true morality, you can’t create it out of whole cloth by defining it as “what-humans-value”. That’s pretending there’s no reason to look up, because, “Look! It’s right there in front of you. So be sure not to look up.”