As I understand Eliezer’s position, when babyeater-humans say “right”, they actually mean babyeating. They’d need a word like “babysaving” to refer to what’s right.
Morality is what we call the output of a particular algorithm instantiated in human brains. If we instantiated a different algorithm, we’d have a word for its output instead.
I think Eliezer sees translating the babyeaters' word for babyeating as “right” as an error similar to translating their word for babyeaters as “human”.
Precisely. So it was luck that we instantiated this algorithm rather than a different one.