Eliezer, I don’t understand how you can say that the “lucky causal history” wasn’t luck, unless you also say “if humans had evolved to eat babies, babyeating would have been right.”
If it wouldn’t have been right even in that event, then it took a stupendous amount of luck for us to evolve in just such a way that we care about things that are right, instead of other things.
Either that, or there is some shadowy figure outside of humanity that determines what is right.
As I understand Eliezer’s position, when babyeater-humans say “right”, they actually mean babyeating. They’d need a word like “babysaving” to refer to what’s right.
Morality is what we call the output of a particular algorithm instantiated in human brains. If we instantiated a different algorithm, we’d have a word for its output instead.
I think Eliezer sees translating the babyeaters’ word for babyeating as “right” as an error, similar to translating their word for babyeaters as “human”.
Precisely. So it was luck that we came to instantiate this algorithm instead of a different one.