So, I’m reading over Creating Friendly AI, when I come across this:
“Um, full disclosure: I hate moral relativism with a fiery vengeance . . .”
I think, Whaat? Whaaat? Whaaaaat? Is that Eliezer Yudkowsky saying that? Is Eliezer Yudkowsky claiming that moral propositions are in fact properties of the universe itself and not merely of human minds?
The three explanations I'd like to believe, each in its own way, are that Eliezer Yudkowsky doesn't mean by "moral relativism" what I take it to mean, that Eliezer Yudkowsky no longer believes this, or that what I just read wasn't actually written by Eliezer Yudkowsky. The idea of disagreeing with Eliezer Yudkowsky on a major point like this is something I cannot easily fathom.
Um, he’s still emphatically not a moral relativist as usually understood; don’t you remember the metaethics sequence? The conclusion that
(A) morality is essentially anthropomorphic rather than universal,
doesn’t imply that
(B) we should become indifferent to the content of that morality.
ISTM that strict philosophical definitions of moral relativism tend to center on A, but that most of the conversation around moral relativism assumes it means B.
I suppose that knowing what moral relativism actually is would help. A is a conclusion I can easily live with. B is absurd. Metaphysical moral absolutism (“all intelligent beings will tend toward moral behavior”) is something I would believe only with… many decibels of evidence.
Eliezer has declared CFAI obsolete. Try CEV instead; it's much clearer, though not entirely clear yet.