RH: “I have read and considered all of Eliezer’s posts, and still disagree with him on this his grand conclusion. Eliezer, do you think the universe was terribly unlikely and therefore terribly lucky to have coughed up human-like values, rather than some other values?”
Yes, it almost certainly was, because of the way we evolved. There are two distinct events here:
1. A species evolves to intelligence with the particular values we have.
2. Given that a species evolves to intelligence with some particular values, it decides that it likes those values.
Event 1 is extremely unlikely. Event 2 is essentially a certainty.
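Put roughly in probability notation (just a sketch; V here stands for any particular value system, a notation assumed for illustration):

    P(a species evolves to intelligence with values V) ≪ 1        (event 1)
    P(that species endorses V | it evolved with values V) ≈ 1     (event 2)

So essentially all of the improbability lies in which values evolution coughed up, not in the fact that the resulting species approves of them.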
One might call this “the ethical anthropic argument.”