@Tom McCabe:
“Beware shutting yourself into a self-justifying memetic loop. If you had been born in 1800, and just recently moved here via time travel, would you have refused to listen to all of our modern anti-slavery arguments, on the grounds that no moral argument by negro-lovers could be taken seriously?”
Generally I think this is a valid point: one shouldn’t lightly accuse a fellow human of being irredeemably morally broken simply because they disagree with you on some particular conclusion. But in this particular case, I’m willing to take that step. If I know anything at all about morality, then I know that murder is wrong.
@Alan Crossman, Roko:
No, I don’t think the moral theory Eliezer is arguing for is relativism. I am willing to say a paperclip maximizer is an abomination. It is a thing that should not be. Wouldn’t a relativist say that passing moral judgment on something so alien isn’t meaningful, since (on the relativist’s view) we lack a common moral context by which to judge it?
Let me attempt a summary of Eliezer’s theory:
Morality is real, but it is something that arose here, on this planet, among this species. It is nearly universal among humans, and that is good enough. We shouldn’t expect it to be universal among all intelligent beings. Also, it is not possible to write down a concise definition of “should”, any more than it is possible to write a concise general AI program.