This is a really, really hasty non-sequitur. Eliezer’s morality is probably extremely similar to mine; thus, the world would be a much, much better place, even according to my specification, with an AI running Eliezer’s morality than with no AI running at all (or, worse, a paperclip maximizer). Eliezer’s morality is absolutely not immoral; it’s my morality ±1% error, as opposed to some other nonhuman goal structure, which would be unimaginably bad on my scale.