Question for Eliezer. If the human race goes extinct without leaving any legacy, then, according to you, would any nonhuman intelligent agent that might come into existence be unable to learn about morality?
If your answer is that the nonhuman agent might be able to learn about morality if it is sentient, then please define “sentient”. What is it about a paperclip maximizer that makes it non-sentient? What is it about a human that makes it sentient?