Anon: “The notion of ‘morally significant’ seems to coincide with sentience.”
Yes; the word “sentience” seems to be just a placeholder meaning “qualifications we’ll figure out later for being thought of as a person.”
Tim: Good point that people have a very strong bias to associate rights with intelligence, whereas empathy is a better criterion. The problem is that dogs have lots of empathy. Let’s say intelligence and empathy are both necessary but neither sufficient.
James: “Shouldn’t this outcome be something the CEV would avoid anyway? If it’s making an AI that wants what we would want, then it should not at the same time be making something we would not want to exist.”
CEV is not a magic “do what I mean” incantation. Even supposing the idea were fully worked out before the first AI is built, you probably wouldn’t have a mechanism to implement it.
anon: “It would be a mistake to create a new species that deserves our moral consideration, even if at present we would not give it the moral consideration it deserves.”
Something is missing from that sentence. Whatever you meant, let’s not rule out creating new species. We should, eventually.
Eliezer: Creating new sentient species is frightening. But is creating new non-sentient species any less frightening? Any new species you create may out-compete the old and become the dominant lifeform. It would be a big loss to create a non-sentient species that replaced sentient life.