Really? I was under the impression that there was a strong consensus, at least here on LW, that a sufficiently accurate simulation of consciousness is the moral equivalent of consciousness.
“Sufficiently accurate simulation of consciousness” is a subset of the set of things that are artificial minds. You might have a consensus for that class, but I don’t think there’s a consensus that all minds have the same moral value, or even that all minds above a certain level of intelligence do.
At least for me personally, the relevant property for moral status is whether the mind in question is conscious.
That’s my understanding as well… though I would say, rather, that being artificial is not a particularly important attribute when evaluating the moral status of a consciousness. IOW, an artificial consciousness is a consciousness, and the same moral considerations apply to it as to other consciousnesses with the same properties. That said, I also think this whole “a tulpa {is,isn’t} an artificial intelligence” discussion is an excellent example of losing track of referents in favor of manipulating symbols, so I don’t think it matters much in context.