Hm, I was also thinking of the moral value of children in this context. At least in my perception, an important part of that moral value is the potential to become a conscious, self-aware being. In what sense does this potential translate to artificially created beings?
Maybe if there is a subspace of minds with moral value in neural-network parameter space, points close to that subspace would also have moral value?
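(As a purely geometric toy of that idea, with everything hypothetical: if the "minds with moral value" region were a linear subspace spanned by some basis vectors, "close to the subspace" could be read as a small orthogonal-projection distance. The names `distance_to_subspace`, `basis`, and `theta` below are illustrative inventions, and the structure of real parameter space would almost surely not be linear.)

```python
import numpy as np

def distance_to_subspace(theta: np.ndarray, basis: np.ndarray) -> float:
    """Euclidean distance from point `theta` to span of the columns of `basis`."""
    # Project theta onto the column space of `basis` via a least-squares
    # solve; this works even if the spanning vectors are not orthonormal.
    coeffs, *_ = np.linalg.lstsq(basis, theta, rcond=None)
    projection = basis @ coeffs
    return float(np.linalg.norm(theta - projection))

rng = np.random.default_rng(0)
basis = rng.normal(size=(1000, 5))  # a 5-dim "moral value" subspace of a 1000-dim parameter space
theta = rng.normal(size=1000)       # some network's parameter vector
print(distance_to_subspace(theta, basis))
```

(Using a least-squares solve rather than an explicit orthonormal basis keeps the sketch robust to whatever spanning vectors one picks; none of this says anything about which regions of parameter space, if any, actually contain minds.)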
Conscious and self-aware are not the same thing. All animals (except perhaps for those without nervous systems, like some oysters?) are conscious, but not many have shown signs of self-awareness (such as with the mirror test). I think self-awareness is completely morally irrelevant and only the capacity for qualia / subjective experience matters—“is there something which it is like to be that thing?”
I suspect that all AIs that currently exist are conscious—that is, I suspect there is something which it is like to be, for instance, GPT-3, at least while it is running—and already have moral relevance, but none of them are self-aware. I do not know how to determine if they are suffering or not, though.
Oysters have nervous systems, but not centralized nervous systems. Sponges lack neurons altogether, though they still have some degree of intercellular communication.
Ah! Thanks, I knew there was something about oysters but I couldn’t remember what it was. I didn’t even think about sponges.