As for the first part, I would say that it’s fairly common for an individual and a society to not have perfectly identical values or ethical rules. Should I be saying ‘morals’ for the values of society instead?
I would hope that ethical vegetarians can at least give me the reasons for their boundaries. If they’re not eating meat because they don’t want animals to suffer, they should be able to define how they draw the line where the capacity to suffer begins.
You do bring up a good point: most psychologists would agree that babies go through a period before they become truly ‘self-aware’, and I have a great deal of difficulty conceiving of a human society that would advocate ‘fresh baby meat’ as ethical. Vat-grown human meat, I can see happening eventually. Would you say the weight there is more on the side of ‘This being will, given standard development, gain self-awareness’, or on the side of ‘Other self-aware beings are strongly attached to this being and would suffer emotionally if it died’? The second one seems to be more the way things currently function; farmers remind their kids not to name the farm animals, because they might end up on their plates later. But I think the first one can be more consistently applied, particularly if you have non-human (particularly non-cute) intelligences.
‘This being will, given standard development, gain self-awareness’ is a common reason that I missed.
I am somewhat confused by it, because this notion of “standard development” is not easily defined, much like “default” in a negotiation.
You could put strict statistical definitions around it if you wanted, but the general idea is, ‘infants grow up to be self-aware adults’.
This may not always be true for exotic species. Plenty of species in nature, for example, reproduce by throwing out millions of eggs, spores, or what have you, of which only a small fraction grow up to be adults. Ideally, any sort of rule you’d come up with should be universal, regardless of the form of intelligence.
At some point, some computer programs would have to be considered to be people and have a right to existence. But at what stage of development would that happen?