If we assume that all ants are copies of each other (they are not, but they are more similar to each other than humans are), then all 20 quadrillion ants will have the same moral value as just one ant.
This means that preservation of the species is more important than preservation of individual insects, which is closer to our natural moral intuitions.
An interesting suggestion. But bear in mind that ants are an entire family of insects, one containing over 12,000 species, while humans are one (rather recent and thus genetically not-yet-very-diverse) species. So morphologically or genetically, two randomly selected ants will be a lot less similar to each other than two random humans are. Mentally, well, they have roughly a millionth as many synapses as us, so there’s going to be some information-theoretic sense in which a human’s neural net pattern contains vastly more individuality than an ant’s.
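That "roughly a millionth" figure can be sanity-checked with a back-of-envelope calculation. All the counts below are rough order-of-magnitude assumptions (commonly cited ballpark figures, not anything from this thread):

```python
# Back-of-envelope check of the "roughly a millionth as many synapses" claim.
# All figures are rough order-of-magnitude assumptions, not measurements.
ant_neurons = 2.5e5        # assumed: typical ant brain
human_neurons = 8.6e10     # assumed: commonly cited human estimate
synapses_per_neuron = 1e3  # assumed: same rough figure for both

ant_synapses = ant_neurons * synapses_per_neuron      # ~2.5e8
human_synapses = human_neurons * synapses_per_neuron  # ~8.6e13

ratio = ant_synapses / human_synapses
print(f"ant/human synapse ratio = {ratio:.1e}")  # a few millionths
```

Under these assumptions the ratio comes out to a few millionths, so "roughly a millionth" is the right order of magnitude.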
So while I agree that our natural moral intuitions care very little about the distinctions between individual ants, I suspect that the fact that most of those distinctions are too small for us to see without a hand lens is doing a lot of work there. A myrmecologist more familiar with those small-scale differences might disagree with your intuition.
[Also, rather specifically in the case of ants, the survival of the ant-nest, which is the breeding unit for ants, depends mostly on the queen and enough workers to care for her: in a large colony, losing a single worker is about as serious as chipping a fingernail is for a human.]
However, I do agree with your moral intuition that a species deserves separate moral weight, beyond that of the individuals currently comprising it. Whether that represents all the potential future individuals that species extinction would make impossible, or we want to model it separately as some additional species-level moral weight as I suggest at one point above, I think that’s a good element to include in an ethical system design.
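One way to see what a separate species-level term does to the accounting is a toy model (both weight constants below are illustrative assumptions, not figures from the post): the loss of the last individual is discontinuously worse than the loss of any other individual, because it also forfeits the species-level weight.

```python
# Toy model: total moral weight of a population is a per-individual sum
# plus a flat species-level term that vanishes only at extinction.
# Both weight constants are illustrative assumptions.

def population_weight(n_individuals, w_individual=1.0, w_species=1000.0):
    if n_individuals <= 0:
        return 0.0  # extinction: the species-level term is lost too
    return n_individuals * w_individual + w_species

# Losing one of many individuals costs only its individual weight...
print(population_weight(10) - population_weight(9))  # 1.0
# ...but losing the last individual also forfeits the species-level term.
print(population_weight(1) - population_weight(0))  # 1001.0
```

The same discontinuity appears if the species term is instead modeled as the value of all potential future individuals; either framing makes extinction categorically worse than individual deaths.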
I don’t think there’s a coherent system where copies with different experiences have a lot less moral worth.
For biological systems, I agree. (As I discuss in Parts 1 and 3, I think we have to use different approaches for digital systems, where generating a large number of identical or similar copies of a sapience is trivial.)