The salient analogy for me is whether animals (meaning bigger mammals, not centrally birds or rats) are morally more like babies or more like characters in a novel. In all three cases, there is no sapient creature yet, but there is at least a hypothetical process for turning them into sapient creatures. For babies, it’s growing up, and it already works. For characters in a novel and for animals, it’s respectively instantiating them as AGI-level characters in LLMs and uplifting them (in an unclear post-singularity way).
The main difference appears to be the status quo: babies are already on track to grow up, while instantiating characters from a novel or uplifting animals looks more like a free choice, not something that happens by default (unless it’s morally correct to do so; probably not for all characters from all novels, but possibly for at least some animals). So maybe if modern factory-farmed animals were not going to be uplifted (which cryonics would in principle enable, though AI timelines are also short), their treatment is morally about as fine as writing a novel with tortured characters? Unclear. Like, I’m tentatively going to treat my next cat as potentially a person, since it’s somewhat likely to encounter the singularity.
Woah, woah, slow down. You’re talking about the edge cases but have skipped the simple stuff. It sounds like you think it’s obvious, or that we’re likely to be on the same page, or that it should be inferable from what you’ve said? But it’s not, so please say it.
Why is growing up so important?
Reading between the lines, are you saying that the only reason it’s bad for a human baby to be in pain is that it will eventually grow into a sapient adult? If so: (i) most people, including myself, both disagree and find that view morally reprehensible; (ii) the word “sapient” doesn’t have a clear or agreed-upon meaning, so plenty of people would say that babies are sentient; if you mean to capture something by the word “sapient”, you’ll have to be more specific. If that’s not what you’re saying, then I don’t know why you’re talking about uploading animals instead of talking about how they are right now.
As a more general question, have you ever had a pet?
the word “sapient” doesn’t have a clear or agreed-upon meaning, so plenty of people would say that babies are sentient
Human babies and cats are sentient but not sapient. Human children and adults, if not severely mentally disabled, are both sentient and sapient. I think this is the standard usage. A common misuse of “sentient” is to use it in the sense of sapient: saying “lizard people are sentient” while meaning “lizard people are sapient” (they are sentient as well, but saying that they are sapient is an additional claim with a different meaning, for which it’s better to have a different word).
Sapients are AGI-level sentients, with some buffer for less functional variants (like children). Sapients are centrally people, framed from a more functional standpoint. Some hypothetical AGIs might be functionally sapient without being sentient, able to optimize the world without being people themselves. I think AGI-level LLM characters are not like that.
uploading animals
Uplifting, not uploading. Uploading preserves behavior; uplifting changes behavior by improving intelligence or knowledge while preserving identity/memory/personality. Uplifting doesn’t imply leaving the biological substrate, though doing both seems natural in this context.